
issue_comments


7,983 rows sorted by reactions descending




id html_url issue_url node_id user created_at updated_at author_association body reactions ▲ issue performed_via_github_app
712585687 https://github.com/simonw/datasette/issues/782#issuecomment-712585687 https://api.github.com/repos/simonw/datasette/issues/782 MDEyOklzc3VlQ29tbWVudDcxMjU4NTY4Nw== simonw 9599 2020-10-20T04:47:02Z 2020-10-20T04:47:12Z OWNER Great point about CORS, I hadn't considered that. I think I'm going to keep the `Link:` header (added in #1014) because I quite enjoy using it with GitHub and WordPress, but I'm not going to have it be the default way of doing pagination. For the default shape I'm now leaning towards this: ```json { "total": 36, "rows": [{"id": 1, "name": "Cleo"}], "next_url": "https://latest-with-plugins.datasette.io/fixtures/facetable.json?_next=5" } ``` So three keys: `total`, `rows` and `next_url`. Then extra keys can be added using `?_extra=` with various named bundles. {"total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Redesign default .json format 627794879  
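The three-key shape proposed above can be consumed by following `next_url` until it is absent. A minimal sketch, assuming a caller-supplied `fetch_json` callable (the helper name is illustrative, not part of Datasette):

```python
def iterate_rows(fetch_json, url):
    """Yield every row across all pages of the proposed
    total/rows/next_url response shape.

    fetch_json is any callable that takes a URL and returns the
    decoded JSON dict -- injected here so the sketch stays
    network-free and testable."""
    while url:
        page = fetch_json(url)
        yield from page["rows"]
        # next_url is absent (or null) on the final page
        url = page.get("next_url")
```

One nice property of this design: clients never need to construct pagination URLs themselves, they just follow `next_url` opaquely.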
791530093 https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-791530093 https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 MDEyOklzc3VlQ29tbWVudDc5MTUzMDA5Mw== UtahDave 306240 2021-03-05T16:28:07Z 2021-03-05T16:28:07Z NONE > I just tried to run this on a small VPS instance with 2GB of memory and it crashed out of memory while processing a 12GB mbox from Takeout. > > Is it possible to stream the emails to sqlite instead of loading it all into memory and upserting at once? @maxhawkins a limitation of the python mbox module is it loads the entire mbox into memory. I did find another approach to this problem that didn't use the builtin python mbox module and created a generator so that it didn't have to load the whole mbox into memory. I was hoping to use standard library modules, but this might be a good reason to investigate that approach a bit more. My worry is making sure a custom processor handles all the ins and outs of the mbox format correctly. Hm. As I'm writing this, I thought of something. I think I can parse each message one at a time, and then use an mbox function to load each message using the python mbox module. That way the mbox module can still deal with the specifics of the mbox format, but I can use a generator. I'll give that a try. Thanks for the feedback @maxhawkins and @simonw. I'll give that a try. @simonw can we hold off on merging this until I can test this new approach? {"total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} WIP: Add Gmail takeout mbox import 813880401  
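The generator approach described above might look something like this — a stdlib-only sketch (function name hypothetical) that yields one parsed message at a time instead of loading the whole mbox. It relies on standard mbox-rd quoting, where body lines beginning with `From ` are escaped as `>From `:

```python
import email
from email import policy

def iter_mbox(fp):
    """Yield parsed messages from a binary mbox file object one at a
    time, so a 12GB mbox never has to fit in memory."""
    def parse(buf):
        # buf[0] is the "From sender date" envelope line; the email
        # parser only wants the RFC 2822 message that follows it.
        return email.message_from_bytes(b"".join(buf[1:]), policy=policy.default)

    buf = []
    for line in fp:
        # A "From " line marks the start of the next message
        if line.startswith(b"From ") and buf:
            yield parse(buf)
            buf = []
        buf.append(line)
    if buf:
        yield parse(buf)
```

Each message still goes through the standard library's email parser, so the format-specific details stay out of the custom code — only the message-splitting is handled by hand.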
1011855133 https://github.com/simonw/sqlite-utils/issues/348#issuecomment-1011855133 https://api.github.com/repos/simonw/sqlite-utils/issues/348 IC_kwDOCGYnMM48T68d simonw 9599 2022-01-13T07:06:59Z 2022-01-13T07:06:59Z OWNER Wrote a lot more about this feature here: https://simonwillison.net/2022/Jan/12/how-i-build-a-feature/ {"total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Command for creating an empty database 1067771698  
1032987901 https://github.com/simonw/sqlite-utils/issues/403#issuecomment-1032987901 https://api.github.com/repos/simonw/sqlite-utils/issues/403 IC_kwDOCGYnMM49kiT9 simonw 9599 2022-02-08T19:36:06Z 2022-02-08T19:36:06Z OWNER New documentation: https://sqlite-utils.datasette.io/en/latest/cli.html#adding-a-primary-key-to-a-rowid-table {"total_count": 3, "+1": 3, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Document how to add a primary key to a rowid table using `sqlite-utils transform --pk` 1126692066  
344440377 https://github.com/simonw/datasette/issues/93#issuecomment-344440377 https://api.github.com/repos/simonw/datasette/issues/93 MDEyOklzc3VlQ29tbWVudDM0NDQ0MDM3Nw== simonw 9599 2017-11-14T23:56:35Z 2017-11-14T23:56:35Z OWNER It worked! $ pyinstaller -F \ --add-data /usr/local/lib/python3.5/site-packages/datasette/templates:datasette/templates \ --add-data /usr/local/lib/python3.5/site-packages/datasette/static:datasette/static \ /usr/local/bin/datasette $ file dist/datasette dist/datasette: Mach-O 64-bit executable x86_64 $ dist/datasette --help Usage: datasette [OPTIONS] COMMAND [ARGS]... Datasette! Options: --help Show this message and exit. Commands: serve* Serve up specified SQLite database files with... build package Package specified SQLite files into a new... publish Publish specified SQLite database files to... {"total_count": 3, "+1": 0, "-1": 0, "laugh": 0, "hooray": 3, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Package as standalone binary 273944952  
473312514 https://github.com/simonw/datasette/issues/417#issuecomment-473312514 https://api.github.com/repos/simonw/datasette/issues/417 MDEyOklzc3VlQ29tbWVudDQ3MzMxMjUxNA== simonw 9599 2019-03-15T14:42:07Z 2019-03-17T22:12:30Z OWNER A neat ability of Datasette Library would be if it can work against other files that have been dropped into the folder. In particular: if a user drops a CSV file into the folder, how about automatically converting that CSV file to SQLite using [sqlite-utils](https://github.com/simonw/sqlite-utils)? {"total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Datasette Library 421546944  
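The CSV-to-SQLite conversion idea can be sketched with nothing but the standard library — sqlite-utils does this with far more polish (type detection, alter support, etc.); `load_csv` here is a hypothetical helper and stores every value as text:

```python
import csv
import sqlite3
from pathlib import Path

def load_csv(db, csv_path):
    """Create a table named after the CSV file and insert its rows.

    db is an open sqlite3.Connection. A stdlib-only stand-in for
    what sqlite-utils does; no type detection is attempted."""
    path = Path(csv_path)
    with path.open(newline="") as f:
        reader = csv.reader(f)
        headers = next(reader)
        cols = ", ".join(f"[{h}]" for h in headers)
        placeholders = ", ".join("?" for _ in headers)
        # Table name comes from the file name, e.g. pets.csv -> pets
        db.execute(f"CREATE TABLE IF NOT EXISTS [{path.stem}] ({cols})")
        db.executemany(
            f"INSERT INTO [{path.stem}] VALUES ({placeholders})", reader
        )
    db.commit()
```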
580028669 https://github.com/simonw/datasette/issues/662#issuecomment-580028669 https://api.github.com/repos/simonw/datasette/issues/662 MDEyOklzc3VlQ29tbWVudDU4MDAyODY2OQ== simonw 9599 2020-01-30T00:30:19Z 2020-01-30T00:30:19Z OWNER I just shipped 0.34: https://datasette.readthedocs.io/en/stable/changelog.html#v0-34 {"total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Escape_fts5_query-hook implementation does not work with queries to standard tables 556814876  
586729798 https://github.com/simonw/sqlite-utils/issues/86#issuecomment-586729798 https://api.github.com/repos/simonw/sqlite-utils/issues/86 MDEyOklzc3VlQ29tbWVudDU4NjcyOTc5OA== simonw 9599 2020-02-16T17:11:02Z 2020-02-16T17:11:02Z OWNER I filed a bug in the Python issue tracker here: https://bugs.python.org/issue39652 {"total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Problem with square bracket in CSV column name 564579430  
615932007 https://github.com/dogsheep/dogsheep-photos/issues/4#issuecomment-615932007 https://api.github.com/repos/dogsheep/dogsheep-photos/issues/4 MDEyOklzc3VlQ29tbWVudDYxNTkzMjAwNw== simonw 9599 2020-04-18T19:27:55Z 2020-04-18T19:27:55Z MEMBER Research thread: https://twitter.com/simonw/status/1249049694984011776 > I want to build some software that lets people store their own data in their own S3 bucket, but if possible I'd like not to have to teach people the incantations needed to get their bucket setup and minimum-permission credentials figured out https://testdriven.io/blog/storing-django-static-and-media-files-on-amazon-s3/ looks useful {"total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Upload all my photos to a secure S3 bucket 602533539  
751504136 https://github.com/simonw/datasette/issues/417#issuecomment-751504136 https://api.github.com/repos/simonw/datasette/issues/417 MDEyOklzc3VlQ29tbWVudDc1MTUwNDEzNg== drewda 212369 2020-12-27T19:02:06Z 2020-12-27T19:02:06Z NONE Very much looking forward to seeing this functionality come together. This is probably out-of-scope for an initial release, but in the future it could be useful to also think of how to run this in a containerized context. For example, an immutable datasette container that points to an S3 bucket of SQLite DBs or CSVs. Or an immutable datasette container pointing to a NFS volume elsewhere on a Kubernetes cluster. {"total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Datasette Library 421546944  
755133937 https://github.com/simonw/datasette/issues/1101#issuecomment-755133937 https://api.github.com/repos/simonw/datasette/issues/1101 MDEyOklzc3VlQ29tbWVudDc1NTEzMzkzNw== simonw 9599 2021-01-06T07:25:48Z 2021-01-06T07:26:43Z OWNER Idea: instead of returning a dictionary, `register_output_renderer` could return an object. The object could have the following properties: - `.extension` - the extension to use - `.can_render(...)` - says if it can render this - `.can_stream(...)` - says if streaming is supported - `async .stream_rows(rows_iterator, send)` - method that loops through all rows and uses `send` to send them to the response in the correct format I can then deprecate the existing `dict` return type for 1.0. {"total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} register_output_renderer() should support streaming data 749283032  
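The proposed object interface could look roughly like this — an illustrative sketch, not the actual `register_output_renderer` API (the class name and method signatures are assumptions based on the property list above):

```python
class CSVRenderer:
    """Sketch of the proposed renderer object for
    register_output_renderer: an extension, render/stream capability
    checks, and an async method that streams rows via send()."""

    extension = "csv"

    def can_render(self, columns, rows):
        # A real renderer might inspect column types here
        return True

    def can_stream(self):
        return True

    async def stream_rows(self, rows_iterator, send):
        # Loop through all rows, pushing each formatted chunk to the
        # response as it is produced rather than buffering everything
        for row in rows_iterator:
            await send(",".join(str(v) for v in row) + "\r\n")
```

The advantage over the current `dict` return shape is that streaming support becomes an opt-in capability the framework can interrogate, rather than another key to remember.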
782765665 https://github.com/simonw/datasette/issues/782#issuecomment-782765665 https://api.github.com/repos/simonw/datasette/issues/782 MDEyOklzc3VlQ29tbWVudDc4Mjc2NTY2NQ== simonw 9599 2021-02-20T23:34:41Z 2021-02-20T23:34:41Z OWNER OK, I'm back to the "top level object as the default" side of things now - it's pretty much unanimous at this point, and it's certainly true that it's not a decision you'll ever regret. {"total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Redesign default .json format 627794879  
802099264 https://github.com/simonw/datasette/issues/1262#issuecomment-802099264 https://api.github.com/repos/simonw/datasette/issues/1262 MDEyOklzc3VlQ29tbWVudDgwMjA5OTI2NA== simonw 9599 2021-03-18T16:43:09Z 2021-03-18T16:43:09Z OWNER I often find myself wanting this too, when I'm exploring a new dataset. I agree with Bob that this is a good candidate for a plugin. The plugin system isn't quite set up for this yet though - there isn't an obvious mechanism for adding extra sort orders or other interface elements that manipulate the query used by the table view in some way. I'm going to promote this issue to status of a plugin hook feature request - I have a hunch that a plugin hook that enables `order by random()` could enable a lot of other useful plugin features too. {"total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Plugin hook that could support 'order by random()' for table view 834602299  
849022714 https://github.com/simonw/datasette/issues/670#issuecomment-849022714 https://api.github.com/repos/simonw/datasette/issues/670 MDEyOklzc3VlQ29tbWVudDg0OTAyMjcxNA== simonw 9599 2021-05-26T18:33:47Z 2021-05-26T18:33:58Z OWNER Worth mentioning here: I've been doing a ton of research around running Datasette-like functionality against PostgreSQL in my https://github.com/simonw/django-sql-dashboard project - which will definitely inform the Datasette implementation. {"total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Prototype for Datasette on PostgreSQL 564833696  
893133496 https://github.com/simonw/datasette/issues/1419#issuecomment-893133496 https://api.github.com/repos/simonw/datasette/issues/1419 IC_kwDOBm6k_c41PCK4 simonw 9599 2021-08-05T03:22:44Z 2021-08-05T03:22:44Z OWNER I ran into this exact same problem today! I only just learned how to use filter on aggregates: https://til.simonwillison.net/sqlite/sqlite-aggregate-filter-clauses A workaround I used is to add this to the deploy command: datasette publish cloudrun ... --install=pysqlite3-binary This will install the https://pypi.org/project/pysqlite3-binary package, which bundles a more recent SQLite version. {"total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} `publish cloudrun` should deploy a more recent SQLite version 959710008  
1112889800 https://github.com/simonw/datasette/issues/1727#issuecomment-1112889800 https://api.github.com/repos/simonw/datasette/issues/1727 IC_kwDOBm6k_c5CVVnI simonw 9599 2022-04-29T05:29:38Z 2022-04-29T05:29:38Z OWNER OK, I just got the most incredible result with that! I started up a container running `bash` like this, from my `datasette` checkout. I'm mapping port 8005 on my laptop to port 8001 inside the container because laptop port 8001 was already doing something else: ``` docker run -it --rm --name my-running-script -p 8005:8001 -v "$PWD":/usr/src/myapp \ -w /usr/src/myapp nogil/python bash ``` Then in `bash` I ran the following commands to install Datasette and its dependencies: ``` pip install -e '.[test]' pip install datasette-pretty-traces # For debug tracing ``` Then I started Datasette against my `github.db` database (from github-to-sqlite.dogsheep.net/github.db) like this: ``` datasette github.db -h 0.0.0.0 --setting trace_debug 1 ``` I hit the following two URLs to compare the parallel vs. not parallel implementations: - `http://127.0.0.1:8005/github/issues?_facet=milestone&_facet=repo&_trace=1&_size=10` - `http://127.0.0.1:8005/github/issues?_facet=milestone&_facet=repo&_trace=1&_size=10&_noparallel=1` And... the parallel one beat the non-parallel one decisively, on multiple page refreshes! Not parallel: 77ms Parallel: 47ms <img width="1213" alt="CleanShot 2022-04-28 at 22 10 54@2x" src="https://user-images.githubusercontent.com/9599/165889437-60d4200d-698a-4175-af23-7c03bb456e66.png"> <img width="1213" alt="CleanShot 2022-04-28 at 22 10 21@2x" src="https://user-images.githubusercontent.com/9599/165889445-2dfb8676-d823-405e-aecb-ad28ec3043da.png"> So yeah, I'm very confident this is a problem with the GIL. And I am absolutely **stunned** that @colesbury's fork ran Datasette (which has some reasonably tricky threading and async stuff going on) out of the box! 
{"total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Research: demonstrate if parallel SQL queries are worthwhile 1217759117  
579830682 https://github.com/simonw/datasette/issues/661#issuecomment-579830682 https://api.github.com/repos/simonw/datasette/issues/661 MDEyOklzc3VlQ29tbWVudDU3OTgzMDY4Mg== simonw 9599 2020-01-29T16:07:41Z 2020-01-29T16:07:41Z OWNER Having `datasette package` take an optional port argument seems like a good feature to me. {"total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} --port option to expose a port other than 8001 in "datasette package" 555832585  
683178570 https://github.com/simonw/sqlite-utils/issues/139#issuecomment-683178570 https://api.github.com/repos/simonw/sqlite-utils/issues/139 MDEyOklzc3VlQ29tbWVudDY4MzE3ODU3MA== simonw 9599 2020-08-28T22:48:51Z 2020-08-28T22:48:51Z OWNER Thanks @simonwiles, this is now released in 2.16.1: https://sqlite-utils.readthedocs.io/en/stable/changelog.html {"total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} insert_all(..., alter=True) should work for new columns introduced after the first 100 records 686978131  
746827083 https://github.com/simonw/datasette/issues/1143#issuecomment-746827083 https://api.github.com/repos/simonw/datasette/issues/1143 MDEyOklzc3VlQ29tbWVudDc0NjgyNzA4Mw== simonw 9599 2020-12-16T18:56:07Z 2020-12-16T18:56:07Z OWNER I think the right way to do this is to support multiple optional `--cors-origin=` pattern values, like you suggested. {"total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} More flexible CORS support in core, to encourage good security practices 764059235  
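Matching a request origin against multiple `--cors-origin=` patterns could be sketched with shell-style wildcards — a hypothetical helper, not Datasette's actual implementation:

```python
from fnmatch import fnmatch

def cors_origin_header(request_origin, allowed_patterns):
    """Return the Access-Control-Allow-Origin value to send back,
    or None if the origin matches no pattern.

    Patterns use shell-style wildcards, e.g. "https://*.example.com".
    Echoing back the specific matched origin (rather than "*") keeps
    the response compatible with credentialed requests."""
    for pattern in allowed_patterns:
        if fnmatch(request_origin, pattern):
            return request_origin
    return None
```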
705874457 https://github.com/simonw/datasette/issues/969#issuecomment-705874457 https://api.github.com/repos/simonw/datasette/issues/969 MDEyOklzc3VlQ29tbWVudDcwNTg3NDQ1Nw== simonw 9599 2020-10-08T23:27:30Z 2020-10-08T23:27:30Z OWNER For the moment I'm going to ship this as the `--tar=` option. Can consider detecting `gtar` in the future. {"total_count": 2, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0} Add --tar option to "datasette publish heroku" 705057955  
514500253 https://github.com/dogsheep/healthkit-to-sqlite/issues/7#issuecomment-514500253 https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7 MDEyOklzc3VlQ29tbWVudDUxNDUwMDI1Mw== simonw 9599 2019-07-24T06:34:28Z 2019-07-24T06:34:28Z MEMBER Clearing the root element each time saved even more: <img width="1128" alt="Screen Shot 2019-07-24 at 8 30 38 AM" src="https://user-images.githubusercontent.com/9599/61770555-d3932100-aded-11e9-8ffe-bebd682f94ed.png"> {"total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 2, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Script uses a lot of RAM 472097220  
622162835 https://github.com/dogsheep/github-to-sqlite/issues/34#issuecomment-622162835 https://api.github.com/repos/dogsheep/github-to-sqlite/issues/34 MDEyOklzc3VlQ29tbWVudDYyMjE2MjgzNQ== simonw 9599 2020-04-30T22:59:26Z 2020-04-30T22:59:26Z MEMBER Documentation: https://github.com/dogsheep/github-to-sqlite/blob/c9f48404481882e8b3af06f35e4801a80ac79ed6/README.md#scraping-dependents-for-a-repository {"total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 2, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Command for retrieving dependents for a repo 610408908  
356175667 https://github.com/simonw/datasette/issues/176#issuecomment-356175667 https://api.github.com/repos/simonw/datasette/issues/176 MDEyOklzc3VlQ29tbWVudDM1NjE3NTY2Nw== wulfmann 4313116 2018-01-09T04:19:03Z 2018-01-09T04:19:03Z NONE @yozlet Yes I think that I was confused when I posted my original comment. I see your main point now and am in agreement. {"total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 2, "rocket": 0, "eyes": 0} Add GraphQL endpoint 285168503  
548508237 https://github.com/simonw/datasette/issues/176#issuecomment-548508237 https://api.github.com/repos/simonw/datasette/issues/176 MDEyOklzc3VlQ29tbWVudDU0ODUwODIzNw== eads 634572 2019-10-31T18:25:44Z 2019-10-31T18:25:44Z NONE 👋 I'd be interested in building this out in Q1 or Q2 of 2020 if nobody has tackled it by then. I would love to integrate Datasette into @thechicagoreporter's practice, but we're also fully committed to GraphQL moving forward. {"total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 2, "rocket": 0, "eyes": 0} Add GraphQL endpoint 285168503  
742260116 https://github.com/simonw/datasette/issues/1134#issuecomment-742260116 https://api.github.com/repos/simonw/datasette/issues/1134 MDEyOklzc3VlQ29tbWVudDc0MjI2MDExNg== clausjuhl 2181410 2020-12-10T05:57:17Z 2020-12-10T05:57:17Z NONE Hi Simon Thank you for the quick fix! And glad you like our use of Datasette (launches 1 January 2021). It's a site that currently (more to come) makes all minutes and their annexes from Aarhus City Council and the major committees (1997-2019) available to the public. So we're putting Datasette to good use :) {"total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 2, "rocket": 0, "eyes": 0} "_searchmode=raw" throws an index out of range error when combined with "_search_COLUMN" 760312579  
762488336 https://github.com/simonw/datasette/issues/1175#issuecomment-762488336 https://api.github.com/repos/simonw/datasette/issues/1175 MDEyOklzc3VlQ29tbWVudDc2MjQ4ODMzNg== hannseman 758858 2021-01-18T21:59:28Z 2021-01-18T22:00:31Z NONE I encountered your issue when trying to find a solution and came up with the following, maybe it can help. ```python import logging.config from typing import Tuple import structlog import uvicorn from example.config import settings shared_processors: Tuple[structlog.types.Processor, ...] = ( structlog.contextvars.merge_contextvars, structlog.stdlib.add_logger_name, structlog.stdlib.add_log_level, structlog.processors.TimeStamper(fmt="iso"), ) logging_config = { "version": 1, "disable_existing_loggers": False, "formatters": { "json": { "()": structlog.stdlib.ProcessorFormatter, "processor": structlog.processors.JSONRenderer(), "foreign_pre_chain": shared_processors, }, "console": { "()": structlog.stdlib.ProcessorFormatter, "processor": structlog.dev.ConsoleRenderer(), "foreign_pre_chain": shared_processors, }, **uvicorn.config.LOGGING_CONFIG["formatters"], }, "handlers": { "default": { "level": "DEBUG", "class": "logging.StreamHandler", "formatter": "json" if not settings.debug else "console", }, "uvicorn.access": { "level": "INFO", "class": "logging.StreamHandler", "formatter": "access", }, "uvicorn.default": { "level": "INFO", "class": "logging.StreamHandler", "formatter": "default", }, }, "loggers": { "": {"handlers": ["default"], "level": "INFO"}, "uvicorn.error": { "handlers": ["default" if not settings.debug else "uvicorn.default"], "level": "INFO", "propagate": False, }, "uvicorn.access": { "handlers": ["default" if not settings.debug else "uvicorn.access"], "level": "INFO", "propagate": False, }, }, } def setup_l… {"total_count": 10, "+1": 6, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 4, "rocket": 0, "eyes": 0} Use structlog for logging 779156520  
344161226 https://github.com/simonw/datasette/issues/46#issuecomment-344161226 https://api.github.com/repos/simonw/datasette/issues/46 MDEyOklzc3VlQ29tbWVudDM0NDE2MTIyNg== simonw 9599 2017-11-14T06:41:21Z 2017-11-14T06:41:21Z OWNER Spatial extensions would be really useful too. https://www.gaia-gis.it/spatialite-2.1/SpatiaLite-manual.html {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468  
353424169 https://github.com/simonw/datasette/issues/175#issuecomment-353424169 https://api.github.com/repos/simonw/datasette/issues/175 MDEyOklzc3VlQ29tbWVudDM1MzQyNDE2OQ== simonw 9599 2017-12-21T18:33:55Z 2017-12-21T18:33:55Z OWNER Done - thanks for curating these: https://github.com/topics/automatic-api {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Add project topic "automatic-api" 282971961  
392602334 https://github.com/simonw/datasette/issues/97#issuecomment-392602334 https://api.github.com/repos/simonw/datasette/issues/97 MDEyOklzc3VlQ29tbWVudDM5MjYwMjMzNA== simonw 9599 2018-05-28T20:57:21Z 2018-05-28T20:57:21Z OWNER The `/.json` endpoint is more of an implementation detail of the homepage at this point. A better, documented ( http://datasette.readthedocs.io/en/stable/introspection.html#inspect ) endpoint for finding all of the databases and tables is https://parlgov.datasettes.com/-/inspect.json {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Link to JSON for the list of tables  274022950  
404565566 https://github.com/simonw/datasette/issues/339#issuecomment-404565566 https://api.github.com/repos/simonw/datasette/issues/339 MDEyOklzc3VlQ29tbWVudDQwNDU2NTU2Ng== simonw 9599 2018-07-12T16:08:42Z 2018-07-12T16:08:42Z OWNER I'm going to turn this into an issue about better supporting the above option. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way 340396247  
405971920 https://github.com/simonw/datasette/issues/308#issuecomment-405971920 https://api.github.com/repos/simonw/datasette/issues/308 MDEyOklzc3VlQ29tbWVudDQwNTk3MTkyMA== simonw 9599 2018-07-18T15:27:12Z 2018-07-18T15:27:12Z OWNER It looks like there are a few extra options we should support: https://devcenter.heroku.com/articles/heroku-cli-commands ``` -t, --team=team team to use --region=region specify region for the app to run in --space=space the private space to create the app in ``` Since these differ from the options for Zeit Now I think this means splitting up `datasette publish now` and `datasette publish Heroku` into separate subcommands. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Support extra Heroku apps:create options - region, space, team 330826972  
431867885 https://github.com/simonw/datasette/issues/176#issuecomment-431867885 https://api.github.com/repos/simonw/datasette/issues/176 MDEyOklzc3VlQ29tbWVudDQzMTg2Nzg4NQ== eads 634572 2018-10-22T15:24:57Z 2018-10-22T15:24:57Z NONE I'd like this as well. It would let me access Datasette-driven projects from GatsbyJS the same way I can access Postgres DBs via Hasura. While I don't see SQLite replacing Postgres for the 50m row datasets I sometimes have to work with, there's a whole class of smaller datasets that are great with Datasette but currently would find another option. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Add GraphQL endpoint 285168503  
435974786 https://github.com/simonw/datasette/issues/370#issuecomment-435974786 https://api.github.com/repos/simonw/datasette/issues/370 MDEyOklzc3VlQ29tbWVudDQzNTk3NDc4Ng== simonw 9599 2018-11-05T18:06:56Z 2018-11-05T18:06:56Z OWNER I've been thinking a bit about ways of using Jupyter Notebook more effectively with Datasette (thinks like a `publish_dataframes(df1, df2, df3)` function which publishes some Pandas dataframes and returns you a URL to a new hosted Datasette instance) but you're right, Jupyter Lab is potentially a much more interesting fit. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Integration with JupyterLab 377155320  
439421164 https://github.com/simonw/datasette/issues/120#issuecomment-439421164 https://api.github.com/repos/simonw/datasette/issues/120 MDEyOklzc3VlQ29tbWVudDQzOTQyMTE2NA== ad-si 36796532 2018-11-16T15:05:18Z 2018-11-16T15:05:18Z NONE This would be an awesome feature ❤️ {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Plugin that adds an authentication layer of some sort 275087397  
450964512 https://github.com/simonw/datasette/issues/391#issuecomment-450964512 https://api.github.com/repos/simonw/datasette/issues/391 MDEyOklzc3VlQ29tbWVudDQ1MDk2NDUxMg== simonw 9599 2019-01-02T19:45:12Z 2019-01-02T19:45:12Z OWNER Thanks, I've fixed this. I had to re-alias it against now: ``` ~ $ now alias google-trends-pnwhfwvgqf.now.sh https://google-trends.datasettes.com/ > Assigning alias google-trends.datasettes.com to deployment google-trends-pnwhfwvgqf.now.sh > Certificate for google-trends.datasettes.com (cert_uXaADIuNooHS3tZ) created [18s] > Success! google-trends.datasettes.com now points to google-trends-pnwhfwvgqf.now.sh [20s] ``` {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Google Trends example doesn’t work 392610803  
453330680 https://github.com/simonw/datasette/issues/397#issuecomment-453330680 https://api.github.com/repos/simonw/datasette/issues/397 MDEyOklzc3VlQ29tbWVudDQ1MzMzMDY4MA== simonw 9599 2019-01-11T01:17:11Z 2019-01-11T01:25:33Z OWNER If you pull [the latest image](https://hub.docker.com/r/datasetteproject/datasette) you should get the right SQLite version now: docker pull datasetteproject/datasette docker run -p 8001:8001 \ datasetteproject/datasette \ datasette -p 8001 -h 0.0.0.0 http://0.0.0.0:8001/-/versions now gives me: ``` "version": "3.26.0" ``` {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Update official datasetteproject/datasette Docker container to SQLite 3.26.0 397129564  
467264937 https://github.com/simonw/datasette/issues/187#issuecomment-467264937 https://api.github.com/repos/simonw/datasette/issues/187 MDEyOklzc3VlQ29tbWVudDQ2NzI2NDkzNw== simonw 9599 2019-02-26T02:14:28Z 2019-02-26T02:14:28Z OWNER I'm working on a port of Datasette to Starlette which I think would fix this issue: https://github.com/encode/starlette {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Windows installation error 309033998  
473708941 https://github.com/simonw/datasette/issues/419#issuecomment-473708941 https://api.github.com/repos/simonw/datasette/issues/419 MDEyOklzc3VlQ29tbWVudDQ3MzcwODk0MQ== simonw 9599 2019-03-17T19:58:11Z 2019-03-17T19:58:11Z OWNER Some problems to solve: * Right now Datasette assumes it can always show the count of rows in a table, because this has been pre-calculated. If a database is mutable the pre-calculation trick no longer works, and for giant tables a `select count(*) from X` query can be expensive to run. Maybe we set a time limit on these? If time limit expires show "many rows"? * Maintaining a content hash of the table no longer makes sense if it is changing (though interestingly there's a `.sha3sum` built-in SQLite CLI command which takes a hash of the content and stays the same even through vacuum runs). Without that we need a different mechanism for calculating table colours. It also means that we can't do the special dbname-hash URL trick (see #418) at all if the database is opened as mutable. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Default to opening files in mutable mode, special option for immutable files 421551434  
488555399 https://github.com/simonw/datasette/issues/431#issuecomment-488555399 https://api.github.com/repos/simonw/datasette/issues/431 MDEyOklzc3VlQ29tbWVudDQ4ODU1NTM5OQ== simonw 9599 2019-05-02T05:13:54Z 2019-05-02T05:13:54Z OWNER Datasette master now treats databases as readonly but NOT immutable. This means you can make changes to those databases from another process and those changes will be instantly reflected in the Datasette interface. As such, reloading on database change is no longer necessary. Closing this ticket. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Datasette doesn't reload when database file changes 432870248  
495659567 https://github.com/simonw/datasette/issues/486#issuecomment-495659567 https://api.github.com/repos/simonw/datasette/issues/486 MDEyOklzc3VlQ29tbWVudDQ5NTY1OTU2Nw== simonw 9599 2019-05-24T14:41:45Z 2019-05-24T14:41:45Z OWNER I'm really keen to offer this as a plugin hook once I have Datasette working on ASGI - #272 I'll hopefully have that working in the next few weeks, but in the meantime there are a couple of tricks you can use: - you can add static HTML files (no templates though) using the static route configuration options - you can link to external hosted pages using the `about_url` metadata option - you can add information to an existing page with a custom template. I do that here for example: https://russian-ira-facebook-ads.datasettes.com/ {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Ability to add extra routes and related templates 448189298  
496786354 https://github.com/simonw/sqlite-utils/issues/21#issuecomment-496786354 https://api.github.com/repos/simonw/sqlite-utils/issues/21 MDEyOklzc3VlQ29tbWVudDQ5Njc4NjM1NA== simonw 9599 2019-05-29T05:09:01Z 2019-05-29T05:09:01Z OWNER Shipped this feature in sqlite-utils 1.1: https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-1 {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Option to ignore inserts if primary key exists already 448391492  
498839428 https://github.com/simonw/datasette/issues/498#issuecomment-498839428 https://api.github.com/repos/simonw/datasette/issues/498 MDEyOklzc3VlQ29tbWVudDQ5ODgzOTQyOA== simonw 9599 2019-06-04T20:53:21Z 2019-06-04T20:53:21Z OWNER It does not, but that's a really great idea for a feature. One challenge here is that FTS ranking calculations take overall table statistics into account, which means it's usually not possible to combine rankings from different tables in a sensible way. But that doesn't mean it's not possible to return grouped results. I think this makes a lot of sense as a plugin. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Full text search of all tables at once? 451513541  
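Grouped per-table results without cross-table ranking could be sketched like this — a hypothetical helper that assumes the caller already knows which tables are FTS tables, and deliberately makes no attempt to interleave rankings across tables:

```python
def search_all(conn, fts_tables, q):
    """Run the same match query against each FTS table on an open
    sqlite3 connection, returning {table_name: matching_rows}.

    Rankings are not comparable across tables (they depend on
    per-table statistics), so results stay grouped by table."""
    results = {}
    for table in fts_tables:
        rows = conn.execute(
            f"select * from [{table}] where [{table}] match ?", (q,)
        ).fetchall()
        if rows:
            results[table] = rows
    return results
```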
498840129 https://github.com/simonw/datasette/issues/499#issuecomment-498840129 https://api.github.com/repos/simonw/datasette/issues/499 MDEyOklzc3VlQ29tbWVudDQ5ODg0MDEyOQ== simonw 9599 2019-06-04T20:55:30Z 2019-06-04T21:01:22Z OWNER I really want this too! It's one of the goals of the Datasette Library #417 concept, which I'm hoping to turn into an actual feature in the coming months. It's also going to be a major focus of my ten month JSK fellowship at Stanford, which starts in September. https://twitter.com/simonw/status/1123624552867565569 {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Accessibility for non-techie newsies?  451585764  
505057520 https://github.com/simonw/datasette/issues/527#issuecomment-505057520 https://api.github.com/repos/simonw/datasette/issues/527 MDEyOklzc3VlQ29tbWVudDUwNTA1NzUyMA== simonw 9599 2019-06-24T15:21:18Z 2019-06-24T15:21:18Z OWNER I just released csvs-to-sqlite 0.9.1 with this bug fix: https://github.com/simonw/csvs-to-sqlite/releases/tag/0.9.1 {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Unable to use rank when fts-table generated with csvs-to-sqlite 459936585  
505087020 https://github.com/simonw/datasette/pull/437#issuecomment-505087020 https://api.github.com/repos/simonw/datasette/issues/437 MDEyOklzc3VlQ29tbWVudDUwNTA4NzAyMA== simonw 9599 2019-06-24T16:38:56Z 2019-06-24T16:38:56Z OWNER Closing this because it doesn't really fit the new model of inspect (though we should discuss in #465 how to further evolve this feature) and because as-of #272 we no longer use Sanic - though #520 will implement the equivalent of `prepare_sanic` against ASGI. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Add inspect and prepare_sanic hooks 438048318  
509154312 https://github.com/simonw/datasette/issues/514#issuecomment-509154312 https://api.github.com/repos/simonw/datasette/issues/514 MDEyOklzc3VlQ29tbWVudDUwOTE1NDMxMg== JesperTreetop 4363711 2019-07-08T09:36:25Z 2019-07-08T09:40:33Z NONE @chrismp: Ports 1024 and under are privileged and can usually only be bound by a root or supervisor user, so it makes sense if you're running as the user `chris` that port 8000 works but 80 doesn't. See [this generic question-and-answer](https://superuser.com/questions/710253/allow-non-root-process-to-bind-to-port-80-and-443) and [this systemd question-and-answer](https://stackoverflow.com/questions/40865775/linux-systemd-service-on-port-80) for more information about ways to skin this cat. Without knowing your specific circumstances, either extending those privileges to that service/executable/user, proxying them through something like nginx or indeed looking at what the nginx systemd job has to do to listen at port 80 all sound like good ways to start. At this point, this is more generic systemd/Linux support than a Datasette issue, which is why a complete rando like me is able to contribute anything. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Documentation with recommendations on running Datasette in production without using Docker 459397625  
510550279 https://github.com/simonw/datasette/pull/556#issuecomment-510550279 https://api.github.com/repos/simonw/datasette/issues/556 MDEyOklzc3VlQ29tbWVudDUxMDU1MDI3OQ== simonw 9599 2019-07-11T16:07:27Z 2019-07-11T16:07:27Z OWNER This is a really neat trick, thanks! {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Add support for running datasette as a module 465773546  
511625212 https://github.com/simonw/datasette/pull/557#issuecomment-511625212 https://api.github.com/repos/simonw/datasette/issues/557 MDEyOklzc3VlQ29tbWVudDUxMTYyNTIxMg== simonw 9599 2019-07-16T01:12:14Z 2019-07-16T01:12:14Z OWNER This looks useful for dealing with the `The process cannot access the file because it is being used by another process` error: https://stackoverflow.com/a/28032829 {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Get tests running on Windows using Travis CI 466996584  
541931047 https://github.com/simonw/datasette/pull/595#issuecomment-541931047 https://api.github.com/repos/simonw/datasette/issues/595 MDEyOklzc3VlQ29tbWVudDU0MTkzMTA0Nw== simonw 9599 2019-10-14T21:25:38Z 2019-10-14T21:25:38Z OWNER I like the conditional dependency for the moment - maybe until 3.5 becomes officially unsupported. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} bump uvicorn to 0.9.0 to be Python-3.8 friendly 506300941  
544335363 https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-544335363 https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20 MDEyOklzc3VlQ29tbWVudDU0NDMzNTM2Mw== simonw 9599 2019-10-21T03:32:04Z 2019-10-21T03:32:04Z MEMBER In case anyone is interested, here's an extract from the crontab I'm running these under at the moment: ``` 1,11,21,31,41,51 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite user-timeline /home/ubuntu/twitter.db -a /home/ubuntu/auth.json --since 2,7,12,17,22,27,32,37,42,47,52,57 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite home-timeline /home/ubuntu/timeline.db -a /home/ubuntu/auth.json --since 6,16,26,36,46,56 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite favorites /home/ubuntu/twitter.db -a /home/ubuntu/auth.json --stop_after=50 ``` {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} --since support for various commands for refresh-by-cron 506268945  
549435364 https://github.com/simonw/sqlite-utils/issues/62#issuecomment-549435364 https://api.github.com/repos/simonw/sqlite-utils/issues/62 MDEyOklzc3VlQ29tbWVudDU0OTQzNTM2NA== simonw 9599 2019-11-04T16:30:34Z 2019-11-04T16:30:34Z OWNER Released as 1.12. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} [enhancement] Method to delete a row in python 500783373  
549665423 https://github.com/simonw/datasette/issues/567#issuecomment-549665423 https://api.github.com/repos/simonw/datasette/issues/567 MDEyOklzc3VlQ29tbWVudDU0OTY2NTQyMw== simonw 9599 2019-11-05T05:11:14Z 2019-11-05T05:11:14Z OWNER @clausjuhl I wrote a bit about that here: https://simonwillison.net/2019/May/19/datasette-0-28/ Short version: just point Datasette at a SQLite file and update it from another process - it should work fine! I do it all the time now - I'll have a script running that writes to a database and I'll use Datasette to monitor progress. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Datasette Edit 476573875  
552275668 https://github.com/simonw/datasette/pull/595#issuecomment-552275668 https://api.github.com/repos/simonw/datasette/issues/595 MDEyOklzc3VlQ29tbWVudDU1MjI3NTY2OA== simonw 9599 2019-11-11T03:09:43Z 2019-11-11T03:09:43Z OWNER Glitch has been upgraded to Python 3.7. I think I'm happy to drop 3.5 support now - users who want Python 3.5 can get it by installing `datasette==0.30.2` {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} bump uvicorn to 0.9.0 to be Python-3.8 friendly 506300941  
561022224 https://github.com/simonw/datasette/issues/646#issuecomment-561022224 https://api.github.com/repos/simonw/datasette/issues/646 MDEyOklzc3VlQ29tbWVudDU2MTAyMjIyNA== simonw 9599 2019-12-03T06:30:42Z 2019-12-03T06:30:42Z OWNER I don't think this is possible at the moment but you're right, it totally should be. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Make database level information from metadata.json available in the index.html template 531502365  
570930239 https://github.com/simonw/sqlite-utils/issues/73#issuecomment-570930239 https://api.github.com/repos/simonw/sqlite-utils/issues/73 MDEyOklzc3VlQ29tbWVudDU3MDkzMDIzOQ== simonw 9599 2020-01-05T17:15:18Z 2020-01-05T17:15:18Z OWNER I think this is because you forgot to include a `pk=` argument. I'll change the code to throw a more useful error in this case. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} upsert_all() throws issue when upserting to empty table 545407916  
579787057 https://github.com/simonw/datasette/issues/662#issuecomment-579787057 https://api.github.com/repos/simonw/datasette/issues/662 MDEyOklzc3VlQ29tbWVudDU3OTc4NzA1Nw== simonw 9599 2020-01-29T14:43:46Z 2020-01-29T14:43:46Z OWNER Can you share the exact queries you're having trouble with? The SQL itself or even just the full URL to the page (it doesn't matter if it's to a Datasette instance that isn't available online - I just need to see the URL parameters). {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Escape_fts5_query-hookimplementation does not work with queries to standard tables 556814876  
579798917 https://github.com/simonw/datasette/issues/662#issuecomment-579798917 https://api.github.com/repos/simonw/datasette/issues/662 MDEyOklzc3VlQ29tbWVudDU3OTc5ODkxNw== clausjuhl 2181410 2020-01-29T15:08:57Z 2020-01-29T15:08:57Z NONE Hi Simon. Thank you for the quick reply. Here are a few examples of URLs where I search the 'cases_fts' virtual table for tokens in the title column. It returns the same results whether or not the other query params are present. Searching for sky http://localhost:8001/db-7596a4e/cases?_search_title=sky&year__gte=1997&year__lte=2017&_sort_desc=last_deliberation_date Returns search results Searching for sky* http://localhost:8001/db-7596a4e/cases?_search_title=sky*&year__gte=1997&year__lte=2017&_sort_desc=last_deliberation_date Returns search results Searching for sky-tog http://localhost:8001/db-7596a4e/cases?_search_title=sky-tog&year__gte=1997&year__lte=2017&_sort_desc=last_deliberation_date Throws: No such column: tog Searching for sky+ http://localhost:8001/db-7596a4e/cases?_search_title=sky%2B&year__gte=1997&year__lte=2017&_sort_desc=last_deliberation_date Throws: Invalid SQL: fts5: syntax error near "" Searching for "madpakke" (including double quotes) http://localhost:8001/db-7596a4e/cases?_search_title=%22madpakke%22&year__gte=1997&year__lte=2017&_sort_desc=last_deliberation_date Returns search results even though 'madpakke' only appears in the full-text index without quotes As I said, my other plugins work just fine, and I just copied your sql_functions.py from the datasette repo. Thanks! {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Escape_fts5_query-hookimplementation does not work with queries to standard tables 556814876
579832857 https://github.com/simonw/datasette/issues/662#issuecomment-579832857 https://api.github.com/repos/simonw/datasette/issues/662 MDEyOklzc3VlQ29tbWVudDU3OTgzMjg1Nw== simonw 9599 2020-01-29T16:12:08Z 2020-01-29T16:12:08Z OWNER I think I see what's happening here. Adding the new plugin isn't quite enough: the change I made to master also alters the table view code to call the new function: https://github.com/simonw/datasette/commit/3c861f363df02a59a67c59036278338e4760d2ed#diff-5e0ffd62fced7d46339b9b2cd167c2f9 If you add the escape function as a plugin in Datasette 0.33 you will have to use a custom SQL query to run it, like this: https://latest.datasette.io/fixtures?sql=select+pk%2C+text1%2C+text2%2C+%5Bname+with+.+and+spaces%5D+from+searchable+where+rowid+in+%28select+rowid+from+searchable_fts+where+searchable_fts+match+escape_fts%28%3Asearch%29%29+order+by+pk+limit+101&search=Dog Or you can hold out for Datasette 0.34 which will have this fix and will hopefully ship within the next 24 hours. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Escape_fts5_query-hookimplementation does not work with queries to standard tables 556814876  
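The fix discussed above centres on an `escape_fts()` SQL function. A minimal sketch of the core idea (not Datasette's exact implementation, which also handles already-quoted phrases) is to wrap each whitespace-separated token in double quotes so FTS5 treats characters like `-` and `+` as literal text rather than query operators:

```python
def escape_fts(query):
    # Quote each token so FTS5 operators such as "-" and "+" are
    # treated as literal text. Embedded double quotes are doubled,
    # following FTS5 string-quoting rules.
    return " ".join(
        '"{}"'.format(token.replace('"', '""')) for token in query.split()
    )


print(escape_fts("sky-tog"))  # "sky-tog"
print(escape_fts("sky+"))     # "sky+"
```

Passed through this function, the `sky-tog` query from the comment above no longer parses `tog` as a column filter.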
579864036 https://github.com/simonw/datasette/issues/662#issuecomment-579864036 https://api.github.com/repos/simonw/datasette/issues/662 MDEyOklzc3VlQ29tbWVudDU3OTg2NDAzNg== clausjuhl 2181410 2020-01-29T17:17:01Z 2020-01-29T17:17:01Z NONE This is excellent news. I'll wait until version 0.34. It would be tiresome to rewrite all standard-queries into custom queries. Thank you! {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Escape_fts5_query-hookimplementation does not work with queries to standard tables 556814876  
580028593 https://github.com/simonw/datasette/issues/661#issuecomment-580028593 https://api.github.com/repos/simonw/datasette/issues/661 MDEyOklzc3VlQ29tbWVudDU4MDAyODU5Mw== simonw 9599 2020-01-30T00:30:04Z 2020-01-30T00:30:04Z OWNER This has now shipped as part of Datasette 0.34: https://datasette.readthedocs.io/en/stable/changelog.html#v0-34 {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} --port option to expose a port other than 8001 in "datasette package" 555832585  
580029288 https://github.com/simonw/datasette/issues/658#issuecomment-580029288 https://api.github.com/repos/simonw/datasette/issues/658 MDEyOklzc3VlQ29tbWVudDU4MDAyOTI4OA== simonw 9599 2020-01-30T00:32:43Z 2020-01-30T00:32:43Z OWNER Can you share how your file layout is working? You should have something like this: `static/app.css` - a CSS file Then run Datasette like this: `datasette my.db --static-dir=static:static/` Then `http://127.0.0.1:8001/static/app.css` should serve your CSS. Could you share the command you're using to deploy to Heroku? {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} How do I use the app.css as style sheet? 550293770  
581758728 https://github.com/simonw/datasette/issues/577#issuecomment-581758728 https://api.github.com/repos/simonw/datasette/issues/577 MDEyOklzc3VlQ29tbWVudDU4MTc1ODcyOA== simonw 9599 2020-02-04T06:11:53Z 2020-02-04T06:11:53Z OWNER For the moment I'm going to move it to `async def render_template()` on `datasette` but otherwise keep the implementation the same. The new signature will be: async def render_template(self, template, context=None, request=None, view_name=None): `template` can be a list of strings or a single string. If a list of strings a template will be selected from them. I'll reconsider the large list of default context variables later on in a separate ticket. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Utility mechanism for plugins to render templates 497171390  
589908912 https://github.com/simonw/datasette/issues/675#issuecomment-589908912 https://api.github.com/repos/simonw/datasette/issues/675 MDEyOklzc3VlQ29tbWVudDU4OTkwODkxMg== simonw 9599 2020-02-22T02:38:21Z 2020-02-22T02:38:21Z OWNER Interesting feature suggestion. My initial instinct was that this would be better handled using the layered nature of Docker - so build a Docker image with `datasette package` and then have a separate custom script which takes that image, copies in the extra data and outputs a new image. But... `datasette package` is already meant to be more convenient than messing around with Docker by hand like this - so actually having a `--copy` option like you describe here feels like it's within scope of what `datasette package` is meant to do. So yeah - if you're happy to design this I think it would be worth us adding. Small design suggestion: allow `--copy` to be applied multiple times, so you can do something like this: datasette package \ --copy ~/project/templates /templates \ --copy ~/project/README.md /README.md \ data.db Also since Click arguments can take multiple options I don't think you need to have the `:` in there - although if it better matches Docker's own UI it might be more consistent to have it. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} --cp option for datasette publish and datasette package for shipping additional files and directories 567902704  
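The "applied multiple times" pattern suggested above can be sketched with the standard library's argparse (the real command uses Click, where the equivalent would be a `multiple=True` option; the option name and paths here are just illustrative):

```python
import argparse

parser = argparse.ArgumentParser(prog="datasette package")
# --copy takes a source and a destination and may be repeated;
# action="append" with nargs=2 collects each SRC/DEST pair.
parser.add_argument(
    "--copy", nargs=2, action="append", default=[], metavar=("SRC", "DEST")
)

args = parser.parse_args(
    [
        "--copy", "templates", "/templates",
        "--copy", "README.md", "/README.md",
    ]
)
print(args.copy)  # [['templates', '/templates'], ['README.md', '/README.md']]
```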
590517338 https://github.com/simonw/datasette/issues/682#issuecomment-590517338 https://api.github.com/repos/simonw/datasette/issues/682 MDEyOklzc3VlQ29tbWVudDU5MDUxNzMzOA== simonw 9599 2020-02-24T19:51:21Z 2020-02-24T19:51:21Z OWNER I filed a question / feature request with Janus about supporting timeouts for `.get()` against async queues here: https://github.com/aio-libs/janus/issues/240 I'm going to move ahead without needing that ability though. I figure SQLite writes are _fast_, and plugins can be trusted to implement just fast writes. So I'm going to support either fire-and-forget writes (they get added to the queue and a task ID is returned) or have the option to block awaiting the completion of the write (using Janus) but let callers decide which version they want. I may add optional timeouts some time in the future. I am going to make both `execute_write()` and `execute_write_fn()` awaitable functions though, for consistency with `.execute()` and to give me flexibility to change how they work in the future. I'll also add a `block=True` option to both of them which causes the function to wait for the write to be successfully executed - defaults to `False` (fire-and-forget mode). {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Mechanism for writing to database via a queue 569613563  
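The fire-and-forget vs. blocking design described above can be sketched with a plain `asyncio.Queue` standing in for Janus (an illustrative toy, not Datasette's actual implementation, which runs writes on a dedicated thread):

```python
import asyncio
import sqlite3
import uuid


class WriteQueue:
    # Toy sketch: serialize all writes through a single consumer task.
    def __init__(self, conn):
        self.conn = conn
        self.queue = asyncio.Queue()

    async def worker(self):
        # Single consumer: writes are applied one at a time, in order
        while True:
            fn, done = await self.queue.get()
            fn(self.conn)
            done.set()

    async def execute_write_fn(self, fn, block=False):
        task_id = str(uuid.uuid4())
        done = asyncio.Event()
        await self.queue.put((fn, done))
        if block:
            # Blocking mode: wait until the write has actually run
            await done.wait()
        return task_id


async def main():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE logs (msg TEXT)")
    wq = WriteQueue(conn)
    worker = asyncio.create_task(wq.worker())
    await wq.execute_write_fn(
        lambda c: c.execute("INSERT INTO logs VALUES ('hello')"), block=True
    )
    rows = conn.execute("SELECT msg FROM logs").fetchall()
    worker.cancel()
    return rows


print(asyncio.run(main()))  # [('hello',)]
```

With `block=False` the caller gets the task ID back immediately and the write happens whenever the worker reaches it.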
590679273 https://github.com/simonw/datasette/pull/683#issuecomment-590679273 https://api.github.com/repos/simonw/datasette/issues/683 MDEyOklzc3VlQ29tbWVudDU5MDY3OTI3Mw== simonw 9599 2020-02-25T04:37:21Z 2020-02-25T04:37:21Z OWNER I'm happy with this now. I'm going to merge to master. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} .execute_write() and .execute_write_fn() methods on Database 570101428  
592399256 https://github.com/simonw/datasette/issues/675#issuecomment-592399256 https://api.github.com/repos/simonw/datasette/issues/675 MDEyOklzc3VlQ29tbWVudDU5MjM5OTI1Ng== simonw 9599 2020-02-28T08:09:12Z 2020-02-28T08:09:12Z OWNER Sure, `--cp` looks good to me. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} --cp option for datasette publish and datasette package for shipping additional files and directories 567902704  
602907207 https://github.com/simonw/datasette/issues/394#issuecomment-602907207 https://api.github.com/repos/simonw/datasette/issues/394 MDEyOklzc3VlQ29tbWVudDYwMjkwNzIwNw== wragge 127565 2020-03-23T23:12:18Z 2020-03-23T23:12:18Z CONTRIBUTOR This would also be useful for running Datasette in Jupyter notebooks on [Binder](https://mybinder.org/). While you can use [Jupyter-server-proxy](https://github.com/jupyterhub/jupyter-server-proxy) to access Datasette on Binder, the links are broken. Why run Datasette on Binder? I'm developing a [range of Jupyter notebooks](https://glam-workbench.github.io/) that are aimed at getting humanities researchers to explore data from libraries, archives, and museums. Many of them are aimed at researchers with limited digital skills, so being able to run examples in Binder without them installing anything is fantastic. For example, there are a [series of notebooks](https://glam-workbench.github.io/trove-harvester/) that help researchers harvest digitised historical newspaper articles from Trove. The metadata from this harvest is saved as a CSV file that users can download. I've also provided some extra notebooks that use Pandas etc to demonstrate ways of analysing and visualising the harvested data. But it would be really nice if, after completing a harvest, the user could spin up Datasette for some initial exploration of their harvested data without ever leaving their browser. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} base_url configuration setting 396212021  
603631640 https://github.com/simonw/datasette/issues/394#issuecomment-603631640 https://api.github.com/repos/simonw/datasette/issues/394 MDEyOklzc3VlQ29tbWVudDYwMzYzMTY0MA== simonw 9599 2020-03-25T04:19:08Z 2020-03-25T04:19:08Z OWNER Shipped in 0.39: https://datasette.readthedocs.io/en/latest/changelog.html#v0-39 {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} base_url configuration setting 396212021  
610076073 https://github.com/simonw/datasette/issues/717#issuecomment-610076073 https://api.github.com/repos/simonw/datasette/issues/717 MDEyOklzc3VlQ29tbWVudDYxMDA3NjA3Mw== simonw 9599 2020-04-06T22:47:21Z 2020-04-06T22:47:21Z OWNER I'm confident it's possible to create a plugin that deploys to Now v2 now. I'll do the rest of the work in a separate repo: https://github.com/simonw/datasette-publish-now {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} See if I can get Datasette working on Zeit Now v2 594189527  
617208503 https://github.com/simonw/datasette/issues/176#issuecomment-617208503 https://api.github.com/repos/simonw/datasette/issues/176 MDEyOklzc3VlQ29tbWVudDYxNzIwODUwMw== nkirsch 12976 2020-04-21T14:16:24Z 2020-04-21T14:16:24Z NONE @eads I'm interested in helping, if there's still a need... {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Add GraphQL endpoint 285168503  
618155472 https://github.com/simonw/datasette/issues/731#issuecomment-618155472 https://api.github.com/repos/simonw/datasette/issues/731 MDEyOklzc3VlQ29tbWVudDYxODE1NTQ3Mg== simonw 9599 2020-04-23T03:28:42Z 2020-04-23T03:28:56Z OWNER As an alternative to `--static` this could work by letting you create the following: - `static/css/` - `static/js/` Which would be automatically mounted at `/js/...` and `/css/...` Or maybe just mount `static/` at `/static/` instead? {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Option to automatically configure based on directory layout 605110015  
622279374 https://github.com/dogsheep/github-to-sqlite/issues/33#issuecomment-622279374 https://api.github.com/repos/dogsheep/github-to-sqlite/issues/33 MDEyOklzc3VlQ29tbWVudDYyMjI3OTM3NA== garethr 2029 2020-05-01T07:12:47Z 2020-05-01T07:12:47Z NONE I also go it working with: ```yaml run: echo ${{ secrets.github_token }} | github-to-sqlite auth ``` {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Fall back to authentication via ENV 609950090  
623807568 https://github.com/dogsheep/dogsheep-photos/issues/16#issuecomment-623807568 https://api.github.com/repos/dogsheep/dogsheep-photos/issues/16 MDEyOklzc3VlQ29tbWVudDYyMzgwNzU2OA== simonw 9599 2020-05-05T02:56:06Z 2020-05-05T02:56:06Z MEMBER I'm pretty sure this is what I'm after. The `groups` table has what looks like identified labels in the rows with category = 2025: <img width="1122" alt="words__groups__2_528_rows_where_where_category___2025" src="https://user-images.githubusercontent.com/9599/81031361-e0df4080-8e40-11ea-9060-6d850aa52140.png"> Then there's a `ga` table that maps groups to assets: <img width="304" alt="words__ga__633_653_rows" src="https://user-images.githubusercontent.com/9599/81031387-f48aa700-8e40-11ea-9a3d-da23903be928.png"> And an `assets` table which looks like it has one row for every one of my photos: <img width="645" alt="words__assets__40_419_rows" src="https://user-images.githubusercontent.com/9599/81031402-04a28680-8e41-11ea-8047-e9199d068563.png"> One major challenge: these UUIDs are split into two integer numbers, `uuid_0` and `uuid_1` - but the main photos database uses regular UUIDs like this: ![image](https://user-images.githubusercontent.com/9599/81031481-39164280-8e41-11ea-983b-005ced641a18.png) I need to figure out how to match up these two different UUID representations. I asked on Twitter if anyone has any ideas: https://twitter.com/simonw/status/1257500689019703296 {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Import machine-learning detected labels (dog, llama etc) from Apple Photos 612287234  
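One plausible mapping (an assumption, not confirmed by the schema above) is that `uuid_0`/`uuid_1` are the two signed 64-bit halves of the 128-bit UUID, which round-trips cleanly in Python:

```python
import struct
import uuid


def uuid_to_pair(uuid_string):
    # Assumption: uuid_0/uuid_1 are the signed 64-bit halves of the
    # 128-bit UUID, big-endian. The actual byte order and signedness
    # would need verifying against known rows in both databases.
    return struct.unpack(">qq", uuid.UUID(uuid_string).bytes)


def pair_to_uuid(uuid_0, uuid_1):
    return str(uuid.UUID(bytes=struct.pack(">qq", uuid_0, uuid_1)))


u = "9EB6DBE3-80C9-47E3-B19C-A83ECF720EC2"
print(uuid_to_pair(u))
print(pair_to_uuid(*uuid_to_pair(u)))  # 9eb6dbe3-80c9-47e3-b19c-a83ecf720ec2
```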
624797119 https://github.com/simonw/datasette/issues/758#issuecomment-624797119 https://api.github.com/repos/simonw/datasette/issues/758 MDEyOklzc3VlQ29tbWVudDYyNDc5NzExOQ== simonw 9599 2020-05-06T17:53:46Z 2020-05-06T17:53:46Z OWNER It's interesting to hear from someone who's using this feature - I'm considering moving it out into a plugin #647. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Question: Access to immutable database-path 612382643  
624821090 https://github.com/simonw/datasette/issues/757#issuecomment-624821090 https://api.github.com/repos/simonw/datasette/issues/757 MDEyOklzc3VlQ29tbWVudDYyNDgyMTA5MA== simonw 9599 2020-05-06T18:41:29Z 2020-05-06T18:41:29Z OWNER OK, I just released 0.41 with that and a bunch of other stuff: https://datasette.readthedocs.io/en/latest/changelog.html#v0-41 {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Question: Any fixed date for the release with the uft8-encoding fix? 612378203  
626395209 https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626395209 https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21 MDEyOklzc3VlQ29tbWVudDYyNjM5NTIwOQ== simonw 9599 2020-05-10T21:52:42Z 2020-05-10T21:52:42Z MEMBER Aha! It looks like I accidentally installed the old bplist into the same environment: ``` $ pip freeze | grep bpylist bpylist==0.1.4 bpylist2==3.0.0 ``` {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} bpylist.archiver.CircularReference: archive has a cycle with uid(13) 615474990  
626395781 https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626395781 https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21 MDEyOklzc3VlQ29tbWVudDYyNjM5NTc4MQ== simonw 9599 2020-05-10T21:57:09Z 2020-05-10T21:57:09Z MEMBER Yes, I just recreated my virtual environment from scratch and the error went away. The problem occurred when I ran `pip install datasette-bplist` in the same virtual environment - https://github.com/simonw/datasette-bplist/blob/master/setup.py depends on `bpylist` which is incompatible with `bpylist2`. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} bpylist.archiver.CircularReference: archive has a cycle with uid(13) 615474990  
650600606 https://github.com/simonw/datasette/pull/868#issuecomment-650600606 https://api.github.com/repos/simonw/datasette/issues/868 MDEyOklzc3VlQ29tbWVudDY1MDYwMDYwNg== simonw 9599 2020-06-27T18:44:28Z 2020-06-27T18:44:28Z OWNER This is really exciting! Thanks so much for looking into this. I'm interested in moving CI for this repo over to GitHub Actions, so I'd be fine with you getting this to work as an Action rather than through Travis. If you can get it working in Travis though I'll happily land that and figure out how to convert that to GitHub Actions later on. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} initial windows ci setup 646448486  
652520496 https://github.com/simonw/datasette/issues/877#issuecomment-652520496 https://api.github.com/repos/simonw/datasette/issues/877 MDEyOklzc3VlQ29tbWVudDY1MjUyMDQ5Ng== simonw 9599 2020-07-01T16:26:52Z 2020-07-01T16:26:52Z OWNER Tokens get verified by plugins. So far there's only one: https://github.com/simonw/datasette-auth-tokens - which has you hard-coding tokens in a configuration file. I have an issue there to add support for database-backed tokens too: https://github.com/simonw/datasette-auth-tokens/issues/1 {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Consider dropping explicit CSRF protection entirely? 648421105
655652679 https://github.com/simonw/sqlite-utils/issues/121#issuecomment-655652679 https://api.github.com/repos/simonw/sqlite-utils/issues/121 MDEyOklzc3VlQ29tbWVudDY1NTY1MjY3OQ== tsibley 79913 2020-07-08T17:24:46Z 2020-07-08T17:24:46Z CONTRIBUTOR Better transaction handling would be really great. Some of my thoughts on implementing better transaction discipline are in https://github.com/simonw/sqlite-utils/pull/118#issuecomment-655239728. My preferences: - Each CLI command should operate in a single transaction so that either the whole thing succeeds or the whole thing is rolled back. This avoids partially completed operations when an error occurs part way through processing. Partially completed operations are typically much harder to recover from gracefully and may cause inconsistent data states. - The Python API should be transaction-agnostic and rely on the caller to coordinate transactions. Only the caller knows how individual insert, create, update, etc operations/methods should be bundled conceptually into transactions. When the caller is the CLI, for example, that bundling would be at the CLI command-level. Other callers might want to break up operations into multiple transactions. Transactions are usually most useful when controlled at the application-level (like logging configuration) instead of the library level. The library needs to provide an API that's conducive to transaction use, though. - The Python API should provide a context manager to provide consistent transaction handling with more useful defaults than Python's `sqlite3` module. The latter issues implicit `BEGIN` statements by default for most DML (`INSERT`, `UPDATE`, `DELETE`, … but not `SELECT`, I believe), but **not** DDL (`CREATE TABLE`, `DROP TABLE`, `CREATE VIEW`, …). Notably, the `sqlite3` module doesn't issue the implicit `BEGIN` until the first DML statement. It _does not_ issue it when entering the `with conn` block, like other DBAPI2-compatible modules do. The `with conn` block for `sqlite3` only arranges to commit or rollback an existing transaction when exiting. Including DDL and `SELECT`s in transactions is important for operation consistency, though. There are several existing bugs.python.org tickets about this and future changes are in the works, but sql… {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Improved (and better documented) support for transactions 652961907
655673896 https://github.com/simonw/sqlite-utils/issues/121#issuecomment-655673896 https://api.github.com/repos/simonw/sqlite-utils/issues/121 MDEyOklzc3VlQ29tbWVudDY1NTY3Mzg5Ng== simonw 9599 2020-07-08T18:08:11Z 2020-07-08T18:08:11Z OWNER I'm with you on most of this. Completely agreed that the CLI should do everything in a transaction. The one thing I'm not keen on is forcing calling code to explicitly start a transaction, for a couple of reasons: 1. It will break all of the existing code out there 2. It doesn't match to how I most commonly use this library - as an interactive tool in a Jupyter notebook, where I'm generally working against a brand new scratch database and any errors don't actually matter So... how about this: IF you wrap your code in a `with db:` block then the `.insert()` and suchlike methods expect you to manage transactions yourself. But if you don't use the context manager they behave like they do at the moment (or maybe a bit more sensibly). That way existing code works as it does today, lazy people like me can call `.insert()` without thinking about transactions, but people writing actual production code (as opposed to Jupyter hacks) have a sensible way to take control of the transactions themselves. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Improved (and better documented) support for transactions 652961907  
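For reference, the standard library behaviour both comments above are working around — `with conn:` commits on success, rolls back on an exception, and leaves the connection open — can be demonstrated directly (a minimal sketch, independent of sqlite-utils):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, name TEXT)")

# sqlite3's "with conn:" block commits on success and rolls back on
# an exception - it does not open or close the connection itself.
try:
    with conn:
        conn.execute("INSERT INTO t (name) VALUES (?)", ("first",))
        raise ValueError("simulated failure part way through")
except ValueError:
    pass  # the insert of "first" was rolled back

with conn:
    conn.execute("INSERT INTO t (name) VALUES (?)", ("second",))

names = [row[0] for row in conn.execute("SELECT name FROM t")]
print(names)  # ['second']
```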
675718593 https://github.com/simonw/datasette/issues/942#issuecomment-675718593 https://api.github.com/repos/simonw/datasette/issues/942 MDEyOklzc3VlQ29tbWVudDY3NTcxODU5Mw== simonw 9599 2020-08-18T21:02:11Z 2020-08-18T21:02:24Z OWNER Easiest solution: if you provide column metadata it gets displayed above the table, something like on https://fivethirtyeight.datasettes.com/fivethirtyeight/antiquities-act%2Factions_under_antiquities_act <img width="500" alt="fivethirtyeight__antiquities-act_actions_under_antiquities_act__344_rows" src="https://user-images.githubusercontent.com/9599/90565187-57d3e700-e15b-11ea-89c8-0270e3040a50.png"> HTML `title=` tooltips are also added to the table headers, which won't be visible on touch devices but that's OK because the information is visible on the page already. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Support column descriptions in metadata.json 681334912  
683173375 https://github.com/simonw/sqlite-utils/pull/142#issuecomment-683173375 https://api.github.com/repos/simonw/sqlite-utils/issues/142 MDEyOklzc3VlQ29tbWVudDY4MzE3MzM3NQ== simonw 9599 2020-08-28T22:29:02Z 2020-08-28T22:29:02Z OWNER Yeah I think that failure is actually because there's a brand new release of Black out and it subtly changes some of the formatting rules. I'll merge this and then run Black against the entire codebase. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} insert_all(..., alter=True) should work for new columns introduced after the first 100 records 688386219  
691501132 https://github.com/dogsheep/twitter-to-sqlite/issues/50#issuecomment-691501132 https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/50 MDEyOklzc3VlQ29tbWVudDY5MTUwMTEzMg== bcongdon 706257 2020-09-12T14:48:10Z 2020-09-12T14:48:10Z NONE This seems to be an issue even with larger values of `--stop_after`: ``` $ twitter-to-sqlite favorites twitter.db --stop_after=2000 Importing favorites [####################################] 198 $ ``` {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} favorites --stop_after=N stops after min(N, 200) 698791218  
693199049 https://github.com/simonw/sqlite-utils/issues/159#issuecomment-693199049 https://api.github.com/repos/simonw/sqlite-utils/issues/159 MDEyOklzc3VlQ29tbWVudDY5MzE5OTA0OQ== simonw 9599 2020-09-16T06:20:26Z 2020-09-16T06:20:26Z OWNER See #121 - I need to think harder about how this all interacts with transactions. You can do this: ```python with db.conn: db["mytable"].delete_where() ``` But that should be documented and maybe rethought. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} .delete_where() does not auto-commit (unlike .insert() or .upsert()) 702386948  
695896557 https://github.com/simonw/datasette/issues/970#issuecomment-695896557 https://api.github.com/repos/simonw/datasette/issues/970 MDEyOklzc3VlQ29tbWVudDY5NTg5NjU1Nw== simonw 9599 2020-09-21T04:40:12Z 2020-09-21T04:40:12Z OWNER The Python standard library has a module for this: https://docs.python.org/3/library/webbrowser.html {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} request an "-o" option on "datasette server" to open the default browser at the running url 705108492  
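A rough sketch of how a `-o` option might use that stdlib module once the server is listening — the `DummyBrowser` registration exists only so the example doesn't launch a real browser window, and the URL is illustrative:

```python
import webbrowser


class DummyBrowser(webbrowser.BaseBrowser):
    """Stand-in browser that records URLs instead of opening windows."""
    opened = []

    def open(self, url, new=0, autoraise=True):
        self.opened.append(url)
        return True


webbrowser.register("dummy", None, DummyBrowser())

# What a hypothetical `datasette serve -o` might do after binding the port:
url = "http://127.0.0.1:8001/"
webbrowser.get("dummy").open(url)
print(DummyBrowser.opened)
```

In the real feature you would simply call `webbrowser.open(url)` and let the module pick the user's default browser.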
696163452 https://github.com/simonw/datasette/issues/670#issuecomment-696163452 https://api.github.com/repos/simonw/datasette/issues/670 MDEyOklzc3VlQ29tbWVudDY5NjE2MzQ1Mg== snth 652285 2020-09-21T14:46:10Z 2020-09-21T14:46:10Z NONE I'm currently using PostgREST to serve OpenAPI APIs off Postgresql databases. I would like to try out datasette once this becomes available on Postgres. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Prototoype for Datasette on PostgreSQL 564833696  
697973420 https://github.com/simonw/datasette/issues/619#issuecomment-697973420 https://api.github.com/repos/simonw/datasette/issues/619 MDEyOklzc3VlQ29tbWVudDY5Nzk3MzQyMA== obra 45416 2020-09-23T21:07:58Z 2020-09-23T21:07:58Z NONE I've just run into this after crafting a complex query and discovered that hitting back loses my query. Even showing me the whole bad query would be a huge improvement over the current status quo. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} "Invalid SQL" page should let you edit the SQL 520655983  
701627158 https://github.com/simonw/sqlite-utils/pull/178#issuecomment-701627158 https://api.github.com/repos/simonw/sqlite-utils/issues/178 MDEyOklzc3VlQ29tbWVudDcwMTYyNzE1OA== simonw 9599 2020-09-30T20:29:11Z 2020-09-30T20:29:11Z OWNER Thanks for the fix! {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Update README.md 709043182  
702265255 https://github.com/simonw/datasette/pull/986#issuecomment-702265255 https://api.github.com/repos/simonw/datasette/issues/986 MDEyOklzc3VlQ29tbWVudDcwMjI2NTI1NQ== simonw 9599 2020-10-01T16:51:45Z 2020-10-01T16:51:45Z OWNER Thanks for taking a look! The fix ended up being a little different from this because I still want to disable faceting on regular single primary keys (since faceting by those won't ever produce interesting results) - here's what I used: https://github.com/simonw/datasette/commit/5d6bc4c268f9f155e59561671f8617addd3e91bc {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Allow facet by primary keys, fixes #985 712889459  
712317638 https://github.com/simonw/datasette/issues/991#issuecomment-712317638 https://api.github.com/repos/simonw/datasette/issues/991 MDEyOklzc3VlQ29tbWVudDcxMjMxNzYzOA== simonw 9599 2020-10-19T17:30:56Z 2020-10-19T17:30:56Z OWNER https://biglocal.datasettes.com/ is one of my larger Datasettes in terms of number of databases. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Redesign application homepage 714377268  
712569695 https://github.com/simonw/datasette/issues/782#issuecomment-712569695 https://api.github.com/repos/simonw/datasette/issues/782 MDEyOklzc3VlQ29tbWVudDcxMjU2OTY5NQ== carlmjohnson 222245 2020-10-20T03:45:48Z 2020-10-20T03:46:14Z NONE I vote against headers. It has a lot of strikes against it: poor discoverability, new developers often don’t know how to use them, makes CORS harder, makes it hard to use eg with JQ, needs ad hoc specification for each bit of metadata, etc. The only advantage of headers is that you don’t need to do .rows, but that’s actually good as a data validation step anyway—if .rows is missing assume there’s an error and do your error handling path instead of parsing the rest. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Redesign default .json format 627794879  
715584579 https://github.com/simonw/datasette/pull/1044#issuecomment-715584579 https://api.github.com/repos/simonw/datasette/issues/1044 MDEyOklzc3VlQ29tbWVudDcxNTU4NDU3OQ== simonw 9599 2020-10-23T20:53:01Z 2020-10-23T20:53:01Z OWNER Thanks for this! {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Add minimum supported python 727916744  
715585140 https://github.com/simonw/datasette/pull/1043#issuecomment-715585140 https://api.github.com/repos/simonw/datasette/issues/1043 MDEyOklzc3VlQ29tbWVudDcxNTU4NTE0MA== simonw 9599 2020-10-23T20:54:29Z 2020-10-23T20:54:29Z OWNER Thanks. I'll push a source release of `asgi-csrf`. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Include LICENSE in sdist 727915394  
716048564 https://github.com/simonw/datasette/issues/1033#issuecomment-716048564 https://api.github.com/repos/simonw/datasette/issues/1033 MDEyOklzc3VlQ29tbWVudDcxNjA0ODU2NA== simonw 9599 2020-10-24T20:08:31Z 2020-10-24T20:08:31Z OWNER Documentation here: https://docs.datasette.io/en/latest/internals.html#datasette-urls {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} datasette.urls.static_plugins(...) method 725099777  
717359145 https://github.com/simonw/sqlite-utils/pull/189#issuecomment-717359145 https://api.github.com/repos/simonw/sqlite-utils/issues/189 MDEyOklzc3VlQ29tbWVudDcxNzM1OTE0NQ== adamwolf 35681 2020-10-27T16:20:32Z 2020-10-27T16:20:32Z CONTRIBUTOR No problem. I added a test. Let me know if it looks sufficient or if you want me to tweak something! If you don't mind, would you tag this PR as "hacktoberfest-accepted"? If you do mind, no problem and I'm sorry for asking :) My kiddos like the shirts. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Allow iterables other than Lists in m2m records 729818242  
718342036 https://github.com/simonw/datasette/issues/1050#issuecomment-718342036 https://api.github.com/repos/simonw/datasette/issues/1050 MDEyOklzc3VlQ29tbWVudDcxODM0MjAzNg== simonw 9599 2020-10-29T03:49:57Z 2020-10-29T03:49:57Z OWNER @thadk from that error it looks like the problem may have been that you had a BLOB column containing a `null` value? If so that's definitely a bug, I'll fix that. {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Switch to .blob render extension for BLOB downloads 729057388  
726412057 https://github.com/simonw/datasette/issues/865#issuecomment-726412057 https://api.github.com/repos/simonw/datasette/issues/865 MDEyOklzc3VlQ29tbWVudDcyNjQxMjA1Nw== simonw 9599 2020-11-12T23:49:23Z 2020-11-12T23:49:23Z OWNER @tballison thanks, I've split that out into a new issue #1091 {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} base_url doesn't seem to work when adding criteria and clicking "apply" 644582921  
735443626 https://github.com/simonw/datasette/issues/1114#issuecomment-735443626 https://api.github.com/repos/simonw/datasette/issues/1114 MDEyOklzc3VlQ29tbWVudDczNTQ0MzYyNg== simonw 9599 2020-11-29T19:40:49Z 2020-11-29T19:40:49Z OWNER Fix is out in 0.52.1: https://docs.datasette.io/en/latest/changelog.html#v0-52-1 {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} --load-extension=spatialite not working with datasetteproject/datasette docker image 752966476  
737463116 https://github.com/simonw/datasette/issues/942#issuecomment-737463116 https://api.github.com/repos/simonw/datasette/issues/942 MDEyOklzc3VlQ29tbWVudDczNzQ2MzExNg== simonw 9599 2020-12-02T20:02:10Z 2020-12-02T20:03:01Z OWNER My idea is that if you installed my proposed plugin you wouldn't need `metadata.json` at all - your metadata would instead live in a table in the connected SQLite database files - either one table per database (so the metadata can live in the same place as the data) or maybe also in a dedicated separate database file, for if you want to add metadata to an otherwise read-only database. The plugin would then provide a UI for editing that metadata - maybe by configuring some writable canned queries or maybe something more custom than that. Or you could edit the metadata by manually editing the SQLite database file (or loading data into it using a tool like [yaml-to-sqlite](https://github.com/simonw/yaml-to-sqlite)). {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} Support column descriptions in metadata.json 681334912  
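The idea sketched above — metadata living in a table inside the database it describes — might look something like this; the `_metadata` table name and the key/JSON-value layout are assumptions for illustration, not a settled design:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical per-database metadata table
conn.execute(
    "CREATE TABLE IF NOT EXISTS _metadata (key TEXT PRIMARY KEY, value TEXT)"
)
# Store column descriptions as a JSON blob, editable via SQL,
# a plugin UI, or a loader tool like yaml-to-sqlite
conn.execute(
    "INSERT OR REPLACE INTO _metadata VALUES (?, ?)",
    ("columns:dogs", json.dumps({"name": "The dog's name"})),
)
conn.commit()

row = conn.execute(
    "SELECT value FROM _metadata WHERE key = ?", ("columns:dogs",)
).fetchone()
print(json.loads(row[0]))
```

Because the metadata travels inside the `.db` file, publishing the database publishes its documentation with it.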

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);