html_url,issue_url,id,node_id,user,user_label,created_at,updated_at,author_association,body,reactions,issue,issue_label,performed_via_github_app
https://github.com/simonw/datasette/issues/1198#issuecomment-766428183,https://api.github.com/repos/simonw/datasette/issues/1198,766428183,MDEyOklzc3VlQ29tbWVudDc2NjQyODE4Mw==,9599,simonw,2021-01-24T20:40:37Z,2021-01-24T20:40:37Z,OWNER,https://docs.datasette.io/en/latest/testing_plugins.html#testing-outbound-http-calls-with-pytest-httpx,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792625812,Plugin testing documentation on using pytest-httpx,
https://github.com/simonw/datasette/issues/1197#issuecomment-766430111,https://api.github.com/repos/simonw/datasette/issues/1197,766430111,MDEyOklzc3VlQ29tbWVudDc2NjQzMDExMQ==,9599,simonw,2021-01-24T20:53:40Z,2021-01-24T20:53:40Z,OWNER,"https://devcenter.heroku.com/articles/slug-compiler#slug-size says that the maximum allowed size is 500MB - my hunch is that the Datasette application itself weighs in at only a dozen or so MB but I haven't measured it. So I would imagine anything up to around 450MB should work OK on Heroku. Cloud Run works for up to about 2GB in my experience.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",791381623,DB size limit for publishing with Heroku,
https://github.com/simonw/datasette/issues/1190#issuecomment-766430644,https://api.github.com/repos/simonw/datasette/issues/1190,766430644,MDEyOklzc3VlQ29tbWVudDc2NjQzMDY0NA==,9599,simonw,2021-01-24T20:57:03Z,2021-01-24T20:57:03Z,OWNER,"I really like this idea. It feels like an opportunity for a plugin that adds two things: an API endpoint to Datasette for accepting uploaded databases, and a `datasette publish upload` subcommand which can upload files to that endpoint (with some kind of authentication mechanism).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098146,`datasette publish upload` mechanism for uploading databases to an existing Datasette instance,
https://github.com/simonw/datasette/issues/1190#issuecomment-766433153,https://api.github.com/repos/simonw/datasette/issues/1190,766433153,MDEyOklzc3VlQ29tbWVudDc2NjQzMzE1Mw==,9599,simonw,2021-01-24T21:13:25Z,2021-01-24T21:13:25Z,OWNER,"This ties in to a bunch of other ideas that are in flight at the moment. If you're publishing databases by uploading them, how do you attach metadata? Ideally by baking it into the database file itself, using the mechanism from #1169. How could this interact with the `datasette insert` concept from #1163? Could you pass a CSV file to the `upload` command and have that converted and uploaded for you, or would you create the database file locally using `datasette insert` and then upload it as a separate `datasette upload` step?
Lots to think about here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098146,`datasette publish upload` mechanism for uploading databases to an existing Datasette instance,
https://github.com/simonw/datasette/issues/1179#issuecomment-766434629,https://api.github.com/repos/simonw/datasette/issues/1179,766434629,MDEyOklzc3VlQ29tbWVudDc2NjQzNDYyOQ==,9599,simonw,2021-01-24T21:23:47Z,2021-01-24T21:23:47Z,OWNER,I'm just going to do `path` and `full_path` (which includes the querystring)`. The `datasette.absolute_url()` method can be used by plugins that need the full URL.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks,
https://github.com/simonw/datasette/issues/1154#issuecomment-766462197,https://api.github.com/repos/simonw/datasette/issues/1154,766462197,MDEyOklzc3VlQ29tbWVudDc2NjQ2MjE5Nw==,9599,simonw,2021-01-24T23:47:06Z,2021-01-24T23:47:06Z,OWNER,"I'm going to document this but mark it as unstable, using a new documentation convention for marking unstable APIs.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771208009,Documentation for new _internal database and tables,
https://github.com/simonw/datasette/issues/1202#issuecomment-766462475,https://api.github.com/repos/simonw/datasette/issues/1202,766462475,MDEyOklzc3VlQ29tbWVudDc2NjQ2MjQ3NQ==,9599,simonw,2021-01-24T23:49:28Z,2021-01-24T23:50:33Z,OWNER,"Can use an ""admonition"" similar to this:
```sphinx
.. warning::
    Restricting access to tables and views in this way will NOT prevent users from querying them using arbitrary SQL queries, `like this `__ for example.
```
As seen on https://docs.datasette.io/en/stable/authentication.html#controlling-access-to-specific-tables-and-views
Documentation: https://docutils.sourceforge.io/docs/ref/rst/directives.html#specific-admonitions","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792931244,Documentation convention for marking unstable APIs.,
https://github.com/simonw/datasette/issues/1090#issuecomment-766463496,https://api.github.com/repos/simonw/datasette/issues/1090,766463496,MDEyOklzc3VlQ29tbWVudDc2NjQ2MzQ5Ng==,9599,simonw,2021-01-24T23:57:00Z,2021-01-24T23:57:00Z,OWNER,Related: I built [datasette-leaflet-freedraw](https://datasette.io/plugins/datasette-leaflet-freedraw) which turns any canned query field called `freedraw` or `something_freedraw` into an interactive map that you can draw on to create a GeoJSON MultiPolygon.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",741862364,Custom widgets for canned query forms,