issue_comments: 812680519
html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | reactions | issue | performed_via_github_app
---|---|---|---|---|---|---|---|---|---|---
https://github.com/simonw/datasette/issues/1255#issuecomment-812680519 | https://api.github.com/repos/simonw/datasette/issues/1255 | 812680519 | MDEyOklzc3VlQ29tbWVudDgxMjY4MDUxOQ== | 1111743 | 2021-04-02T19:37:57Z | 2021-04-02T19:37:57Z | NONE | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | 826700095 |

body:

Hello, I'm also experiencing a timeout in my environment. I don't know whether I need more indexes or a more powerful system.

My data has 1,271,111 rows, and when I try to create a facet it times out. I've tried this on two different columns that should significantly filter down the data: `CITY` and `PARTY_REG`. Simon's johns_hopkins_csse_daily_reports has more rows and is set up with two facets on load. He does have four indexes created, though.

Do I need more indexes? I have one simple one so far:

```
CREATE INDEX [idx_party_reg] ON [county_active] ([PARTY_REG]);
```

I'm running Datasette 0.56, installed via pip, with Python 3.7.3, on `4.19.0-10-amd64 #1 SMP Debian 4.19.132-1 (2020-07-24) x86_64 GNU/Linux`.

```
$ cat /etc/os-release
PRETTY_NAME="Debian GNU/Linux 10 (buster)"
NAME="Debian GNU/Linux"
VERSION_ID="10"
VERSION="10 (buster)"
VERSION_CODENAME=buster
```
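
As a rough sketch of the additional indexing the comment is asking about (assuming the `county_active` table and `CITY` column named above; the `idx_city` name is made up for illustration), a single-column index on each faceted column is the usual first step:

```
-- Hypothetical extra index for the CITY facet, mirroring the existing PARTY_REG index.
CREATE INDEX [idx_city] ON [county_active] ([CITY]);

-- Refresh SQLite's query-planner statistics so the new index is considered.
ANALYZE;
```

If facet queries still time out after indexing, Datasette's `facet_time_limit_ms` and `sql_time_limit_ms` settings can also be raised, e.g. `datasette county_active.db --setting facet_time_limit_ms 1000` (the database filename here is illustrative), at the cost of slower page loads.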