Provide better experience accessing county-level data #1619

Open
@brookslogan

Description

This query

dat <- epidatr::pub_covidcast("doctor-visits", "smoothed_adj_cli", "county", "day")

seems to produce a nondeterministic mix of problems:

  • "Operation timed out after 900006 milliseconds with 0 bytes received" (on @XuedaShen's system)
  • Raising an error because of low bytes/sec (on my system)
  • (Potentially making the system unresponsive and then) getting killed by the out-of-memory (OOM) killer (on my system). I'm estimating (scaling state-level data by a rough count of counties per state) that the data as an R object would be ~0.7GB and as a JSON string around ~1.5GB; my system has ~32GB RAM, with 8.5GB "free" at the moment, so hypothetically it should be able to handle it in RAM without OOM killing, but spilling to disk might be an option too.

The last one seems like something epidatr could potentially help out with; I've opened a companion issue there. The first two seem like potential API server issues or query performance issues.
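
A rough client-side workaround, not something epidatr does for you automatically, would be to chunk the request into smaller time windows so each response stays small. The sketch below assumes epidatr::epirange() and the default geo_values = "*"; the date window is illustrative.

# Sketch: fetch the county-level signal in month-sized chunks to keep each
# response small enough to avoid timeouts / OOM killing. Dates are illustrative.
library(epidatr)

month_starts <- seq(as.Date("2020-02-01"), as.Date("2021-12-01"), by = "month")

chunks <- lapply(month_starts, function(start) {
  end <- seq(start, by = "month", length.out = 2)[2] - 1  # last day of this month
  pub_covidcast(
    "doctor-visits", "smoothed_adj_cli", "county", "day",
    time_values = epirange(format(start, "%Y%m%d"), format(end, "%Y%m%d"))
  )
})

dat <- do.call(rbind, chunks)

Chunking over geo_values instead (batches of county FIPS codes) would be another way to slice it, if per-day volume turns out to be the bottleneck.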

Directly accessing the associated URL in Firefox does make progress, streaming a whole bunch of plain-text output; I assume at the end it will be formatted into a fancy JSON view, but I would prefer it to just save to a file when the output is this large. Presumably there's some header that could be set to make that happen?
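
For the save-to-file part, I don't know the right header offhand; as a stopgap you can pull the raw response straight to disk from R instead of letting the browser try to render it. "<query-url>" below is a placeholder for the actual query URL.

# Save the raw API response to a local file rather than rendering it in the browser.
# "<query-url>" is a placeholder for the actual covidcast query URL.
download.file("<query-url>", destfile = "doctor-visits-county.json", mode = "wb")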
