Replies: 1 comment
-
I would see something like rate-limiting, or even debouncing / throttling, as something that needs to happen on the API layer. React Query only cares about the produced promise; it is not tied to "data fetching". There are some similar discussions on related topics where the stance has always been that this will likely not make it into the library. To quote myself:
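The API-layer approach described above could be sketched like this: wrap the query function in a limiter before handing it to `useQuery`, so that React Query still only sees an ordinary promise. `createRateLimiter` and its parameters are illustrative, not part of TanStack Query or any other library.

```typescript
type Task<T> = () => Promise<T>;

// Allows at most `limit` calls per sliding `intervalMs` window;
// excess calls wait in a FIFO queue until the window has room.
function createRateLimiter(limit: number, intervalMs: number) {
  let timestamps: number[] = [];
  const queue: Array<() => void> = [];

  function drain(): void {
    const now = Date.now();
    // Drop calls that have left the sliding window.
    timestamps = timestamps.filter((t) => now - t < intervalMs);
    while (queue.length > 0 && timestamps.length < limit) {
      timestamps.push(Date.now());
      queue.shift()!();
    }
    if (queue.length > 0) {
      // Re-check when the oldest call exits the window.
      const wait = intervalMs - (Date.now() - timestamps[0]);
      setTimeout(drain, Math.max(wait, 1));
    }
  }

  return function schedule<T>(task: Task<T>): Promise<T> {
    return new Promise<T>((resolve, reject) => {
      queue.push(() => {
        task().then(resolve, reject);
      });
      drain();
    });
  };
}
```

A query would then use it without React Query knowing anything about rate limits, e.g. `const limited = createRateLimiter(5, 1000);` and `useQuery({ queryKey: ['feed'], queryFn: () => limited(fetchFeed) })`.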
-
This could be an entire new library, but I think the idea really fits TanStack Query.
I noticed the problem while developing a SPA using Vue Query. Constant hot reloading and page refreshes usually ended with a few 429 Too Many Requests responses popping up. It's not impossible to implement a solution using some throttling, retryDelay, an external rate-limiting library, etc., but I think it would be much easier if we could quickly implement something right on useQuery that properly batches requests or delays them according to a straightforward rate-limit ruleset.

Example:
My page has connections with two services: an Infinite Timeline one and a General one for everything else. They have the following rules:
To make TanStack Query respect these rules, I want to be able to just declare them like:
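The post's actual rules and code block are not shown in this excerpt. As one hypothetical illustration of what such a declaration might look like (not a real TanStack Query API, and with made-up names and numbers):

```typescript
// Hypothetical per-service rate-limit ruleset. All identifiers and
// values here are invented for illustration, since the original
// post's rules are not reproduced above.
interface RateLimitRule {
  maxRequests: number; // calls allowed...
  perMs: number;       // ...within this sliding window
}

const limiters: Record<string, RateLimitRule> = {
  timeline: { maxRequests: 10, perMs: 1_000 },
  general: { maxRequests: 60, perMs: 60_000 },
};

// A query could then opt in by naming its limiter, e.g.:
// useQuery({ queryKey: ['feed'], queryFn: fetchFeed, limiter: 'timeline' })
```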
And what happens inside a QueryLimiter is that, if a query has been invoked and calling the query function would break one of the rate limits, the query goes into a queue until the limiter has room to call it. useQuery could even get a new isQueued state.

I think the biggest advantage of this approach is that, instead of needing to fine-tune staleTimes around the application to try to reduce load on certain APIs, it makes the application respect a sane limit defined by the API itself.
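The queueing behaviour proposed above could be sketched roughly as follows. For brevity this uses a concurrency cap rather than a requests-per-window rule, but the queue-until-room mechanism is the same; the class and its members are hypothetical, not TanStack Query code.

```typescript
// Hypothetical QueryLimiter: holds a query in a queue until a slot
// frees up. queuedCount corresponds to what the proposed isQueued
// state on useQuery would reflect per query.
class QueryLimiter {
  private running = 0;
  private waiting: Array<() => void> = [];

  constructor(private maxConcurrent: number) {}

  // Number of queries currently parked in the queue.
  get queuedCount(): number {
    return this.waiting.length;
  }

  async run<T>(queryFn: () => Promise<T>): Promise<T> {
    if (this.running >= this.maxConcurrent) {
      // No room: queue the query until a finishing one hands over its slot.
      await new Promise<void>((resolve) => this.waiting.push(resolve));
    } else {
      this.running++;
    }
    try {
      return await queryFn();
    } finally {
      const next = this.waiting.shift();
      if (next) next(); // hand the slot directly to the next queued query
      else this.running--;
    }
  }
}
```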