Batching #365
-
So yesterday I wrote up some thoughts on batching (see https://github.com/tannerlinsley/react-query/discussions/364), but this was too eager of me. Properly supporting batching from the outside means working against the grain of the library. A few examples of the weird situations I ran into while trying to "wrap" batching around it:
This all brings me to the "obvious" conclusion that batching, if implemented at all, should live in react-query itself, not in my wrapper logic. I've seen the same conclusion elsewhere, e.g. in the GraphQL world, where people try to make data specifications and the necessary queries part of component design. Then, for example, any number of queries requested within 50 ms of each other would be batched by default. I think this is generally doable, effort aside (which I'm willing to put in), and it would amount to:
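To make the "queries within 50 ms of each other" idea concrete, here is a minimal sketch of a time-window batcher, written outside react-query. The `fetchBatch` function is a hypothetical endpoint that resolves many keys in one request; everything here is illustrative, not an actual react-query API.

```js
// Collect every key requested during a time window, then resolve them all
// with a single call to `fetchBatch` (a hypothetical bulk endpoint).
function createWindowBatcher(fetchBatch, windowMs = 50) {
  let pending = []; // [{ key, resolve, reject }]
  let timer = null;

  return function enqueue(key) {
    return new Promise((resolve, reject) => {
      pending.push({ key, resolve, reject });
      if (timer === null) {
        // First key of this window: schedule one flush for the whole batch.
        timer = setTimeout(async () => {
          const batch = pending;
          pending = [];
          timer = null;
          try {
            // One request for every key collected during the window.
            const results = await fetchBatch(batch.map((q) => q.key));
            batch.forEach((q, i) => q.resolve(results[i]));
          } catch (err) {
            batch.forEach((q) => q.reject(err));
          }
        }, windowMs);
      }
    });
  };
}
```

Each query function would then call `enqueue(key)` instead of hitting the network directly, so sibling queries mounted in the same window share a single request.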
The one major design problem seems to be that react-query is totally agnostic of the query function: a developer might use radically different query functions for different calls. As a result, it hardly makes sense to install a single global batching option. Another design could be to "supercharge" the query function argument. E.g.:
An example:

```js
function Page() {
  const { invoiceId } = useParams();
  const { data: me } = useApi("me", myApiFetcher);
  const { data: invoice } = useApi(["invoice", invoiceId], myApiFetcher);
}

function myApiFetcher(method, possibleParam) {
  // implemented in some way
}

myApiFetcher.batch = (queries) => {
  // e.g. queries[0] might be { key: ["me"] } and queries[1] = { key: ["invoice", 2] }
  // implemented in some way
};
```

Any thoughts? Is this something that you would be interested in seeing implemented in the short or long term? What would be a good API design?
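As a thought experiment, here is one way a wrapper (or react-query itself) could honor the proposed `queryFn.batch` contract: calls through the same fetcher within one tick are collected and handed to `fetcher.batch` as an array of `{ key }` objects. This is a hedged sketch of the idea above, not existing react-query behavior; `batchingFetch` is a name invented here.

```js
// Wrap a fetcher so that, if it exposes a `batch` function, all calls made
// in the same microtask turn are grouped into one `fetcher.batch` call.
function batchingFetch(fetcher) {
  if (typeof fetcher.batch !== "function") {
    // No batch support: call the fetcher per query, spreading the key parts.
    return (key) => fetcher(...[].concat(key));
  }
  let queue = null;
  return (key) =>
    new Promise((resolve, reject) => {
      if (queue === null) {
        queue = [];
        // Flush on the microtask queue, so sibling useApi calls issued in
        // the same render pass end up in the same batch.
        queueMicrotask(async () => {
          const batch = queue;
          queue = null;
          try {
            const results = await fetcher.batch(
              batch.map((q) => ({ key: q.key }))
            );
            batch.forEach((q, i) => q.resolve(results[i]));
          } catch (err) {
            batch.forEach((q) => q.reject(err));
          }
        });
      }
      queue.push({ key: [].concat(key), resolve, reject });
    });
}
```

A hypothetical `useApi` could route every query through `batchingFetch(myApiFetcher)` and get batching transparently whenever the fetcher opts in.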
Replies: 10 comments 31 replies
-
What are the benefits of batching?
-
Looking into this too. In our case, we can avoid the …

I think your idea of looking for a …

If RQ had this built-in, we'd be passing …

Thinking about how this would be designed, I came to the same conclusion that implementing this within RQ would make sense. It may be doable outside RQ, but it still needs to orchestrate the query cache to track queries/batches. The way I was going to explore it was by creating a custom …

I made a visualization of how to think about this. In our case, it's a little more sophisticated because we essentially want to "combine" variables to send in one backend request and then distribute that response back to each query (probably using a selector or something equivalent).
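The "combine variables, then distribute the response" step described above can be sketched as a small resolver. The queued-query shape, the `fetchUsers` bulk call, and its response format are all assumptions made for illustration:

```js
// Combine the variables of all queued queries into one backend request,
// then distribute the response back to each query via a key lookup
// (the "selector" step). `fetchUsers` is a hypothetical bulk fetcher.
async function resolveBatch(queued, fetchUsers) {
  // queued: [{ variables: { id }, resolve }, ...]
  const ids = queued.map((q) => q.variables.id);      // combine
  const response = await fetchUsers({ ids });         // one request
  const byId = new Map(response.users.map((u) => [u.id, u]));
  for (const q of queued) {
    q.resolve(byId.get(q.variables.id));              // distribute
  }
}
```

The lookup map plays the role of the selector: each query only receives the slice of the combined response that matches its own variables.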
-
I keep coming back to this use case, which I would love to solve with react-query. I have thought about writing …
-
I was able to write a batching mechanism by combining react-query with dataloader, and it works remarkably well. I built a little demonstration CodeSandbox: https://codesandbox.io/s/react-query-dataloader-lgosp

Key points:

What I like about this is that it lets react-query do what it's great at, and achieves batching with just a thin layer on top.
-
I've implemented batching as well in react-admin, where multiple components each querying one record by id on mount lead to a single query for an array of ids (see implementation). This is equivalent to using a DataLoader. It works fine, but I have a performance problem: when the aggregated query (something like …

So my question is more about batched updates than batched queries: if two react-query queries resolve in the same tick, how can I batch the updates so that React makes only one render? I tried using …
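One possible answer, assuming react-query v3 and React 17: react-query routes its cache notifications through a `notifyManager`, which exposes `setBatchNotifyFunction`, and react-dom exports `unstable_batchedUpdates`. Wiring them together (as react-query's own React Native setup docs suggest) makes queries that resolve in the same tick trigger a single render. In React 18, state updates are batched automatically, so this wiring should be unnecessary there.

```js
// Wiring sketch (react-query v3 + React 17): batch react-query's cache
// notifications through react-dom so that several queries resolving in the
// same tick cause only one render.
import { notifyManager } from "react-query";
import { unstable_batchedUpdates } from "react-dom";

notifyManager.setBatchNotifyFunction(unstable_batchedUpdates);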
-
I am working on a hook that batches the requests for react-query cache misses on a list of ids. When I set …

If you call …
-
Just wrote a library to help with this exact issue using react-query. Check it out at https://github.com/yornaath/batshit. Here is an example:

```tsx
import { useQuery } from "react-query"
import { Batcher, windowScheduler, keyResolver } from "@yornaath/batshit"

type User = { id: number, name: string }

// Batch all user fetches made within a 10 ms window into one request.
const users = Batcher<User, number>({
  fetcher: async (ids) => {
    return api.users.where({
      userId_in: ids,
    })
  },
  resolver: keyResolver("id"),    // match each result back to its id
  scheduler: windowScheduler(10), // 10 ms batching window
})

const useUser = (id: number) => {
  return useQuery(["users", id], async () => {
    return users.fetch(id)
  })
}

const UserDetails = (props: { userId: number }) => {
  const { isFetching, data } = useUser(props.userId)
  return (
    <>
      {isFetching ? (
        <div>Loading user {props.userId}</div>
      ) : (
        <div>User: {data?.name}</div>
      )}
    </>
  )
}

/**
 * Since all user detail items are rendered within the window,
 * only one request will be made.
 */
const UserList = () => {
  const userIds = [1, 2, 3, 4]
  return (
    <>
      {userIds.map((id) => (
        <UserDetails key={id} userId={id} />
      ))}
    </>
  )
}
```
-
I am a bit curious why there is no built-in support for such a common feature as batching.
-
Hey guys, if you're having problems trying to implement this with Suspense and dataloader, it may be due to an issue with concurrent React. I've faced it recently and decided to write a small blog post: https://dev.to/tsirlucas/integrating-dataloader-with-concurrent-react-53h1

If you're using batshit, its default 10 ms scheduler probably already fixes the issue.