use-http
Race condition on mounting multiple components with requests
**Describe the bug**
A race condition causes some requests to never be sent, so they stay loading forever.
**Codesandbox:** https://codesandbox.io/s/falling-shape-3gcsu
This is what I see; since it is a race condition it doesn't always trigger, but it does quite often for me.
**To Reproduce**
Steps to reproduce the behavior:
- Create a few requests in flight
- Before the first few complete, mount more components that cause more requests.
- Note that some/all of the new requests never complete and are loading "forever".
- Note that sometimes, when one of the new components is unmounted while stuck in this loading state, a React invariant exception occurs because setLoading(false) is called in a finally block, which is not allowed on an unmounted component (a mounted-flag guard, sketched below, avoids this).
**Expected behavior**
All requests should complete successfully.
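For context on that last repro step: React warns when a state setter runs after unmount, which is what happens when the `finally` block fires for a component that has already gone away. Here is a minimal sketch of the kind of mounted-flag guard that avoids it (hypothetical code, not use-http's actual internals; `useGuardedLoading` and `run` are made-up names):

```js
import { useEffect, useRef, useState } from 'react'

// Hypothetical sketch: track mounted state in a ref so a late `finally`
// block can skip calling setLoading on an unmounted component.
function useGuardedLoading() {
  const [loading, setLoading] = useState(true)
  const mounted = useRef(true)

  useEffect(() => {
    return () => { mounted.current = false } // flipped on unmount
  }, [])

  async function run(promise) {
    try {
      return await promise
    } finally {
      if (mounted.current) setLoading(false) // only setState while still mounted
    }
  }

  return { loading, run }
}
```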
You are seriously amazing for helping bulletproof this! ❤️ Again, I will get on this as soon as possible.
I tested the exact same code you have in the codesandbox at least 20 times on my local machine and was not able to reproduce the bug. I'll try some more, but it's possible this is an issue with codesandbox.
Update: I was able to reproduce.
Aha, here's the problem. The default `cachePolicy` is `cache-first` (meaning make 1 request, and use the cached response value for every request after that), and since you are making a bunch of identical requests, it's not making the requests and is returning the cached response for each. I guess it's not able to store the 1st HTTP request's response in the cache fast enough before the other 3 requests in the 1st batch are made 🐛. So that's why all of the 1st 4 go `true` -> `false`.
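To illustrate the window where that race lives, here is a simplified cache-first flow (a sketch only, not use-http's real code; `cacheFirstFetch` is a made-up name). The gap between the cache miss and the cache write is where several identical requests can slip through, and the cache-hit path is the one that can return without ever resetting `loading`:

```js
// Simplified cache-first flow (illustrative sketch, not use-http's implementation).
const cache = new Map()

async function cacheFirstFetch(key, url, options) {
  if (cache.has(key)) {
    // Cache hit: the stored value is returned immediately. If the hook's
    // loading state isn't reset on this path, the component stays "loading".
    return cache.get(key)
  }
  // Cache miss: every identical request that starts before the first
  // response is written below will also miss and fire its own fetch.
  const res = await fetch(url, options)
  const json = await res.json()
  cache.set(key, json) // written only after the response arrives
  return json
}
```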
Now, when we switch the count to 50: since the request is the same for them all, and `loading` is set to `true` initially when we have `[]` as the dependency array, the cached response is returned without setting `loading` to `false` because it never makes it that far. I'm still not 100% sure why the 1st 4 still remain `false` while the rest are `true`. Stay tuned.
Current workarounds until I get a fix for this:
- Use `cachePolicy: 'no-cache'`:
```js
import React from 'react'
import { useFetch } from 'use-http'

function Thing() {
  const { loading, error, data } = useFetch(`https://httpbin.org/post`, {
    cachePolicy: 'no-cache',
    headers: {
      Accept: "application/json",
      "Content-Type": "application/json"
    },
    method: "POST",
    body: JSON.stringify({ width: 100 })
  }, [])
  return <div>Loading: {String(loading)}</div>
}
```
- Don't make the same request a lot of times. You shouldn't be making requests in an array like this anyway, but even if you had to, the requests would all look different if you used IDs from the various items. So your code would look more like:
```js
import React, { useState } from 'react'
import { Provider, useFetch } from 'use-http'
// useTimeout: assumed here to be a hook that runs the callback once after the given delay

function Thing({ i }) {
  const { loading, error, data } = useFetch(`https://httpbin.org/post`, {
    headers: {
      Accept: "application/json",
      "Content-Type": "application/json"
    },
    method: "POST",
    body: JSON.stringify({ width: i }) // where this would be an ID of whatever item this is
  }, [])
  console.log(`loading ${i}: `, loading)
  return <div style={{ fontSize: 12 }}>Loading: {i} {String(loading)}</div>
}

const BugRaceCondition = () => {
  const [more, setMore] = useState(false)
  useTimeout(() => setMore(true), 1000)
  return (
    <Provider url='https://httpbin.org/delay/3'>
      <div className="App">
        <h1>Hello CodeSandbox</h1>
        <h2>Start editing to see some magic happen!</h2>
        {Array.from({ length: more ? 50 : 4 }, (x, i) => <Thing key={i} i={i} />)}
      </div>
    </Provider>
  )
}
```
It is worth noting that in my real application they are all different POST requests (same endpoint, different body). At least they should be 🤔. They should really be GETs, but that's an open issue with the API writer.
For now I ended up making my own lightweight useFetch.
For what it is worth, all these requests are for sparklines in an infinite scroller, so when you scroll quickly, random sparklines never finish loading.
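A lightweight replacement hook along those lines (no caching, one request per mount, state updates skipped after unmount) could look roughly like this; it is a sketch of the idea, not the exact hook used here:

```js
import { useEffect, useState } from 'react'

// Sketch of a no-cache useFetch substitute: each mount fires its own request,
// and an `active` flag prevents setState after unmount.
function useSimpleFetch(url, options) {
  const [state, setState] = useState({ loading: true, error: null, data: null })

  useEffect(() => {
    let active = true
    fetch(url, options)
      .then(res => res.json())
      .then(data => { if (active) setState({ loading: false, error: null, data }) })
      .catch(error => { if (active) setState({ loading: false, error, data: null }) })
    return () => { active = false }
    // re-run only when the URL changes; options are assumed stable per mount
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, [url])

  return state
}
```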
Double check that they really are different bodies. The way the cache works is we save the key as `url:https://your-endpoint.com/whatever||method:POST||body:YOUR_JSON_STRINGIFIED_BODY` and the cached value as the response body. (The cached value will change in the future to be the response to the request, but for now it's "essentially" the `response.json()`.)
Since the key includes the body, if the bodies are different for each request, it should work. I will look into this some more asap.
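As a quick sanity check of that, building a key in the format described above shows that two requests with different bodies can never collide in the cache (the helper name here is made up):

```js
// Build a key in the format described above: url, method, and the
// stringified body joined with '||'. Different bodies => different keys.
function cacheKey(url, method, body) {
  return `url:${url}||method:${method}||body:${body}`
}

const a = cacheKey('https://httpbin.org/post', 'POST', JSON.stringify({ width: 1 }))
const b = cacheKey('https://httpbin.org/post', 'POST', JSON.stringify({ width: 2 }))
console.log(a === b) // false: distinct bodies produce distinct cache entries
```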
I'm currently juggling some unrelated work, so it might be a bit before I get around to checking on the bodies.