fivebeans
Release/Destroy fails to fire when using multiple reserves in parallel
If I launch, say, 20 reserves at the same time, the release/destroy for each of them never fires. If I lower the number of parallel requests to, say, 9, it works fine.
Is there some type of edge case I'm hitting?
Example: I have one job in the queue. I spin up 20 requests to beanstalkd. The reserve works, but the destroy inside the reserve callback doesn't. If I only spin up one request, I can reserve/destroy just fine.
Maybe it has something to do with multiple connections? Here is a little test script. I want 20 fetches per second. The first second goes pretty well, but from the second one on, only one task is reserved and nothing is deleted. If I put a console.log inside the setInterval, I see it ticking just fine.
const beans = require('fivebeans');
const client = new beans.client();

client.on('connect', () =>
{
    client.watch('testTube4', function(err, tubename)
    {
        setInterval(() =>
        {
            for (let i = 0; i < 20; i++)
            {
                client.reserve(function(err, jobid, payload)
                {
                    console.log(`reserve: ${jobid}, err:${err}`);
                    client.destroy(jobid, function(err)
                    {
                        console.log(`destroy: ${jobid}, err:${err}`);
                    });
                });
            }
        }, 1000);
    });
}).connect();
1st try: stops at 1 reserve and nothing else comes through.
vagrant@packer-virtualbox-iso-1490879070:~/sync$ node test.js
reserve: 101099, err:null
reserve: 101100, err:null
reserve: 101101, err:null
reserve: 101102, err:null
reserve: 101103, err:null
reserve: 101104, err:null
reserve: 101105, err:null
reserve: 101106, err:null
reserve: 101107, err:null
reserve: 101108, err:null
reserve: 101109, err:null
reserve: 101110, err:null
reserve: 101111, err:null
reserve: 101112, err:null
reserve: 101113, err:null
reserve: 101114, err:null
reserve: 101115, err:null
reserve: 101116, err:null
reserve: 101117, err:null
reserve: 101118, err:null
destroy: 101099, err:null
destroy: 101100, err:null
destroy: 101101, err:null
destroy: 101102, err:null
destroy: 101103, err:null
destroy: 101104, err:null
destroy: 101105, err:null
destroy: 101106, err:null
destroy: 101107, err:null
destroy: 101108, err:null
destroy: 101109, err:null
destroy: 101110, err:null
destroy: 101111, err:null
destroy: 101112, err:null
destroy: 101113, err:null
destroy: 101114, err:null
destroy: 101115, err:null
destroy: 101116, err:null
destroy: 101117, err:null
destroy: 101118, err:null
reserve: 101119, err:null
reserve: 101120, err:null
reserve: 101121, err:null
reserve: 101122, err:null
reserve: 101123, err:null
reserve: 101124, err:null
reserve: 101125, err:null
reserve: 101126, err:null
reserve: 101127, err:null
reserve: 101128, err:null
reserve: 101129, err:null
reserve: 101130, err:null
reserve: 101131, err:null
reserve: 101132, err:null
reserve: 101133, err:null
reserve: 101134, err:null
reserve: 101135, err:null
reserve: 101136, err:null
reserve: 101137, err:null
reserve: 101138, err:null
destroy: 101119, err:null
destroy: 101120, err:null
destroy: 101121, err:null
destroy: 101122, err:null
destroy: 101123, err:null
destroy: 101124, err:null
destroy: 101125, err:null
destroy: 101126, err:null
destroy: 101127, err:null
destroy: 101128, err:null
destroy: 101129, err:null
destroy: 101130, err:null
destroy: 101131, err:null
destroy: 101132, err:null
destroy: 101133, err:null
destroy: 101134, err:null
destroy: 101135, err:null
destroy: 101136, err:null
destroy: 101137, err:null
destroy: 101138, err:null
reserve: 101139, err:null
Still doing 20 at the same time, only 1 in the tube, but destroy doesn't trigger.
vagrant@packer-virtualbox-iso-1490879070:~/sync$ node test.js
reserve: 101139, err:null
I changed it to do only 1 at a time and now the destroy works?!
vagrant@packer-virtualbox-iso-1490879070:~/sync$ node test.js
reserve: 101139, err:null
destroy: 101139, err:null
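My best guess at the cause: reserve is a blocking command at the beanstalkd protocol level, so once one of the 20 queued reserves is waiting on an empty tube, every command queued behind it on that single connection (including the destroys) just sits there. If that's right, reserve_with_timeout should at least keep the connection from wedging. This is only a sketch of the idea, assuming fivebeans exposes reserve_with_timeout as its README suggests; I haven't verified it end to end:

const beans = require('fivebeans');
const client = new beans.client();

client.on('connect', () =>
{
    client.watch('testTube4', function(err, tubename)
    {
        for (let i = 0; i < 20; i++)
        {
            // Give up after 1 second instead of blocking the connection
            // indefinitely when the tube is empty.
            client.reserve_with_timeout(1, function(err, jobid, payload)
            {
                if (err)
                {
                    // Probably 'TIMED_OUT' when no job was available.
                    console.log(`reserve gave up: ${err}`);
                    return;
                }
                console.log(`reserve: ${jobid}`);
                client.destroy(jobid, function(err)
                {
                    console.log(`destroy: ${jobid}, err:${err}`);
                });
            });
        }
    });
}).connect();

The queued reserves would still take turns timing out, so the destroys would come through late rather than never; for real parallelism I suspect each reserve needs its own connection.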
For now, I think I just have to work around it: instead of X reserves in parallel, I do one at a time with less delay between them, roughly like the sketch below.
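A minimal sketch of that workaround, just to show the shape of it (reserveNext and the 50ms delay are mine, not anything from fivebeans):

const beans = require('fivebeans');
const client = new beans.client();

// Reserve and destroy one job at a time; only start the next reserve
// once the previous destroy has come back.
function reserveNext()
{
    client.reserve(function(err, jobid, payload)
    {
        if (err)
        {
            console.log(`reserve err: ${err}`);
            return setTimeout(reserveNext, 50);
        }
        console.log(`reserve: ${jobid}`);
        client.destroy(jobid, function(err)
        {
            console.log(`destroy: ${jobid}, err:${err}`);
            setTimeout(reserveNext, 50); // small delay instead of 20 in parallel
        });
    });
}

client.on('connect', () =>
{
    client.watch('testTube4', function(err, tubename)
    {
        reserveNext();
    });
}).connect();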
Closing; this repo is clearly no longer maintained.