MissingRequiredParameter: Missing required key 'RequestItems' in params
Hello Geeks,
When performing batch get items in a loop for cron jobs, we are getting the error below:
{ [MissingRequiredParameter: Missing required key 'RequestItems' in params]
  message: 'Missing required key \'RequestItems\' in params',
  code: 'MissingRequiredParameter',
  time: Wed Apr 27 2016 08:41:00 GMT+0000 (UTC) }
Our code:
var keyArr = [10001, 10010, ......];
PropertyModal.getItems(keyArr, function (err, propertyData) {
  if (err) {
    console.log(err);
  } else {
    // handle propertyData
  }
});
@ryanfitz Can you please help with this issue? It occurs randomly when we have bulk data.
Scenario: 5,000 total records, so per the AWS limits we batch-get 100 items at a time in a loop.
@avtaniket you should provide more information about the table schema.
This is from the documentation of Vogels:
DynamoDB limits the number of items you can get to 100 or 1MB of data for a single request.
Does your items' size exceed 1MB? I guess the size of the items is not fixed but varies, and that is why this error occurs randomly: sometimes the batch size exceeds 1MB.
However, if you are dumping all 5,000 records, I think you should use a different approach, such as a scan with the .loadAll() option, a parallel scan, and/or the streaming API, as sketched below.
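A minimal sketch of the loadAll variant, following the vogels README and assuming PropertyModal is the model from the original post:

// .loadAll() keeps paging automatically past the 1MB per-request scan limit.
PropertyModal.scan()
  .loadAll()
  .exec(function (err, data) {
    if (err) {
      return console.log(err);
    }
    // data.Items holds every record in the table
    console.log('fetched', data.Items.length, 'items');
  });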
Thanks for the quick reply,
Yes, I understand the size issue, so I also tried passing an array of 50 IDs or even fewer.
For your information, I have property information in a table where the property ID is the only hash key.
Does vogels' getItems() use a different method than what's documented in AWS (PutRequest/DeleteRequest)?
Can't use scan, as we want to get bulk records by hash key (property ID).
Suggestions are welcome...
Please double-check that you don't have any NULL or empty values in your keyArr. Also check that every element in the array is an integer. You know that in JS, var a = [123, ,456]; is valid syntax (a sparse array with a hole). You can check the values in keyArr with Number.isInteger(value) (see the MDN reference), as in the sketch below.
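A minimal sketch of that pre-flight check (the sample values are made up for illustration):

var keyArr = [10001, , null, '10010', 10020]; // hole, null, and string to weed out

// Array.prototype.filter skips holes, and Number.isInteger rejects
// null and numeric strings, so only real integer keys survive.
var cleanKeys = keyArr.filter(function (value) {
  return Number.isInteger(value);
});

console.log(cleanKeys); // [10001, 10020]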
I think the problem here is that BatchGetItems just doesn't work for >1MB results.
The problem is in the code here: https://github.com/ryanfitz/vogels/blob/master/lib/batch.js#L51
After you get UnprocessedKeys (due to the 1MB limit), it tries to request the remaining results automatically. However, it doesn't build the subsequent request properly, as you can see here:
https://github.com/ryanfitz/vogels/blob/master/lib/batch.js#L13
It is supposed to set RequestItems as the base key, but it doesn't, as you can see by comparing the logs below (a sketch of the fix follows them).
Original Request:
{
"name": "vogels",
"hostname": "blah",
"pid": 3652,
"model": "Model_name",
"level": 30,
"params": {
"RequestItems": {
"table_name": {
"Keys": [ { keys_go_here: ''} ]
}
}
},
"msg": "vogels BATCHGET request",
"time": "2017-03-22T03:55:35.890Z",
"v": 0
}
Subsequent ones:
{
"name": "vogels",
"hostname": "blah",
"pid": 3652,
"model": "Model_name",
"level": 30,
"params": {
"table_name": {
"Keys": [ { keys_go_here: ''} ]
}
},
"msg": "vogels BATCHGET request",
"time": "2017-03-22T03:55:35.890Z",
"v": 0
}
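For reference, a minimal sketch of the fix (the function name is illustrative, not vogels' actual internals): UnprocessedKeys comes back from DynamoDB in the same per-table shape as the second log, so the retry only needs to re-wrap it under RequestItems.

function buildRetryParams(response) {
  // Buggy follow-up (second log):  { table_name: { Keys: [...] } }
  // Correct follow-up (first log): { RequestItems: { table_name: { Keys: [...] } } }
  return { RequestItems: response.UnprocessedKeys };
}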
@RodH257 Yes, I think you have correctly pointed out the issue.
The alternative solution we used was to chunk the records into smaller batches and loop to fetch all records, roughly as sketched below.
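A minimal sketch of that workaround, assuming PropertyModal and keyArr from the original post (fetchAll and BATCH_SIZE are illustrative names, not part of vogels):

var BATCH_SIZE = 50; // stays well under the 100-item / 1MB BatchGetItem limits

function fetchAll(ids, done, results) {
  results = results || [];
  if (ids.length === 0) {
    return done(null, results);
  }
  // Fetch one chunk, then recurse on the remainder.
  PropertyModal.getItems(ids.slice(0, BATCH_SIZE), function (err, items) {
    if (err) {
      return done(err);
    }
    fetchAll(ids.slice(BATCH_SIZE), done, results.concat(items));
  });
}

fetchAll(keyArr, function (err, propertyData) {
  if (err) {
    return console.log(err);
  }
  // propertyData now holds all 5,000 records
});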
Cool, I've added an issue on Dynogels (actively maintained fork of this repo) and submitted a PR there: https://github.com/clarkie/dynogels/issues/78