simple-salesforce
Multi-Record Insert Support - Pull Request - Enhancement
# sObject Tree endpoint, which accepts multiple records in a single request
self.multi_url = (
    u'https://{instance}/services/data/v{sf_version}/composite/tree'
    u'/{object_name}/'.format(instance=sf_instance,
                              object_name=object_name,
                              sf_version=sf_version))
def create_more(self, data, headers=None):
    """Creates new SObjects using a POST to
    `.../composite/tree/{object_name}/`.

    Uses a different URI to allow inserting multiple records at once.
    Each record must specify its type and a referenceId inside a nested
    'attributes' dictionary, and the records must be wrapped in a list
    stored under the 'records' key of a dict. e.g.:

    {'records': [
        {'attributes': {'type': 'Lead', 'referenceId': 'ref1'}, 'Company': 'Test'},
        {'attributes': {'type': 'Lead', 'referenceId': 'ref2'}, 'Company': 'Test1'}]}

    Returns a dict decoded from the JSON payload returned by Salesforce.

    Arguments:

    * data -- a dict of the data to create the SObjects from. It will be
              JSON-encoded before being transmitted.
    * headers -- a dict with additional request headers.
    """
    result = self._call_salesforce(
        method='POST', url=self.multi_url,
        data=json.dumps(data), headers=headers
    )
    return result.json(object_pairs_hook=OrderedDict)
data = {'records': [{'attributes': {'type': 'Lead', 'referenceId': 'record1'},
'Company': 'Test',
'FirstName': 'Jane',
'LastName': 'Doe'},
{'attributes': {'type': 'Lead', 'referenceId': 'record2'},
'Company': 'Test1',
'FirstName': 'Paul',
'LastName': 'Test'}]}
sf.Lead.create_more(data)
Out[26]:
OrderedDict([('hasErrors', False),
('results',
[OrderedDict([('referenceId', 'record1'),
('id', '00Q21000006ddTGEAY')]),
OrderedDict([('referenceId', 'record2'),
('id', '00Q21000006ddTHEAY')])])])
I defined this as a separate function in this case and left the building of the JSON to the user. However, I think we could test for a list or dict and, given a list of records, build out the JSON inside the function, assigning 'type' and 'referenceId' automatically. This would keep everything under one function (a rough sketch follows below).
Thoughts?
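For illustration, a minimal sketch of that single-function idea, assuming it lives on simple-salesforce's SFType alongside the code above (so `self.name` holds the object name and `self.multi_url` is already built, with json and OrderedDict imported as in the module); the 'record{i}' referenceId scheme is an arbitrary placeholder:

def create_more(self, data, headers=None):
    # Accept either a prebuilt {'records': [...]} dict or a plain list
    # of field dicts; for a list, build the 'attributes' entries here.
    if not isinstance(data, dict):
        data = {'records': [
            dict({'attributes': {'type': self.name,
                                 'referenceId': 'record{}'.format(i)}},
                 **record)
            for i, record in enumerate(data)]}
    result = self._call_salesforce(
        method='POST', url=self.multi_url,
        data=json.dumps(data), headers=headers)
    return result.json(object_pairs_hook=OrderedDict)

With this, the earlier example payload and a bare list like [{'Company': 'Test', 'FirstName': 'Jane', 'LastName': 'Doe'}] would both work.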
@nickcatal
@tyler555g This looks promising.
I am using Simple-Salesforce for a lot of bulk inserts, going above the platform limit of 15,000 batches and getting errors.
I often have simple inserts of just 2 or more records, and the Bulk API usage for those is consuming the limit.
@jon-wobken is this something that can be added to this project?
I like this a lot - planning to do a full review and test of all the pull requests between 12/20-12/31.
If memory serves, there is a record limit for the composite URL. My preference would be to catch that beforehand if the user attempts to include more records than the API expects, and to have the user pass only a dictionary of fields and data, generating the 'attributes' entry inside the function if possible.
@jon-wobken The limit is 200 records. At a minimum we could check whether the number of records to insert is less than or equal to 200: if so, use this API call, otherwise fall back to the Bulk API.
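A rough sketch of that dispatch as a standalone helper, assuming `sf` is an authenticated simple_salesforce.Salesforce instance, `create_more` is the method proposed above, and `sf.bulk.<Object>.insert` is the library's existing Bulk API insert:

COMPOSITE_TREE_LIMIT = 200  # documented cap for the sObject Tree endpoint

def insert_records(sf, object_name, records):
    # Small batches go through one composite tree request; anything
    # larger falls back to the Bulk API.
    if len(records) <= COMPOSITE_TREE_LIMIT:
        data = {'records': [
            dict({'attributes': {'type': object_name,
                                 'referenceId': 'ref{}'.format(i)}},
                 **record)
            for i, record in enumerate(records)]}
        return getattr(sf, object_name).create_more(data)
    return getattr(sf.bulk, object_name).insert(records)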
Thank you for your consideration of this idea.
Correct, the limit is 200 records. I have learned a lot since I originally wrote this. I will take a look in the AM (EST) and throw some polish on it.
Apparently I just opened this issue and didn't submit a proper PR. I will get this cleaned up with some more polish and put in a PR. Update and delete are also supported this way, so I will implement those as well. @jon-wobken I could add auto-batching of records in chunks of 200, similar to how query_more is implemented, or should we leave that to the user?
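For what that auto-batching could look like, a sketch in the spirit of query_more's paging, with illustrative names (`sf_type` is an SFType instance exposing the proposed create_more):

def create_in_chunks(sf_type, records, chunk_size=200):
    # Issue one composite tree request per chunk of at most 200 records
    # and collect each decoded response.
    results = []
    for start in range(0, len(records), chunk_size):
        chunk = records[start:start + chunk_size]
        data = {'records': [
            dict({'attributes': {'type': sf_type.name,
                                 'referenceId': 'ref{}'.format(start + i)}},
                 **record)
            for i, record in enumerate(chunk)]}
        results.append(sf_type.create_more(data))
    return results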
@tyler555g @jon-wobken I think you should also consider adding upsert to it. That would make it complete and very helpful.
https://developer.salesforce.com/docs/atlas.en-us.api_rest.meta/api_rest/resources_composite_sobjects_collections_upsert.htm
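As a hedged sketch only, an upsert_more following the create_more pattern against that collections endpoint; the PATCH verb, URL layout, and allOrNone flag come from the linked docs, while the sf_instance and sf_version attributes on self are assumed to be stored by __init__:

def upsert_more(self, records, external_id_field, all_or_none=False,
                headers=None):
    # records is a list of dicts, each carrying an 'attributes' entry
    # with the object type plus a value for the external ID field.
    url = ('https://{instance}/services/data/v{sf_version}'
           '/composite/sobjects/{object_name}/{ext_id}'.format(
               instance=self.sf_instance,   # assumed attribute
               sf_version=self.sf_version,  # assumed attribute
               object_name=self.name,
               ext_id=external_id_field))
    payload = {'allOrNone': all_or_none, 'records': records}
    result = self._call_salesforce(
        method='PATCH', url=url,
        data=json.dumps(payload), headers=headers)
    return result.json(object_pairs_hook=OrderedDict)

The collections endpoint is likewise capped at 200 records per request, so any chunking discussed above would cover this as well.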
@jon-wobken I found another Pending PR that is also dealing with composite feature.
https://github.com/candlerb/simple-salesforce/tree/candlerb/235
Hey how's it going? Is this still being worked on?
Any reason not to complete this? This seems like a critical missing feature.
The Bulk API is actually much less efficient when dealing with fewer than 5k records.