shareplum
Uploading files larger than 250MB does not work with current methods: uploading the file causes error 400, Bad Request
Hi, I originally posted this as a reply to an old post, but I'm not sure it's "active" anymore: https://github.com/jasonrollins/shareplum/issues/80#issuecomment-668038392
Reposting here: I am having what looks like the same issue. My script checks file versions and downloads them just fine. The issue occurs when trying to upload a file (it will be a new version of an existing file, so it should be overwritten). I'm passing the site object from my login function, the same as for downloading.
```python
def upload_smc(site, userdata_dict):
    folder = site.Folder(userdata_dict['fil_smc']['fil_url'])
    print(folder.info)
    smc_file = userdata_dict['loc_path'] + '\\' + userdata_dict['fil_smc']['filnavn']
    with open(smc_file, 'rb') as file:
        file_content = file.read()
    folder.upload_file(file_content, userdata_dict['fil_smc']['filnavn'])
    print(f"Updated SMC file, {userdata_dict['fil_smc']['filnavn']} is uploaded to {folder.folder_name}")
```
Args: [site] is passed from login and works for download (same variable). [userdata_dict] is a dict loaded from JSON with stored data (userdata_dict['fil_smc']['filnavn'] = "BB_K3_02_Sammenstillingsmodell.smc", userdata_dict['fil_smc']['fil_url'] = "Delte dokumenter/Prosjekt K-TRE/07 - Detaljprosjekt/01 - BIM/02 IFC og SMC").
After running for a while I get the 400, Bad Request error.
The function works if I run it with a smaller file; the file I have problems with is currently 257MB.
Edit: The script works with the same file type at a size of 235MB, so I'm thinking it's connected to size. This post addresses an issue where the file size limit appears to be 250MB, even though the limit should be 2GB. It's resolved by "chunking" (a bit over my head): https://github.com/SharePoint/sp-dev-docs/issues/4973
Edit: Research confirms that uploading files larger than 250MB can't be done with the current methods. The upload must be chunked, using:

- `sp_url/_api/web/GetFolderByServerRelativeUrl/Files/AddStubUsingPath()/StartUpload`
- `sp_url/_api/web/GetFileByServerRelativeUrl()/ContinueUpload`
- `sp_url/_api/web/GetFileByServerRelativeUrl()/FinishUpload`
I've tried finding a way to make this work, but my coding knowledge does not suffice.
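To make the flow concrete, here is a minimal sketch of those three calls using plain `requests` rather than shareplum's own API. The site URL, server-relative file path, form digest, and `chunked_upload_sketch` name are all placeholders, and it assumes an empty stub file already exists at the target path (e.g. from a normal small upload):

```python
import os
import uuid
import requests

def chunked_upload_sketch(session: requests.Session, site_url: str,
                          file_rel_url: str, local_path: str,
                          digest: str, chunk_size: int = 10 * 1024 * 1024):
    """Illustrative StartUpload / ContinueUpload / FinishUpload sequence.

    Assumes `session` is already authenticated against SharePoint and that
    an empty stub file already exists at `file_rel_url`.
    """
    upload_id = str(uuid.uuid4())  # client-chosen GUID identifying the upload session
    size = os.stat(local_path).st_size
    base = f"{site_url}/_api/web/GetFileByServerRelativeUrl('{file_rel_url}')"
    headers = {'X-RequestDigest': digest}
    offset = 0
    with open(local_path, 'rb') as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            # First chunk starts the session, last chunk finishes it,
            # everything in between continues at the current offset.
            if offset == 0:
                url = f"{base}/StartUpload(uploadId='{upload_id}')"
            elif offset + len(chunk) >= size:
                url = f"{base}/FinishUpload(uploadId='{upload_id}',fileOffset={offset})"
            else:
                url = f"{base}/ContinueUpload(uploadId='{upload_id}',fileOffset={offset})"
            resp = session.post(url, headers=headers, data=chunk)
            resp.raise_for_status()
            offset += len(chunk)
```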
Hey, it works! I added this method using your suggestions:
```python
def upload_file_chunked(self, file, file_name, chunk_size=143217728):
    # Added to shareplum's Folder class; uses the same get/post request
    # helpers and form digest that the class already relies on.
    headers = {'X-RequestDigest': self.contextinfo['FormDigestValue']}
    # file_name must also resolve to the file on the local disk so its size can be read
    content_path = os.path.abspath(file_name)
    content_size = os.stat(content_path).st_size
    index = 0
    offset = 0

    def read_in_chunks(file_object, chunk_size=chunk_size):
        while True:
            data = file_object.read(chunk_size)
            if not data:
                break
            yield data

    nr_chunks = int(content_size / chunk_size) - 1  # not used below
    # Create an empty stub file first, then fetch its UniqueId to use as the uploadId.
    self.upload_file(None, file_name)
    guid = get(self._session, self.site_url + f"/_api/web/GetFileByServerRelativeUrl('{self.info['d']['ServerRelativeUrl']}/{file_name}')").json()['UniqueId']
    for chunk in read_in_chunks(file):
        # First chunk starts the upload, last chunk finishes it,
        # everything in between continues it at the current offset.
        if index == 0:
            url = self.site_url + f"/_api/web/GetFileByServerRelativeUrl('{self.info['d']['ServerRelativeUrl']}/{file_name}')/StartUpload(uploadId='{guid}')"
        elif index + len(chunk) == content_size:
            url = self.site_url + f"/_api/web/GetFileByServerRelativeUrl('{self.info['d']['ServerRelativeUrl']}/{file_name}')/FinishUpload(uploadId='{guid}',fileOffset={offset})"
        else:
            url = self.site_url + f"/_api/web/GetFileByServerRelativeUrl('{self.info['d']['ServerRelativeUrl']}/{file_name}')/ContinueUpload(uploadId='{guid}',fileOffset={offset})"
        offset = index + len(chunk)
        headers['Content-Range'] = 'bytes %s-%s/%s' % (index, offset - 1, content_size)
        index = offset
        post(self._session, url=url, headers=headers, data=chunk, timeout=self.timeout)
```
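For anyone wiring this in, here is a hypothetical usage sketch, assuming `upload_file_chunked` was added to shareplum's `Folder` class alongside `upload_file`. The site URL and credentials below are placeholders; the folder path and filename are taken from the example above. Note that as written the method looks up the file size from `file_name` on the local disk, so it should be called from the directory containing the file:

```python
from shareplum import Office365, Site
from shareplum.site import Version

# Placeholder credentials/site; authentication follows shareplum's usual pattern.
authcookie = Office365('https://example.sharepoint.com',
                       username='user@example.com',
                       password='secret').GetCookies()
site = Site('https://example.sharepoint.com/sites/MySite',
            version=Version.v365, authcookie=authcookie)

folder = site.Folder('Delte dokumenter/Prosjekt K-TRE/07 - Detaljprosjekt/01 - BIM/02 IFC og SMC')
# file_name doubles as the local path, so run this from the file's directory.
with open('BB_K3_02_Sammenstillingsmodell.smc', 'rb') as f:
    folder.upload_file_chunked(f, 'BB_K3_02_Sammenstillingsmodell.smc')
```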