msgraph-sdk-python-core
largefileuploadtask Python
Hello Team,
I would like sample code for the large file upload task that sends mail and uploads a file to OneDrive/SharePoint, placed in the samples folder or added to the Microsoft documentation.
Kindly request you to add the sample code in a separate Python tab at the documentation URL.
Ref#: https://docs.microsoft.com/en-us/graph/sdks/large-file-upload?tabs=csharp
Note: Please provide two separate code samples: one demonstrating sending files greater than 4 MB, and one demonstrating upload to OneDrive.
Kindly request your help, as this is a new requirement for business purposes.
Thank you.
Regards, Libin
Hi @R-LIBIN, I accidentally stumbled upon this ticket. What you want is quite easy to do. Here is an example where a "response.content" is uploaded to SharePoint; if the size is more than 4.1 MB, the file is streamed (uploaded in chunks), as it cannot be uploaded via a single PUT request:
import os
import tempfile

from azure.identity import ClientSecretCredential
from msgraph.core import GraphClient

# tenant, client_id, client_secret, driveid, itemid, foldername and
# `response` (whose .content holds the file body) are assumed to be defined.
credential = ClientSecretCredential(
    tenant_id=tenant, client_id=client_id, client_secret=client_secret
)
client = GraphClient(credential=credential)

sharepoint_url = f"https://graph.microsoft.com/v1.0/drives/{driveid}/items/{itemid}:/{foldername}/(unknown)"

if len(response.content) < 4100000:
    print("File is smaller than 4.1MB")
    # Small files can go up in a single PUT request
    data_upload = client.put(
        sharepoint_url + ":/content",
        data=response.content,
        headers={"Content-Type": "application/octet-stream"},
    )
    print(data_upload)
    print(data_upload.json())
else:
    print("File is bigger than 4.1MB")
    fd, path = tempfile.mkstemp()
    try:
        with os.fdopen(fd, "wb") as tmp:
            tmp.write(response.content)
        # Create an upload session
        upload_session = client.post(
            sharepoint_url + ":/createUploadSession",
            headers={"Content-Type": "application/json"},
        ).json()
        with open(path, "rb") as f:
            total_file_size = os.path.getsize(path)
            chunk_size = 327680
            i = 0
            while True:
                chunk_data = f.read(chunk_size)
                # If end of file, break
                if not chunk_data:
                    break
                start_index = i * chunk_size
                end_index = start_index + len(chunk_data)
                # Content-Length and Content-Range must describe the actual
                # chunk, which may be shorter than chunk_size for the last one
                headers = {
                    "Content-Type": "application/octet-stream",
                    "Content-Length": str(len(chunk_data)),
                    "Content-Range": "bytes {}-{}/{}".format(
                        start_index, end_index - 1, total_file_size
                    ),
                }
                # Upload one chunk at a time
                chunk_data_upload = client.put(
                    upload_session["uploadUrl"],
                    data=chunk_data,
                    headers=headers,
                )
                print(chunk_data_upload)
                print(chunk_data_upload.json())
                i += 1
    except Exception as error:
        print(error)
    finally:
        os.remove(path)
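For reference, the chunk arithmetic used above can be checked in isolation. This is a minimal, self-contained sketch (the helper names are illustrative, not part of the Graph SDK) that computes the byte ranges and the Content-Range header value each chunk would carry:

```python
def chunk_ranges(total_size, chunk_size=327680):
    """Yield (start, end_inclusive, length) byte ranges covering a file of
    total_size bytes, split into chunks of at most chunk_size bytes."""
    start = 0
    while start < total_size:
        length = min(chunk_size, total_size - start)
        yield start, start + length - 1, length
        start += length

def content_range_header(start, end_inclusive, total_size):
    # Produces the value expected by an upload session,
    # e.g. "bytes 0-327679/700000"
    return "bytes {}-{}/{}".format(start, end_inclusive, total_size)

if __name__ == "__main__":
    # A 700000-byte file splits into two full 327680-byte chunks
    # plus one shorter tail chunk.
    for start, end, length in chunk_ranges(700000):
        print(content_range_header(start, end, 700000), length)
```

Note that the final chunk is shorter than chunk_size, which is why its Content-Length must be the actual chunk length rather than the nominal chunk size.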
Thanks, Stefan!
Please update the documentation as well; I struggled when I needed this. I hope it helps people, mainly for Java, Python, and C#.
Thanks for reporting this issue. We have since released a new version of the SDK to which this issue no longer applies. We encourage you to migrate to the new version and open a new issue if you still need help.