Google Drive connector: exportSizeLimitExceeded
When this error occurs, indexing stops completely and cannot continue: a single oversized file aborts the whole run.
Traceback (most recent call last):
  File "/app/danswer/background/indexing/run_indexing.py", line 177, in _run_indexing
    for doc_batch in doc_batch_generator:
  File "/app/danswer/connectors/google_drive/connector.py", line 515, in poll_source
    yield from self._fetch_docs_from_drive(start, end)
  File "/app/danswer/connectors/google_drive/connector.py", line 497, in _fetch_docs_from_drive
    raise e
  File "/app/danswer/connectors/google_drive/connector.py", line 479, in _fetch_docs_from_drive
    text_contents = extract_text(file, service) or ""
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/danswer/connectors/google_drive/connector.py", line 319, in extract_text
    .execute()
     ^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
    return wrapped(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/googleapiclient/http.py", line 938, in execute
    raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 403 when requesting https://www.googleapis.com/drive/v3/files/1St_FBwaIm0sdHNHjH0lhG14GQKj5XEonqZmzXUlH_Yk/export?mimeType=text%2Fcsv returned "This file is too large to be exported.". Details: "[{'message': 'This file is too large to be exported.', 'domain': 'global', 'reason': 'exportSizeLimitExceeded'}]">
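The root cause is that Google Docs editor files above Drive's export size limit cannot be exported via `files.export`, and the API returns a non-retryable 403 with reason `exportSizeLimitExceeded`. One way a connector could avoid halting the whole run is to detect that specific reason and skip the offending file. The sketch below is an assumption, not the actual Onyx/Danswer fix; it only relies on the legacy error-body shape visible in the traceback's `Details:` output:

```python
import json

def is_export_size_error(status: int, content: bytes) -> bool:
    """Return True if an API error is the non-retryable export-size 403.

    `status` is the HTTP status code and `content` the raw error body,
    which (per the traceback above) carries a JSON payload whose
    error.errors[].reason field is "exportSizeLimitExceeded".
    """
    if status != 403:
        return False
    try:
        details = json.loads(content).get("error", {}).get("errors", [])
    except (ValueError, AttributeError):
        return False
    return any(err.get("reason") == "exportSizeLimitExceeded" for err in details)
```

A hypothetical usage inside the connector's file loop (`extract_text`, `service`, and `logger` are the connector's own objects, shown only as a sketch) would wrap the failing call instead of letting the exception propagate:

    try:
        text_contents = extract_text(file, service) or ""
    except HttpError as e:
        if is_export_size_error(e.resp.status, e.content):
            logger.warning(f"Skipping file too large to export: {file['id']}")
            text_contents = ""
        else:
            raise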