llama-hub
'PandasExcelReader' object has no attribute '_concat_rows'
I am trying to read an Excel file with multiple sheets using llama-index. Here is my code:
from pathlib import Path
from llama_index import download_loader
PandasExcelReader = download_loader("PandasExcelReader")
loader = PandasExcelReader()
documents = loader.load_data(file=Path('dir1/excel.xlsx'), sheet_name=None)
When I run it, I get:
File "PycharmProjects/venv/lib/python3.11/site-packages/llama_index/readers/llamahub_modules/file/pandas_excel/base.py", line 64, in load_data
    if self._concat_rows:
       ^^^^^^^^^^^^^^^^^
AttributeError: 'PandasExcelReader' object has no attribute '_concat_rows'
My llama-index version is 0.6.0
I am getting the same error. Is there a solution?
loader._concat_rows = True
loader._row_joiner = ''

I have looked at the source code; setting these two attributes manually works around the problem.
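For anyone landing here, the full workaround in one place (a sketch against llama-index 0.6.x; the private attribute names come straight from the traceback above, and the joiner value is just one reasonable choice):

from pathlib import Path
from llama_index import download_loader

PandasExcelReader = download_loader("PandasExcelReader")
loader = PandasExcelReader()

# The 0.6.0 loader apparently never sets these attributes in __init__,
# so assign them by hand before calling load_data:
# _concat_rows=True merges all rows into a single Document,
# _row_joiner is the separator inserted between rows.
loader._concat_rows = True
loader._row_joiner = ''   # any separator works; '\n' keeps rows readable

documents = loader.load_data(file=Path('dir1/excel.xlsx'), sheet_name=None)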
This should be fixed now!
Hello, I've tried the solution above, but now the loader gives me another error.
from pathlib import Path
from llama_index import download_loader
PandasExcelReader = download_loader("PandasExcelReader")
loader = PandasExcelReader()
loader._concat_rows = True
loader._row_joiner = ""
documents = loader.load_data(file=Path('./ChatCleaned.xlsx'), sheet_name="Consultas1", pandas_config={"header": 0})
And the error:
in PandasExcelReader.load_data(self, file, sheet_name, extra_info)
     64 text_list = list(itertools.chain.from_iterable(df_sheets))  # flatten list of lists
     66 if self._concat_rows:
---> 67     return [Document((self._row_joiner).join(text_list), extra_info=extra_info)]
     68 else:
     69     return [Document(text, extra_info=extra_info) for text in text_list]

TypeError: sequence item 0: expected str instance, list found
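For what it's worth, this TypeError just means that text_list still contains lists (one per row) rather than plain strings, so str.join fails on the first element. A two-line reproduction of the same failure, with hypothetical row data:

rows = [["col1", "col2"], ["a", "b"]]   # per-row lists of cell strings
"".join(rows)                            # TypeError: sequence item 0: expected str instance, list found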
In llama_index/readers/llamahub_modules/file/pandas_excel/base.py, at line 67, modify the code as follows:
if self._concat_rows:
    return [Document((self._row_joiner).join(self._row_joiner.join(sublist) for sublist in text_list), extra_info=extra_info)]
else:
    return [Document(text, extra_info=extra_info) for text in text_list]
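An equivalent variant (assuming text_list is a list of lists of cell strings, with no empty rows) flattens first with itertools.chain, mirroring the flattening the loader already does at line 64:

import itertools

if self._concat_rows:
    # Flatten the per-row lists into one stream of cell strings before joining.
    flat = itertools.chain.from_iterable(text_list)
    return [Document(self._row_joiner.join(flat), extra_info=extra_info)]
else:
    return [Document(text, extra_info=extra_info) for text in text_list]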
It seems to be solved; closing for now.