
Issue while ingesting JSON files

HarshalPal opened this issue 1 year ago • 2 comments

I'm facing an issue whenever I try to create a library for JSON files. The error reads as follows:

ERROR:root:error: to use prompt_with_source, there must be a loaded source - try '.add_sources' first

Traceback (most recent call last):
  File "/home/user/Documents/LLMware/examples/Getting_Started/getting_started_with_rag.py", line 70, in <module>
    end_to_end_rag(model_name, vector_db="faiss")
  File "/home/user/Documents/LLMware/examples/Getting_Started/getting_started_with_rag.py", line 49, in end_to_end_rag
    print ("\n > LLM response\n" + response["llm_response"])
KeyError: 'llm_response'

Any leads on solving this would be helpful.

HarshalPal · Dec 27 '23 10:12

Most likely, there is a parsing issue with the .json file, so the parser is not returning any content, which then triggers the breaking error when you call prompt_with_source. That breaking error was fixed in 0.1.13 and is in the main branch, so you may want to pull the updated version: it will proceed without error, but will log a warning that there is no source attached to the prompt.
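For reference, a minimal sketch of that flow, assuming the usual Prompt API (the model name, folder path, file name, and question below are placeholders, not taken from your script):

from llmware.prompts import Prompt

# placeholder values - substitute your own model, folder, and file
model_name = "llmware/bling-1b-0.1"
folder_path = "/path/to/folder"
file_name = "my_file.json"

# load the model and attach the parsed file as a source before prompting
prompter = Prompt().load_model(model_name)
prompter.add_source_document(folder_path, file_name)

# prompt_with_source returns a list of response dicts, one per source batch
responses = prompter.prompt_with_source("What is the main topic of this file?")
for response in responses:
    print(response.get("llm_response", "<no llm_response returned>"))

The .get() on "llm_response" is just a defensive guard for the case where the key does not come back in a response dict.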

To check whether there is an issue with parsing the .json file, you can call the Parser directly, e.g., Parser().parse_one_text("/path/to/file", "filename"), which should return a list of dictionaries of the parsed content. If nothing comes back, then we know the issue is in the json parser.
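As a quick check along those lines (the folder path and file name below are placeholders):

from llmware.parsers import Parser

# parse the single .json file directly, outside of any library or prompt
parsed_blocks = Parser().parse_one_text("/path/to/folder", "my_file.json")

# a successful parse returns a non-empty list of dictionaries, one per text block
if parsed_blocks:
    print("parsed blocks:", len(parsed_blocks))
    print("first block:", parsed_blocks[0])
else:
    print("no content returned - the problem is likely in the json parsing")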

doberst · Dec 27 '23 11:12

You mean I need to upgrade the llmware package to the latest version (0.1.12 --> 0.1.14) and proceed as usual, right?

HarshalPal · Jan 01 '24 07:01

@HarshalPal - confirming YES... sorry I missed your reply. Closing out this issue as it was resolved in 0.1.14.

doberst · Jan 28 '24 13:01