JINO ROHIT
I've tried prompt engineering but it's still kinda bad. I tried adding a threshold for the filtered source nodes, but that didn't work either. A second LLM call is...
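Roughly the kind of threshold filter I mean, as a sketch (the node shape and the cutoff value here are placeholders, not the actual code):

```python
SIMILARITY_CUTOFF = 0.75  # placeholder value -- tune for your embeddings/index

def filter_source_nodes(nodes_with_scores, cutoff=SIMILARITY_CUTOFF):
    """Drop retrieved nodes whose similarity score is missing or below the cutoff,
    so weak matches never reach the LLM.

    `nodes_with_scores` is assumed to be a list of objects with a numeric
    `.score` attribute (placeholder shape, not tied to any specific library).
    """
    return [
        n for n in nodes_with_scores
        if n.score is not None and n.score >= cutoff
    ]
```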
Guess I'll try faithfulness eval.
I tried adding a post-context prompt and it worked fairly well, thanks.
@Aekansh-Ak hey sure, what I did was - post_prompt = """If the information isn't available in the given context to formulate the answer, just reply with NO ANSWER""" response...
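A minimal sketch of how the post prompt gets appended before querying (the `query_engine` object is a placeholder for whatever RAG query interface you're using, not the exact code):

```python
post_prompt = (
    "If the information isn't available in the given context to formulate "
    "the answer, just reply with NO ANSWER"
)

def ask(query_engine, question: str):
    """Append the post-context instruction to the user question, then query.

    `query_engine` is assumed to expose a `.query(str)` method (placeholder
    interface for illustration).
    """
    return query_engine.query(question + "\n\n" + post_prompt)
```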
@mariosasko can I take this up?
#self-assign
I think it's working as expected. Here's the log I get for the same line -
@amyeroberts can I take this up?
Same issue, has this been resolved by someone?