paper-qa
High accuracy RAG for answering questions from scientific documents with citations
I got paper-qa running correctly with a local model. However, when trying to use a paper folder, an AuthenticationError occurs: `litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AuthenticationError: OpenAIException - Error code:...`
From my testing, embedding and indexing new papers seem to be what eats my rate limit and quota, so I am looking to use a locally hosted embedding model....
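A minimal sketch of pointing the embedding/indexing step at a locally served model: the `Settings(embedding=...)` field follows the project's LiteLLM-style model naming, but the Ollama server and the `mxbai-embed-large` model name here are assumptions for illustration, not a confirmed recipe.

```python
# Sketch only: assumes an Ollama server is running locally with an embedding
# model (here mxbai-embed-large) pulled; names are illustrative.
from paperqa import Settings

settings = Settings(
    embedding="ollama/mxbai-embed-large",  # routed through LiteLLM to the local server
)
```

With something like this, only the answering/summary LLM would still hit a remote API, which should ease the rate-limit pressure from indexing.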
Tonight, Semantic Scholar was having some downtime, which meant I couldn't build an index. It would be nice to have some opt-in flag to allow discarding failures from `MetadataProvider`s if...
I've spent 4 hours and cannot get the quick start working, because the documentation is too sparse. This is THE most frustrating project with a PROMISING paper behind it. ...
This is an excellent project, but manually configuring the APIs seems overly complicated. The same settings do not produce the same effect across different commands, which makes it very...
We should allow for `pqa --paper_directory=~/foo`
LLM
```python
from paperqa import Settings, ask
import os

os.environ["OPENAI_API_KEY"] = "EMPTY"

local_llm_config = {
    "model_list": [
        {
            "model_name": "ollama/llama3",
            "litellm_params": {
                "model": "ollama/llama3",
                "api_base": "https://ap",
            },
        }
    ]
}

answer = ...
```
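For reference, a complete local-model setup typically also routes the summary LLM (and ideally the embeddings) to the local server, otherwise some step still falls back to OpenAI. The sketch below follows the LiteLLM `model_list` pattern; the Ollama URL and model names are assumptions for illustration, not a verified configuration.

```python
# Sketch only: assumes a local Ollama server at http://localhost:11434 with
# llama3 and mxbai-embed-large pulled; adjust names/URL for your setup.
from paperqa import Settings, ask

local_llm_config = {
    "model_list": [
        {
            "model_name": "ollama/llama3",
            "litellm_params": {
                "model": "ollama/llama3",
                "api_base": "http://localhost:11434",
            },
        }
    ]
}

answer = ask(
    "What manufacturing challenges are unique to bioplastics?",
    settings=Settings(
        llm="ollama/llama3",
        llm_config=local_llm_config,
        summary_llm="ollama/llama3",
        summary_llm_config=local_llm_config,
        embedding="ollama/mxbai-embed-large",  # keep embeddings local as well
    ),
)
```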
Currently, building an index can take a long time. It would be nice to have a progress bar.
Currently, as of https://github.com/Future-House/paper-qa/tree/v5.0.7:
- Timeouts are managed by rollouts in `agents.main`
- There is no max step count, unless using `ldp`

It would be nice if the `PaperQAEnvironment` itself...
Deletion currently nukes texts and docs, which is unnecessary, and a delete cannot easily be undone. I want to be able to easily remove a source, ask a question, and put...