Using paper-qa with GROQ LLMs?
I would like to use paper-qa with GROQ LLMs. Is this possible, and if so, how exactly do I do that?
For reference, see the list of available Groq models:
https://console.groq.com/docs/models
Python code examples using GROQ LLMs would also be appreciated.
Hey @jbdatascience! I'm here to help you with any bugs, questions, or contributions you have. Let's tackle this together!
I couldn't find specific information on integrating paper-qa with GROQ LLMs. However, paper-qa is designed to be configurable with different language models through its Settings class. You can customize the LLM configuration by setting parameters like llm, llm_config, summary_llm, and summary_llm_config to specify the models and configurations you want to use [1].
To integrate a non-default LLM like GROQ, you would typically need to ensure that the LLM is compatible with the interfaces expected by paper-qa. This might involve implementing or adapting an interface that allows paper-qa to communicate with the GROQ LLM, similar to how it interacts with other LLMs via LiteLLM.
If GROQ LLMs have a specific API or SDK, you would need to configure paper-qa to use that API for embedding, summarization, and answer generation tasks. This might require custom code to handle the specifics of the GROQ LLM API.
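Since paper-qa routes model calls through LiteLLM, and LiteLLM exposes Groq models under the `groq/` prefix, a configuration sketch might look like the following. This is an assumption-laden example, not verified against a specific paper-qa release: the `Settings` fields (`llm`, `summary_llm`) match the parameters named above, the model name comes from LiteLLM's Groq provider docs, and the `GROQ_API_KEY` environment variable is what LiteLLM reads for Groq authentication.

```python
import os

# Assumption: LiteLLM reads the Groq API key from this environment variable.
os.environ["GROQ_API_KEY"] = "gsk_..."  # your key from console.groq.com

# Assumption: LiteLLM routes model names prefixed with "groq/" to Groq.
GROQ_MODEL = "groq/llama3-70b-8192"

def build_groq_settings():
    # Lazy import so the sketch can be read without paper-qa installed.
    from paperqa import Settings

    # Note: Groq does not serve embedding models, so the embedding model
    # is left at paper-qa's default here; swap in a local embedding model
    # if you want to avoid other providers entirely.
    return Settings(
        llm=GROQ_MODEL,
        summary_llm=GROQ_MODEL,
    )
```

You would then pass these settings to paper-qa's query entry point (e.g. `ask(question, settings=build_groq_settings())`), assuming the usual paper-qa usage pattern.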
Hi @jbdatascience, yes, we use LiteLLM as our LLM layer. Check this: https://docs.litellm.ai/docs/providers/groq
Our docs here are pretty suboptimal, but hopefully you can get 90% of the way there by reading our issues and docs. Basically you'll want to configure the LLM stuff here to align with Groq.
Please feel free to let us know where you get stuck, and we can try to help.
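Before wiring Groq into paper-qa, it can help to confirm that Groq works through LiteLLM on its own. A minimal sketch, assuming the model name and `GROQ_API_KEY` variable from the LiteLLM Groq docs linked above:

```python
import os

# Assumption: LiteLLM picks up the Groq key from this environment variable.
os.environ["GROQ_API_KEY"] = "gsk_..."  # your key from console.groq.com

def groq_chat(prompt: str) -> str:
    # Lazy import; requires `pip install litellm`.
    from litellm import completion

    # completion() is LiteLLM's provider-agnostic chat entry point;
    # the "groq/" prefix tells it to call Groq's API.
    resp = completion(
        model="groq/llama3-8b-8192",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content
```

If this standalone call works, the same `groq/...` model string should be usable in paper-qa's `llm` and `summary_llm` settings.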