Results: 16 comments of Mayk Caldas

Apparently, scikit-learn didn't provide a wheel for Python 3.10 in version 1.0.1. They fixed that in version 1.0.2: https://github.com/scikit-learn/scikit-learn/issues/21511 I also tried pip installing on Python 3.8.18 and had a problem...
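For anyone hitting this, a quick sanity check (a minimal sketch; the version parsing assumes a plain release string like `1.0.2`):

```python
import sklearn

# Python 3.10 needs scikit-learn >= 1.0.2, the first release shipping cp310 wheels.
version = tuple(int(part) for part in sklearn.__version__.split(".")[:3])
assert version >= (1, 0, 2), f"upgrade scikit-learn (found {sklearn.__version__})"
```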

Hello @Snikch63200. Please note that the LLMs we use in paperqa come from [lmi](https://github.com/Future-House/ldp/tree/main/packages/lmi), which is a wrapper over `litellm`. Of course, the usage might be very similar, but there...
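For context, this is roughly the raw `litellm` interface that `lmi` wraps (a minimal sketch; the model name is just an example, and PaperQA itself should be configured through its own `Settings` rather than by calling `litellm` directly):

```python
from litellm import completion

# litellm mirrors the OpenAI SDK and resolves providers from the model name.
response = completion(
    model="gpt-4o-mini",  # example model; litellm uses "provider/model"-style names
    messages=[{"role": "user", "content": "What is retrieval-augmented generation?"}],
)
print(response.choices[0].message.content)
```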

Hey @Snikch63200 I don't think PQA uses caching anywhere. But you're right about streaming. PQA uses LLMs from [`lmi`](https://github.com/Future-House/ldp/tree/main/packages/lmi). On lmi, you can see [an example here](https://github.com/Future-House/ldp/blob/main/packages/lmi/src/lmi/llms.py#L662) on how we...
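For reference, streaming at the raw `litellm` layer looks roughly like this (a sketch, not PQA's own code; `lmi` adds its own wrapper on top, and the model name is a placeholder):

```python
from litellm import completion

# With stream=True, litellm yields OpenAI-style delta chunks as they arrive.
response = completion(
    model="gpt-4o-mini",  # example model name
    messages=[{"role": "user", "content": "Say hello"}],
    stream=True,
)
for chunk in response:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
```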

Hello @DGoettlich, can you share a minimal repro of the issue? This check happens here: https://github.com/Future-House/paper-qa/blob/main/paperqa/docs.py#L385 Maybe your specific file is also failing some other check.
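Something along these lines would be enough (a hypothetical repro; `my_paper.pdf` is a placeholder for the failing file, and this assumes a recent paperqa with the async `Docs.aadd`):

```python
import asyncio

from paperqa import Docs


async def main() -> None:
    docs = Docs()
    # Replace the placeholder path with the file that is being rejected.
    await docs.aadd("my_paper.pdf")


asyncio.run(main())
```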

I'm closing this issue for now. Feel free to reopen it if the problem remains!

Hey @LarsVanderwee2002 and @usathyan. That's correct, @LarsVanderwee2002: in PaperQA, LLMs are provided by [`lmi`](https://github.com/Future-House/ldp/tree/main/packages/lmi), which uses [`litellm`](https://www.litellm.ai/) as its backend. To set up a different LLM, specifically provided by...
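In the simplest case, you pass a litellm-style model name straight to `Settings` (a sketch; the model name here is a placeholder for whatever litellm-supported model you want):

```python
from paperqa import Settings

# litellm resolves the provider from the name, e.g. "gpt-...", "claude-...", "ollama/...".
settings = Settings(
    llm="gpt-4o-mini",
    summary_llm="gpt-4o-mini",
)
```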

Check [this discussion](https://github.com/Future-House/paper-qa/discussions/753). `ollama` is supported in PaperQA through `litellm`, and it can be set using the `Settings` class. If you prefer using the CLI, @CrispStrobe's answer is correct. In...
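For the `Settings` route, the README's locally-hosted example boils down to something like this (the model name and port are Ollama defaults and may differ on your machine):

```python
from paperqa import Settings

local_config = dict(
    model_list=[
        dict(
            model_name="ollama/llama3.2",
            litellm_params=dict(
                model="ollama/llama3.2",
                api_base="http://localhost:11434",  # default Ollama endpoint
            ),
        )
    ]
)
settings = Settings(
    llm="ollama/llama3.2",
    llm_config=local_config,
    summary_llm="ollama/llama3.2",
    summary_llm_config=local_config,
)
```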

Hey @CGH20171006, @superlou, and @hweiske, I hope you have found the solution already. As this is a recurring question, I created [this tutorial](https://futurehouse.gitbook.io/futurehouse-cookbook/paperqa/docs/tutorials/settings_tutorial) showing how to change models...

Hello @FahaxikiHoney, here in the README we have an example of how to use Llama: https://github.com/Future-House/paper-qa?tab=readme-ov-file#locally-hosted You can either download the model locally and serve it on a local host...

Hey @SURUIYUAN and @janzheng, the current paperqa version can be checked with `paperqa.__version__`. I believe this is an old issue, so I'm closing it. Please feel free to reopen it...
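For reference, that check is just:

```python
import paperqa

# Prints the installed paperqa version.
print(paperqa.__version__)
```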