
Can we support other LLM models?

uripeled2 opened this issue 2 years ago · 1 comment

I recently published a package, llm-client, that could be very helpful in enabling support for other LLM providers, including OpenAI, Google, AI21, HuggingfaceHub, Aleph Alpha, Anthropic, and local models with transformers.

uripeled2 Jun 20 '23 16:06

So there are three things that need to be done to incorporate other models:

  1. Pass the appropriate auth/API tokens into the code
  2. Change the calls in https://github.com/AutoLLM/ArxivDigest/blob/5c7340d79ba21b3c6e77510d5ca54b78dfbb0d02/src/utils.py#L108 to use the new model
  3. We need to parse the output, so the model needs to always generate responses in a certain format ({"Relevancy score": "an integer score out of 10", "Reasons for match": "1-2 sentence short reasonings"}). We've only tested this for gpt-3.5-turbo, and before we officially support any other LLM, we need to make sure that it will follow that format given the prompt in src/relevancy_prompt.txt
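As a rough sketch of steps 1 and 2, the hard-coded OpenAI call could be hidden behind a small dispatch layer so new providers plug in without touching the rest of the code. Everything here is hypothetical: the function names, the registry, and the `"fake"` backend are illustrative, not the actual ArxivDigest or llm-client API.

```python
import os

# Hypothetical dispatch layer; backend names and signatures are
# illustrative, not the real llm-client or ArxivDigest API.
_BACKENDS = {}

def register_backend(name):
    """Register a callable(prompt, token) -> str for a provider."""
    def wrap(fn):
        _BACKENDS[name] = fn
        return fn
    return wrap

def call_llm(prompt, provider="openai"):
    # Step 1: each provider supplies its own auth token via the environment,
    # e.g. OPENAI_API_KEY or ANTHROPIC_API_KEY.
    token = os.environ.get(f"{provider.upper()}_API_KEY", "")
    # Step 2: route the call to the selected backend instead of the
    # hard-coded OpenAI call in src/utils.py.
    try:
        backend = _BACKENDS[provider]
    except KeyError:
        raise ValueError(f"unsupported provider: {provider!r}")
    return backend(prompt, token)

@register_backend("fake")
def _fake_backend(prompt, token):
    # Stand-in backend used only to exercise the dispatch logic.
    return '{"Relevancy score": "7", "Reasons for match": "stub"}'
```

Real backends registered this way would wrap whichever client library (llm-client, provider SDKs, or transformers locally) the project settles on.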

The last point is the main pain point, and is generally independent of the exact library we use to call the model.
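For that pain point, a defensive parser is one option: rather than trusting the model to emit bare JSON, extract the first JSON object from the completion and verify the two required keys, rejecting anything that strays from the format. This is only a sketch under that assumption; the function name and regex approach are mine, not the project's.

```python
import json
import re

# The two keys the prompt in src/relevancy_prompt.txt asks the model for.
REQUIRED_KEYS = {"Relevancy score", "Reasons for match"}

def parse_relevancy(completion: str) -> dict:
    """Extract and validate the JSON object from a model completion.

    Raises ValueError if no JSON object is found or required keys
    are missing, so a non-conforming model fails loudly rather than
    being silently misread.
    """
    match = re.search(r"\{.*\}", completion, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in model output")
    obj = json.loads(match.group(0))
    if not REQUIRED_KEYS <= obj.keys():
        raise ValueError(f"missing keys: {REQUIRED_KEYS - obj.keys()}")
    return obj
```

Running every candidate model's completions through a check like this against the existing prompt would give a concrete pass/fail signal before officially supporting it.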

rmfan Jun 20 '23 17:06