
Add support for Anthropic models

Open pyamin1878 opened this issue 1 year ago • 3 comments

Would be nice to use the latest and greatest Claude models with this project :)

pyamin1878 avatar Jun 27 '24 14:06 pyamin1878

Yes, I agree that Anthropic should be the first candidate for the next native backend.

FYI: you can use Claude even now with a GPT backend and a proxy server.
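To see why this works: the GPT backend only speaks the OpenAI wire format, so any proxy that exposes Claude behind an OpenAI-compatible `/v1/chat/completions` route is enough. Below is a minimal sketch of the request shape such a proxy accepts, with the corresponding scikit-llm configuration in comments. The proxy URL, the LiteLLM suggestion, and the model id are illustrative assumptions, not something prescribed by scikit-llm:

```python
# Hypothetical local proxy exposing Anthropic models via the OpenAI API
# (e.g. a LiteLLM server); this URL and the model id are assumptions.
PROXY_URL = "http://localhost:8000/v1/"


def build_openai_style_request(model: str, prompt: str) -> dict:
    """Body for POST <proxy>/chat/completions -- the only request shape
    the GPT backend emits, regardless of which model sits behind it."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


payload = build_openai_style_request("claude-3-5-sonnet-20240620", "Hello, Claude!")

# scikit-llm would then be pointed at the proxy like this:
# from skllm.config import SKLLMConfig
# SKLLMConfig.set_gpt_url(PROXY_URL)
# clf = ZeroShotGPTClassifier(model="custom_url::claude-3-5-sonnet-20240620",
#                             key="anything")
```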

OKUA1 avatar Jun 27 '24 17:06 OKUA1

For people who use Ollama, it would look like this (see also here):

```python
from skllm.config import SKLLMConfig
from skllm.models.gpt.classification.zero_shot import ZeroShotGPTClassifier
from skllm.datasets import get_classification_dataset

# Point the GPT backend at Ollama's OpenAI-compatible endpoint
SKLLMConfig.set_gpt_url("http://localhost:11434/v1/")

X, y = get_classification_dataset()
# The "custom_url::" prefix routes requests to the URL set above;
# Ollama ignores the key, but some string must still be supplied
clf = ZeroShotGPTClassifier(model="custom_url::llama3", key="ollama")
clf.fit(X, y)
labels = clf.predict(X)
```

Even though Ollama doesn't actually use the key, the `key` argument still has to be some string.

Edit: additionally, `/v1/embeddings` is currently not supported in Ollama, so you can't really use it for features like DynamicFewShot (I think). But they're working on a fix.

AndreasKarasenko avatar Jul 09 '24 10:07 AndreasKarasenko

FYI, I have now fully integrated Ollama into my branch. I still need to fix the docstrings a bit and change the OllamaVectorizer, but it works very nicely for me.

The main reason for the integration was that I cannot pass optional parameters (such as the context size) through the OpenAI-compatible endpoint, and I absolutely need that control for my work. I know you guys want to use llama.cpp instead; just figured others might be interested in the branch.
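For context, Ollama's native `/api/chat` endpoint accepts an `options` object for runtime parameters such as `num_ctx` (the context size), which the OpenAI-compatible `/v1` route does not expose. A minimal sketch of building such a request; the model name and values are illustrative, and the actual POST (commented out) assumes a locally running Ollama server:

```python
import json

OLLAMA_URL = "http://localhost:11434/api/chat"  # native (non-OpenAI) endpoint


def build_chat_request(model: str, prompt: str, num_ctx: int = 4096) -> dict:
    """Build a request body for Ollama's native /api/chat endpoint.

    Unlike the OpenAI-compatible /v1 route, the native API takes an
    "options" object for runtime parameters such as the context size.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
        "options": {"num_ctx": num_ctx},  # control not exposed via /v1
    }


payload = build_chat_request("llama3", "Classify: 'great product!'", num_ctx=8192)
body = json.dumps(payload).encode()

# To actually send it (requires a running Ollama server):
# import urllib.request
# req = urllib.request.Request(OLLAMA_URL, data=body,
#                              headers={"Content-Type": "application/json"})
# response = json.load(urllib.request.urlopen(req))
```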

AndreasKarasenko avatar Jul 19 '24 14:07 AndreasKarasenko