Add support for Anthropic models
Would be nice to use the latest and greatest Claude models with this project :)
Yes, I agree that Anthropic should be the first candidate for the next native backend.
FYI: you can use Claude even now with a GPT backend and a proxy server.
For people who use Ollama it would look like this (see also here):
from skllm.config import SKLLMConfig
SKLLMConfig.set_gpt_url("http://localhost:11434/v1/")
from skllm.models.gpt.classification.zero_shot import ZeroShotGPTClassifier
from skllm.datasets import get_classification_dataset
X, y = get_classification_dataset()
clf = ZeroShotGPTClassifier(model="custom_url::llama3", key="ollama")
clf.fit(X, y)
labels = clf.predict(X)
Even though no real API key is technically needed here, the key argument still has to be set to some string (any placeholder works).
Edit: additionally, /v1/embeddings is currently not supported in Ollama, so you can't really use it for features like DynamicFewShot (I think). But they're working on a fix.
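To make concrete why the embeddings endpoint matters: DynamicFewShot uses embeddings to pick the training examples most similar to each query. Here is a minimal pure-Python sketch of that nearest-neighbour step; the function name and the vectors are hypothetical, and skllm would normally obtain the vectors from /v1/embeddings, which Ollama doesn't serve yet.

```python
from math import sqrt

def nearest_examples(query_vec, train_vecs, k=2):
    """Hypothetical helper: indices of the k training vectors most
    similar (by cosine similarity) to the query vector."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))
    # rank all training vectors by similarity to the query, best first
    sims = [(cos(query_vec, v), i) for i, v in enumerate(train_vecs)]
    return [i for _, i in sorted(sims, reverse=True)[:k]]

# toy 2-d vectors standing in for real embedding vectors
train_vecs = [[1.0, 0.0], [0.0, 1.0], [0.9, 0.1]]
print(nearest_examples([1.0, 0.05], train_vecs, k=2))  # → [0, 2]
```

Without a working embeddings endpoint there is nothing to feed into this selection step, which is why DynamicFewShot breaks against the Ollama proxy.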
FYI, I have now fully integrated Ollama into my branch. I still need to fix the docstrings a bit and change the OllamaVectorizer, but it works very nicely for me.
The main reason for the integration was that I cannot pass optional parameters (such as the context size) through the OpenAI-compatible endpoint, and I absolutely need that control for my work. I know you want to use llama.cpp instead; I just figured others might be interested in the branch.
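To illustrate the parameter-passing point: Ollama's native /api/chat endpoint accepts an "options" object (e.g. num_ctx for the context size), which the OpenAI-compatible /v1/ endpoint does not expose. A minimal sketch of the request body a native backend could send; build_ollama_chat_payload is a hypothetical helper, not part of skllm or Ollama.

```python
import json

def build_ollama_chat_payload(model, messages, num_ctx=None):
    """Hypothetical helper: build a request body for Ollama's native
    /api/chat endpoint, optionally setting the context size."""
    payload = {"model": model, "messages": messages, "stream": False}
    if num_ctx is not None:
        # "options" is Ollama-specific and has no equivalent in the
        # OpenAI-compatible /v1/chat/completions request schema
        payload["options"] = {"num_ctx": num_ctx}
    return payload

payload = build_ollama_chat_payload(
    "llama3",
    [{"role": "user", "content": "Classify the sentiment: great movie!"}],
    num_ctx=8192,
)
# this body would be POSTed to http://localhost:11434/api/chat
print(json.dumps(payload, indent=2))
```

Going through the native API is what makes per-request options like num_ctx controllable at all, which is the control the OpenAI-style proxy route cannot offer.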