Error with anthropic_manifold_pipeline
Bug Report
Description
Bug Summary:
When using Claude 3.5 Sonnet via anthropic_manifold_pipeline.py from https://github.com/open-webui/pipelines/blob/main/examples/pipelines/providers/anthropic_manifold_pipeline.py, I get the following error after a while. The API key is correct.
Steps to Reproduce:
I deployed Pipelines via Docker and set the API key via the admin interface.
After 2-3 uses of Claude 3.5, the Pipelines container prints the following error:
INFO: 192.168.0.4:59372 - "GET /models HTTP/1.1" 200 OK
INFO: 192.168.0.4:57282 - "GET /models HTTP/1.1" 200 OK
INFO: 192.168.0.4:36000 - "POST /anthropic.claude-3-5-sonnet-20240620/filter/inlet HTTP/1.1" 200 OK
anthropic.claude-3-5-sonnet-20240620
anthropic.claude-3-5-sonnet-20240620
INFO: 192.168.0.4:36014 - "POST /chat/completions HTTP/1.1" 200 OK
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 435, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
await self.middleware_stack(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
raise exc
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
await self.app(scope, receive, _send)
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
with collapse_excgroups():
File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
self.gen.throw(typ, value, traceback)
File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
raise exc
File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 261, in wrap
await func()
File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 250, in stream_response
async for chunk in self.body_iterator:
File "/usr/local/lib/python3.11/site-packages/starlette/concurrency.py", line 65, in iterate_in_threadpool
yield await anyio.to_thread.run_sync(_next, as_iterator)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 859, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/starlette/concurrency.py", line 54, in _next
return next(iterator)
^^^^^^^^^^^^^^
File "/app/main.py", line 665, in stream_content
for line in res:
File "/app/./pipelines/anthropic_manifold_pipeline.py", line 141, in stream_response
stream = self.client.messages.create(**payload)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anthropic/_utils/_utils.py", line 277, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anthropic/resources/messages.py", line 904, in create
return self._post(
^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1266, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anthropic/_base_client.py", line 942, in request
return self._request(
^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1046, in _request
raise self._make_status_error_from_response(err.response) from None
anthropic.AuthenticationError: Error code: 401 - {'type': 'error', 'error': {'type': 'authentication_error', 'message': 'invalid x-api-key'}}
Environment
- Open WebUI Version: v0.3.10
- Operating System: Server: latest Docker, x64; Client: latest macOS Sonoma, ARM64
- Browser (if applicable): latest Firefox, ARM64
Confirmation:
- [X] I have read and followed all the instructions provided in the README.md.
- [X] I am on the latest version of both Open WebUI and Ollama.
- [X] I have included the browser console logs.
- [X] I have included the Docker container logs.
We often experience the same thing.
I'll have a look when I've got time, but I've mostly been using my Anthropic Function these days. I haven't noticed this issue there, but then it just uses requests rather than the anthropic library, so there could be differences.
Any solution yet? I ran into the same problem today.
RESET_PIPELINES_DIR is not set to true. No action taken.
PIPELINES_REQUIREMENTS_PATH not specified. Skipping installation of requirements.
PIPELINES_URLS not specified. Skipping pipelines download and installation.
INFO: Started server process [7]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:9099 (Press CTRL+C to quit)
Loaded module: anthropic_manifold_pipeline
Loaded module: openai_dalle_manifold_pipeline
on_startup:anthropic_manifold_pipeline
on_startup:openai_dalle_manifold_pipeline
INFO: 192.168.0.4:52276 - "GET /models HTTP/1.1" 200 OK
RESET_PIPELINES_DIR is not set to true. No action taken.
PIPELINES_REQUIREMENTS_PATH not specified. Skipping installation of requirements.
PIPELINES_URLS not specified. Skipping pipelines download and installation.
INFO: Started server process [7]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:9099 (Press CTRL+C to quit)
Loaded module: anthropic_manifold_pipeline
Loaded module: openai_dalle_manifold_pipeline
on_startup:anthropic_manifold_pipeline
on_startup:openai_dalle_manifold_pipeline
INFO: 192.168.0.4:59008 - "GET /models HTTP/1.1" 200 OK
INFO: 192.168.0.4:60898 - "GET /models HTTP/1.1" 200 OK
INFO: 192.168.0.4:60940 - "GET /models HTTP/1.1" 200 OK
INFO: 192.168.0.4:51132 - "POST /anthropic.claude-3-5-sonnet-20240620/filter/inlet HTTP/1.1" 200 OK
anthropic.claude-3-5-sonnet-20240620
anthropic.claude-3-5-sonnet-20240620
INFO: 192.168.0.4:51134 - "POST /chat/completions HTTP/1.1" 200 OK
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 435, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
await self.middleware_stack(scope, receive, send)
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
raise exc
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
await self.app(scope, receive, _send)
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
with collapse_excgroups():
File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
self.gen.throw(typ, value, traceback)
File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
raise exc
File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 261, in wrap
await func()
File "/usr/local/lib/python3.11/site-packages/starlette/responses.py", line 250, in stream_response
async for chunk in self.body_iterator:
File "/usr/local/lib/python3.11/site-packages/starlette/concurrency.py", line 65, in iterate_in_threadpool
yield await anyio.to_thread.run_sync(_next, as_iterator)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
return await get_async_backend().run_sync_in_worker_thread(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2177, in run_sync_in_worker_thread
return await future
^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 859, in run
result = context.run(func, *args)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/starlette/concurrency.py", line 54, in _next
return next(iterator)
^^^^^^^^^^^^^^
File "/app/main.py", line 665, in stream_content
for line in res:
File "/app/./pipelines/anthropic_manifold_pipeline.py", line 141, in stream_response
stream = self.client.messages.create(**payload)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anthropic/_utils/_utils.py", line 277, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anthropic/resources/messages.py", line 904, in create
return self._post(
^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1266, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anthropic/_base_client.py", line 942, in request
return self._request(
^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1046, in _request
raise self._make_status_error_from_response(err.response) from None
anthropic.AuthenticationError: Error code: 401 - {'type': 'error', 'error': {'type': 'authentication_error', 'message': 'invalid x-api-key'}}
INFO: 192.168.0.4:52738 - "POST /anthropic.claude-3-5-sonnet-20240620/filter/inlet HTTP/1.1" 200 OK
anthropic.claude-3-5-sonnet-20240620
anthropic.claude-3-5-sonnet-20240620
INFO: 192.168.0.4:52740 - "POST /chat/completions HTTP/1.1" 200 OK
ERROR: Exception in ASGI application
(traceback identical to the one above)
anthropic.AuthenticationError: Error code: 401 - {'type': 'error', 'error': {'type': 'authentication_error', 'message': 'invalid x-api-key'}}
401 authentication error. Have you checked that your key works when you make a request manually with cURL?
The API Key is correct.
See my initial post.
Your logs are either at odds with reality, or your key is bad. You didn't answer whether you've tried that key with anything else. Are you certain your account is in good standing and adequately funded ($5 minimum balance required, IIRC)?
If I deliberately set an incorrect key I get this error, same as you:
ERROR: Exception in ASGI application
(traceback identical to the one above)
anthropic.AuthenticationError: Error code: 401 - {'type': 'error', 'error': {'type': 'authentication_error', 'message': 'invalid x-api-key'}}
$25 balance. Here is the cURL example with its output:
curl https://api.anthropic.com/v1/messages --header "x-api-key: MYAPIKEY" --header "anthropic-version: 2023-06-01" --header "content-type: application/json" --data '{"model": "claude-3-5-sonnet-20240620", "max_tokens": 1024, "messages": [{"role": "user", "content": "Hello, world"}]}'
{"id":"msg_018VV4ca8WAiR8iTgSfMcx3d","type":"message","role":"assistant","model":"claude-3-5-sonnet-20240620","content":[{"type":"text","text":"Hello! How can I assist you today? Is there anything specific you'd like to talk about or any questions you have?"}],"stop_reason":"end_turn","stop_sequence":null,"usage":{"input_tokens":10,"output_tokens":28}}
Well, I'm truly at a loss then. I've been unable to reproduce this by any easy means, and I don't want to push the issue much further for fear of affecting my own access to their API.
If I create a second API key and use it, it works. The first API key, however, functions flawlessly via cURL.
Strange... is it possible they're using a weird character in your first key that isn't being escaped properly?
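One quick way to rule that out: scan the key string for anything outside printable ASCII. This is a hypothetical diagnostic helper (not part of any pipeline) for catching invisible characters picked up during copy-paste:

```python
def find_suspect_chars(key: str):
    """Return (index, repr) pairs for characters that are not printable ASCII.

    Stray newlines, tabs, or non-breaking spaces are valid in a Python
    string but can corrupt an HTTP header value like x-api-key.
    """
    return [(i, repr(c)) for i, c in enumerate(key)
            if not (c.isascii() and c.isprintable())]

# A key that looks fine on screen but carries an NBSP and a trailing newline:
bad_key = "sk-ant-test\u00a0key\n"
print(find_suspect_chars(bad_key))  # flags positions 11 and 15
```

An empty result means the key at least contains no hidden characters; it says nothing about whether the key is valid server-side.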
Sometimes the key works, and after some time the same key stops working.
I occasionally see the same issue with a working key. It seems reproducible when I restart the pipelines container. If I go into Open WebUI > Pipelines and select anthropic_manifold_pipeline (manifold), the saved API key loads up. After pressing save without changing anything, it starts working as it should. The mounted appdata folder has the correct file/folder permissions and ownership.
PS: I have to do the same steps for openai_manifold_pipeline (manifold) for the OpenAI models to load and become usable.
Now there's a clue I can work on. Thanks @realies, I'll see if this new information uncovers anything.
I am also experiencing this same error. With curl, the key works fine before I use it with Pipelines, but after I use it in Open WebUI, like a banshee, the key is marked for death (figuratively speaking) and will unceremoniously expire in <24 hours. After this, not even curl can revive the key. I have made dozens of new keys trying to get around this issue and I have not been banned from the API or anything, but my keys are dropping like flies.
This is the error that happens after a key has been cursed:
{
"type": "error",
"error": {
"type": "authentication_error",
"message": "invalid x-api-key"
}
}
If you're paying customers, have any of you tried contacting Anthropic's support to inquire as to the reason your keys are being revoked?
I have, and Anthropic took the entire month to respond (thanks, Claude 3.5). Today they're asking what the header sent from the app looks like, and I don't really know how to find that out.
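For anyone else who needs to answer that question: with requests you can build the exact request without sending it and inspect the headers that would go over the wire. A sketch ("PLACEHOLDER-KEY" obviously stands in for a real key):

```python
import requests

# Build the same POST the pipeline would make, but only *prepare* it --
# prepare() computes the final headers and body without sending anything.
req = requests.Request(
    "POST",
    "https://api.anthropic.com/v1/messages",
    headers={
        "x-api-key": "PLACEHOLDER-KEY",
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
    json={
        "model": "claude-3-5-sonnet-20240620",
        "max_tokens": 16,
        "messages": [{"role": "user", "content": "ping"}],
    },
)
prepared = req.prepare()

# repr() makes any invisible characters in the header value show up.
print(repr(prepared.headers["x-api-key"]))
```

Comparing this output against what the API expects is roughly what Anthropic's support seems to be asking for.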
Thanks for the additional datapoint @Digit404. I've no intention of closing this until I figure out what the heck is going on 😅
While this issue is being looked at, anyone who wants to bypass the problem temporarily can use the following Anthropic pipeline, which calls the HTTP API directly and is currently working without any invalid x-api-key errors:
"""
title: Anthropic Manifold HTTP Pipeline
author: sriparashiva
date: 2024-07-27
version: 1.0
license: MIT
description: A pipeline for generating text and processing images using the Anthropic HTTP API.
requirements: requests, sseclient-py
environment_variables: ANTHROPIC_API_KEY
"""
import os
from typing import List, Union, Generator, Iterator
from pydantic import BaseModel
import requests
from utils.pipelines.main import pop_system_message
import requests
import sseclient
from typing import Generator
import json
class Pipeline:
class Valves(BaseModel):
ANTHROPIC_API_KEY: str = ""
def __init__(self):
self.type = "manifold"
self.id = "anthropic"
self.name = "anthropic/"
self.valves = self.Valves(
**{"ANTHROPIC_API_KEY": os.getenv("ANTHROPIC_API_KEY", "your-api-key-here")}
)
self.api_key = self.valves.ANTHROPIC_API_KEY
self.url = 'https://api.anthropic.com/v1/messages'
self.headers = {
'anthropic-version': '2023-06-01',
'content-type': 'application/json',
'x-api-key': self.api_key
}
def get_anthropic_models(self):
return [
{"id": "claude-3-haiku-20240307", "name": "claude-3-haiku"},
{"id": "claude-3-opus-20240229", "name": "claude-3-opus"},
{"id": "claude-3-sonnet-20240229", "name": "claude-3-sonnet"},
{"id": "claude-3-5-sonnet-20240620", "name": "claude-3.5-sonnet"},
]
async def on_startup(self):
print(f"on_startup:{__name__}")
pass
async def on_shutdown(self):
print(f"on_shutdown:{__name__}")
pass
async def on_valves_updated(self):
self.api_key = self.valves.ANTHROPIC_API_KEY
self.headers = {
'anthropic-version': '2023-06-01',
'content-type': 'application/json',
'x-api-key': self.api_key
}
pass
def pipelines(self) -> List[dict]:
return self.get_anthropic_models()
def process_image(self, image_data):
if image_data["url"].startswith("data:image"):
mime_type, base64_data = image_data["url"].split(",", 1)
media_type = mime_type.split(":")[1].split(";")[0]
return {
"type": "image",
"source": {
"type": "base64",
"media_type": media_type,
"data": base64_data,
},
}
else:
return {
"type": "image",
"source": {"type": "url", "url": image_data["url"]},
}
def pipe(
self, user_message: str, model_id: str, messages: List[dict], body: dict
) -> Union[str, Generator, Iterator]:
try:
# Remove unnecessary keys
for key in ['user', 'chat_id', 'title']:
body.pop(key, None)
system_message, messages = pop_system_message(messages)
processed_messages = []
image_count = 0
total_image_size = 0
for message in messages:
processed_content = []
if isinstance(message.get("content"), list):
for item in message["content"]:
if item["type"] == "text":
processed_content.append({"type": "text", "text": item["text"]})
elif item["type"] == "image_url":
if image_count >= 5:
raise ValueError("Maximum of 5 images per API call exceeded")
processed_image = self.process_image(item["image_url"])
processed_content.append(processed_image)
if processed_image["source"]["type"] == "base64":
image_size = len(processed_image["source"]["data"]) * 3 / 4
else:
image_size = 0
total_image_size += image_size
if total_image_size > 100 * 1024 * 1024:
raise ValueError("Total size of images exceeds 100 MB limit")
image_count += 1
else:
processed_content = [{"type": "text", "text": message.get("content", "")}]
processed_messages.append({"role": message["role"], "content": processed_content})
# Prepare the payload
payload = {
"model": model_id,
"messages": processed_messages,
"max_tokens": body.get("max_tokens", 4096),
"temperature": body.get("temperature", 0.8),
"top_k": body.get("top_k", 40),
"top_p": body.get("top_p", 0.9),
"stop_sequences": body.get("stop", []),
**({"system": str(system_message)} if system_message else {}),
"stream": body.get("stream", False),
}
if body.get("stream", False):
return self.stream_response(model_id, payload)
else:
return self.get_completion(model_id, payload)
except Exception as e:
return f"Error: {e}"
def stream_response(self, model_id: str, payload: dict) -> Generator:
response = requests.post(self.url, headers=self.headers, json=payload, stream=True)
if response.status_code == 200:
client = sseclient.SSEClient(response)
for event in client.events():
event_data = event.data
try:
event_json = json.loads(event_data)
if event_json['type'] == "content_block_start":
yield event_json['content_block']['text']
elif event_json['type'] == "content_block_delta":
yield event_json['delta']['text']
except ValueError:
# Handle any JSON decoding errors if necessary
print(f"Unable to decode JSON: {event_data}")
raise ValueError
else:
raise Exception(f"Error: {response.status_code} - {response.text}")
def get_completion(self, model_id: str, payload: dict) -> str:
response = requests.post(self.url, headers=self.headers, json=payload)
if response.status_code == 200:
response_json = response.json()
# Assuming we're interested in the first text content block.
return response_json['content'][0]['text']
else:
raise Exception(f"Error: {response.status_code} - {response.text}")
@sriparashiva ironically that's exactly what I was planning to do to "resolve" this if I had to, since using requests has been working just fine on my Function version of this: https://openwebui.com/f/justinrahb/anthropic
Hi, thank you. I downloaded your script and uploaded it through the Open WebUI interface. Afterwards, the file is immediately moved to the "failed" subfolder within the pipelines folder. In the pipelines Docker logs, I see the following:
WARNING:root:No Pipeline class found in Anthropic
Error loading module: Anthropic
No module named 'sseclient'
@dannykorpan Since the pipeline has an additional dependency, sseclient-py, you currently need to install it manually in the pipelines Python environment (I couldn't find any existing documentation showing how to have the dependencies of a custom pipeline auto-installed).
I am using a Docker setup for pipelines, so I just executed pip3 install sseclient-py inside the pipelines Docker container. That should make it work.
Alternatively, you can use the code snippet from @justinh-rahb's linked custom function, which does not use any external dependencies.
Your workaround works :-) Thank you!
@dannykorpan @Digit404 @sriparashiva please give this PR a try:
- #179
If you install it by putting the raw URL into your PIPELINES_URLS environment variable, the server will pull it and install the sseclient-py dependency automatically on startup:
https://raw.githubusercontent.com/justinh-rahb/open-webui-pipelines/anthropic-fix/examples/pipelines/providers/anthropic_manifold_pipeline.py
Thank you, but still get the following error after importing into pipelines:
WARNING:root:No Pipeline class found in anthropic_manifold_pipeline
https://raw.githubusercontent.com/justinh-rahb/open-webui-pipelines/anthropic-fix/examples/pipelines/providers/anthropic_manifold_pipeline.py
Error loading module: anthropic_manifold_pipeline
No module named 'sseclient'
@dannykorpan As noted, the dependency only installs on server startup if the pipeline is already in the pipelines/ dir or you've added the URL to PIPELINES_URLS. If you attempt to upload the pipeline from Admin Settings > Pipelines, you'll get exactly that error.
Thank you, but I think this is a potential source of error.
If I restart the server, the sseclient module still isn't installed automatically; I always have to install it manually, even after a restart. Or am I misunderstanding something?
You are misunderstanding something, yes. The frontmatter at the top of the file specifies which dependencies it uses:
requirements: requests, sseclient-py
When the server is started (or restarted) with start.sh, it loops through all of the installed pipelines and will pip install their requirements:
https://github.com/open-webui/pipelines/blob/cd6c092a531f886810f2789066a628907e1d2478/start.sh#L85-L105
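In other words, start.sh reads each pipeline's frontmatter for a requirements: line and pip-installs what it finds. Roughly the same logic in Python (an illustrative re-implementation, not the actual script):

```python
def frontmatter_requirements(source: str):
    """Extract the package list from a pipeline's 'requirements:' frontmatter line."""
    for line in source.splitlines():
        line = line.strip()
        if line.startswith("requirements:"):
            # Everything after the colon is a comma-separated package list.
            return [pkg.strip() for pkg in line.split(":", 1)[1].split(",")]
    return []

example = '"""\ntitle: Anthropic Manifold HTTP Pipeline\nrequirements: requests, sseclient-py\n"""'
print(frontmatter_requirements(example))  # ['requests', 'sseclient-py']
```

The resulting list is what gets handed to pip install on server startup.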
But I start the pipelines server via docker.
But I start the pipelines server via docker.
Two options:
Bind mount the pipelines volume:
docker run -d \
-p 9099:9099 \
--add-host=host.docker.internal:host-gateway \
-v /home/your-username/pipelines:/app/pipelines \
--name pipelines \
--restart always \
ghcr.io/open-webui/pipelines:main
You can then drop the anthropic_manifold_pipeline.py file in the pipelines dir, restart the container, and it should install.
Alternatively, use the PIPELINES_URLS environment variable:
docker run -d \
-p 9099:9099 \
--add-host=host.docker.internal:host-gateway \
-v pipelines:/app/pipelines \
-e PIPELINES_URLS="github_raw_urls;separated_by_semicolons" \
--name pipelines \
--restart always \
ghcr.io/open-webui/pipelines:main
This should install the dependencies immediately after the pipeline is downloaded when the container starts.
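Either way, you can confirm from inside the container that the install actually happened (sseclient-py provides the sseclient module). A minimal check:

```python
import importlib.util

def has_module(name: str) -> bool:
    """True if the named module can be imported in this environment."""
    return importlib.util.find_spec(name) is not None

# After a successful startup install of sseclient-py, this should print True
# when run inside the pipelines container:
print(has_module("sseclient"))
```

If it prints False, the requirements step didn't run, and the pipeline will fail to load with "No module named 'sseclient'".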
Works, thank you!