The error is occurring on the Groq configuration, but the Grok API key is working fine.
Traceback (most recent call last):
  File "D:\ai-research-suite\aisuite_ai.py", line 23, in <module>
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0.75
    )
  File "D:\ai-research-suite\venv\Lib\site-packages\aisuite\client.py", line 117, in create
    return provider.chat_completions_create(model_name, messages, **kwargs)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ai-research-suite\venv\Lib\site-packages\aisuite\providers\groq_provider.py", line 22, in chat_completions_create
    return self.client.chat.completions.create(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        model=model,
        ^^^^^^^^^^^^
        messages=messages,
        ^^^^^^^^^^^^^^^^^^
        **kwargs # Pass any additional arguments to the Groq API
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "D:\ai-research-suite\venv\Lib\site-packages\groq\resources\chat\completions.py", line 289, in create
    return self._post(
           ~~~~~~~~~~^
        "/openai/v1/chat/completions",
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        ...<31 lines>...
        stream_cls=Stream[ChatCompletionChunk],
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "D:\ai-research-suite\venv\Lib\site-packages\groq\_base_client.py", line 1225, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ai-research-suite\venv\Lib\site-packages\groq\_base_client.py", line 920, in request
    return self._request(
           ~~~~~~~~~~~~~^
        cast_to=cast_to,
        ^^^^^^^^^^^^^^^^
        ...<3 lines>...
        remaining_retries=remaining_retries,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "D:\ai-research-suite\venv\Lib\site-packages\groq\_base_client.py", line 1018, in _request
    raise self._make_status_error_from_response(err.response) from None
groq.AuthenticationError: Error code: 401 - {'error': {'message': 'Invalid API Key', 'type': 'invalid_request_error', 'code': 'invalid_api_key'}}
Please paste the full code showing how you are loading the keys and initializing the aisuite client.
from dotenv import load_dotenv
import os

# Load environment variables from .env file
load_dotenv()

# Access your API keys
openai_api_key = os.getenv("OPENAI_API_KEY")
# anthropic_api_key = os.getenv("ANTHROPIC_API_KEY")
groq_api_key = os.getenv("GROQ_API_KEY")

import aisuite as ai

client = ai.Client()

models = ["groq:grok-beta"]

messages = [
    {"role": "system", "content": "Respond in Pirate English."},
    {"role": "user", "content": "Tell me a joke."},
]

for model in models:
    response = client.chat.completions.create(
        model=model,
        messages=messages,
        temperature=0.75
    )
    print(response.choices[0].message.content)
Only 2 providers work for me: OpenAI and Ollama.
I see this for Anthropic and Groq:
python -B cls1.py
Traceback (most recent call last):
  File "/Users/cleesmith/aisuite/cls1.py", line 16, in <module>
    response = client.chat.completions.create(model="groq:llama-3.1-8b-instant", messages=messages)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cleesmith/aisuite/aisuite/client.py", line 108, in create
    self.client.providers[provider_key] = ProviderFactory.create_provider(
                                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cleesmith/aisuite/aisuite/provider.py", line 46, in create_provider
    return provider_class(**config)
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/cleesmith/aisuite/aisuite/providers/groq_provider.py", line 19, in __init__
    self.client = groq.Groq(**config)
                  ^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/aisuite/lib/python3.11/site-packages/groq/_client.py", line 99, in __init__
    super().__init__(
  File "/opt/miniconda3/envs/aisuite/lib/python3.11/site-packages/groq/_base_client.py", line 824, in __init__
    self._client = http_client or SyncHttpxClientWrapper(
                                  ^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/miniconda3/envs/aisuite/lib/python3.11/site-packages/groq/_base_client.py", line 722, in __init__
    super().__init__(**kwargs)
TypeError: Client.__init__() got an unexpected keyword argument 'proxies'
This happens for the following code, and for the example code in this repo ... both produce the same error:
# pip install 'aisuite[all]'
import aisuite as ai

client = ai.Client()

messages = [
    {"role": "system", "content": "You are a helpful agent, who answers with brevity."},
    {"role": "user", "content": "list planets"},
]

# response = client.chat.completions.create(model="anthropic:claude-3-haiku-20240307", messages=messages)
response = client.chat.completions.create(model="groq:llama-3.1-8b-instant", messages=messages)
print(response.choices[0].message.content)
It works for OpenAI and Ollama ... Groq is fast, free, and good for testing code, so it would be nice to have.
pip show aisuite
Name: aisuite
Version: 0.1.6
Summary: Uniform access layer for LLMs
Home-page:
Author: Andrew Ng
Author-email:
License:
Location: /opt/miniconda3/envs/aisuite/lib/python3.11/site-packages
Requires:
Required-by:
Related: #110
@baberibrar Based on the initial error, it doesn't look like the env is set properly. Do you have a .env file with the keys set? You would need GROQ_API_KEY set in that file.
I also do not see grok-beta listed as a model on their website (https://console.groq.com/docs/models); grok-beta is an xAI (Grok) model, not a Groq one. When I attempted to make a call using that model, I received a 404.
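As a quick sanity check (a minimal sketch, assuming python-dotenv is installed and the .env file sits in the working directory), you can verify the key is actually being picked up before creating the client:

# Minimal check that the key from .env is visible to the process.
# Assumes a .env file in the working directory containing a line
# like GROQ_API_KEY=... (placeholder, not a real key).
from dotenv import load_dotenv
import os

load_dotenv()  # copies entries from .env into os.environ

print("GROQ_API_KEY loaded:", bool(os.getenv("GROQ_API_KEY")))  # expect True

aisuite's Groq provider reads GROQ_API_KEY from the environment, so if this prints False, the 401 is expected.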
@cleesmith What version of the groq client are you using?
Groq is working fine for me. The following models are available for inference.
import os
import aisuite as ai

# Set the key before any Groq calls are made
os.environ['GROQ_API_KEY'] = "gsk_..."
aiclient = ai.Client()

user_input = "1+1= "

groq_models = [
    "gemma2-9b-it",
    "gemma-7b-it",
    "llama-3.1-70b-versatile",
    "llama-3.1-8b-instant",
    "llama-3.2-1b-preview",
    "llama-3.2-3b-preview",
    "llama-3.2-11b-vision-preview",
    "llama-3.2-90b-vision-preview",
    "llama-guard-3-8b",
    "llama3-70b-8192",
    "llama3-8b-8192",
    "mixtral-8x7b-32768",
]

for model in groq_models:
    print("model: ", model)
    try:
        # Stream the completion and print chunks as they arrive
        response = aiclient.chat.completions.create(
            model=f"groq:{model}",
            messages=[{"role": "user", "content": user_input}],
            stream=True,
        )
        for chunk in response:
            data = chunk.choices[0].delta.content
            if data is not None:
                print(data, end='')
    except Exception as e:
        print(str(e))
    print("\n")
@cleesmith I suspect your issue is the same as #110, which is caused by using httpx >= 0.28.0; that release removed the proxies keyword argument. If you pin your version to ~0.27.0, I would expect this to work for you. There is already a PR pinning the version to ~0.27.0. We'll need to do a round of updates on the dependencies for the Python clients, along with some testing, before we can move to >= 0.28.0.
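For anyone hitting this, a quick way to check whether you are on an affected httpx release (a minimal sketch; the pin shown in the comments matches the workaround above):

# Check the installed httpx version. httpx 0.28.0 removed the
# `proxies` keyword argument, which is what breaks the groq client here.
import httpx

print(httpx.__version__)
# If this prints 0.28.0 or newer, pin it back down, e.g.:
#   pip install "httpx~=0.27.0"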