Cannot load an AzureOpenAIGenerator without installing torch
Describe the bug
When you initialize AzureOpenAIGenerator without having torch installed, it causes a runtime error.
Error message
```
The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "c:\Users\xxx\test.py", line 9, in <module>
    from haystack.components.generators import AzureOpenAIGenerator
  File "C:\Users\yyy\AppData\Local\Programs\Python\Python311\Lib\site-packages\haystack\components\generators\__init__.py", line 5, in <module>
    from haystack.components.generators.hugging_face_local import HuggingFaceLocalGenerator
  File "C:\Users\yyy\AppData\Local\Programs\Python\Python311\Lib\site-packages\haystack\components\generators\hugging_face_local.py", line 14, in <module>
    from transformers import StoppingCriteriaList, pipeline
  File "<frozen importlib._bootstrap>", line 1231, in _handle_fromlist
  File "C:\Users\yyy\AppData\Local\Programs\Python\Python311\Lib\site-packages\transformers\utils\import_utils.py", line 1372, in __getattr__
    module = self._get_module(self._class_to_module[name])
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\yyy\AppData\Local\Programs\Python\Python311\Lib\site-packages\transformers\utils\import_utils.py", line 1384, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.pipelines because of the following error (look up to see its traceback):
No module named 'torch'
```
Expected behavior
Using an AzureOpenAIGenerator, which calls external APIs, should not require you to install torch. So it should work without torch.
To Reproduce
Clean install of Python. Install haystack-ai and NOT torch.
Run this:
```python
BASE_URL = "https://xxx.openai.azure.com/"
API_KEY = "test"
DEPLOYMENT_NAME = "test"

import os

os.environ["AZURE_OPENAI_API_KEY"] = API_KEY

from haystack.components.generators import AzureOpenAIGenerator

llm = AzureOpenAIGenerator(azure_endpoint=BASE_URL, azure_deployment=DEPLOYMENT_NAME)
response = llm.run("What's Natural Language Processing? Be brief.")
print(response)
```
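For reference, a minimal sketch of the same failure mode (assuming an environment where transformers is installed but torch is not): the traceback above originates from the import statement itself, before any Azure credentials or generator call are involved.

```python
# Minimal trigger (sketch, assuming transformers is installed without torch):
# the package-level import alone raises the RuntimeError shown above,
# before any Azure credentials or generator call come into play.
from haystack.components.generators import AzureOpenAIGenerator  # noqa: F401
```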
FAQ Check
- [x] Have you had a look at our new FAQ page?
System:
- OS: Windows
- GPU/CPU: CPU only
- Haystack version (commit or version number): 2.0.0
- DocumentStore: Not used
- Reader: Not used
- Retriever: Not used
Hello!
I can't reproduce this issue with `haystack-ai==2.0.0` in a clean env.
How are you installing `haystack-ai`? Please share some more details.
Hello,
I think the issue occurs when you have transformers installed, but not transformers[torch].
From what I see in the error log, when we import AzureOpenAIGenerator, it triggers https://github.com/deepset-ai/haystack/blob/v2.0.x/haystack/components/generators/__init__.py, which imports HuggingFaceLocalGenerator. And in https://github.com/deepset-ai/haystack/blob/v2.0.x/haystack/components/generators/hugging_face_local.py we have:

```python
with LazyImport(message="Run 'pip install transformers[torch]'") as transformers_import:
    from huggingface_hub import model_info
    from transformers import StoppingCriteriaList, pipeline

    from haystack.utils.hf import StopWordsCriteria  # pylint: disable=ungrouped-imports
```

So just using AzureOpenAIGenerator triggers this loading of torch, which we do not need.
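To make the failure mode concrete, here is an illustrative sketch (a simplified stand-in, not Haystack's actual LazyImport implementation): a guard that only swallows ImportError will not stop the RuntimeError that transformers raises when torch is missing, which matches the traceback above.

```python
# Illustrative sketch only -- a simplified stand-in for a lazy-import guard,
# not Haystack's real LazyImport. It shows why the error still surfaces:
# transformers wraps the missing-torch failure in a RuntimeError, which an
# ImportError-only guard does not catch.
from contextlib import contextmanager


@contextmanager
def lazy_import(message: str):
    try:
        yield
    except ImportError as exc:  # ModuleNotFoundError is a subclass of ImportError
        print(f"Deferred optional import: {message} ({exc})")


with lazy_import("Run 'pip install transformers[torch]'"):
    # With transformers installed but torch missing, resolving `pipeline`
    # goes through transformers' lazy module loader, which raises
    # RuntimeError("Failed to import transformers.pipelines ...").
    # That RuntimeError is not an ImportError, so it propagates.
    from transformers import StoppingCriteriaList, pipeline  # noqa: F401
```

Under these assumptions, the exception escapes the guard and aborts the top-level import of haystack.components.generators.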
I created a clean virtual environment.
I installed haystack with `pip install --no-cache-dir haystack-ai`.
Then I executed your code, which correctly fails at generation time with:

```
openai.AuthenticationError: Error code: 401 - {'statusCode': 401, 'message': 'Unauthorized. Access token is missing, invalid, audience is incorrect (https://cognitiveservices.azure.com), or have expired.'}
```
So I can't reproduce your issue. :thinking:
Hi @paulmartrencharpro, I tried your code in Colab and I can't replicate the error. I use AzureOpenAIGenerator in Docker and I don't have this problem.
The torch dependency comes from transformers, not from the Azure OpenAI API.
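A quick way to check this for yourself (a sketch, assuming a standard haystack-ai install): inspect sys.modules around the import. If torch is involved at all, it enters via transformers; the OpenAI/Azure client itself never pulls it in.

```python
# Sketch: confirm where torch would enter the picture. Without transformers
# installed, the import succeeds and torch never appears in sys.modules;
# with transformers present but torch absent, the RuntimeError from the
# traceback above is raised by transformers' lazy loader, not by the
# OpenAI/Azure client.
import sys
import traceback

try:
    from haystack.components.generators import AzureOpenAIGenerator  # noqa: F401
except RuntimeError:
    traceback.print_exc()

print("openai loaded:", "openai" in sys.modules)
print("transformers loaded:", "transformers" in sys.modules)
print("torch loaded:", "torch" in sys.modules)
```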
Irreproducible. Feel free to reopen if the issue persists.