Error code: 422 - {'detail': [{'loc': ['body', 'user'], 'msg': 'field required', 'type': 'value_error.missing'}]}
I am trying to run the following code from the "LLM Extraction" section of the Colab demo: https://colab.research.google.com/drive/1SgRPrByQLzjRfwoRNq1wSGE9nYY_EE8C?usp=sharing
from crawl4ai import AsyncWebCrawler, CacheMode
from crawl4ai.extraction_strategy import LLMExtractionStrategy

# provider, api_token, extra_args and OpenAIModelFee are defined in earlier notebook cells
async with AsyncWebCrawler(verbose=True) as crawler:
    result = await crawler.arun(
        url="https://openai.com/api/pricing/",
        word_count_threshold=1,
        extraction_strategy=LLMExtractionStrategy(
            provider=provider,
            api_token=api_token,
            schema=OpenAIModelFee.schema(),
            extraction_type="schema",
            instruction="""Extract all model names along with their fees for input and output tokens.
            One extracted model should look like this:
            {model_name: 'GPT-4', input_fee: 'US$10.00 / 1M tokens', output_fee: 'US$30.00 / 1M tokens'}.""",
            **extra_args
        ),
        cache_mode=CacheMode.ENABLED
    )
I got this error:
[EXTRACT]. ■ Completed for https://openai.com/api/pricing/... | Time: 8.4229788130001s
[COMPLETE] ● https://openai.com/api/pricing/... | Status: True | Total: 14.31s
[{'index': 0, 'error': True, 'tags': ['error'], 'content': "litellm.BadRequestError: AzureException BadRequestError - Error code: 422 - {'detail': [{'loc': ['body', 'user'], 'msg': 'field required', 'type': 'value_error.missing'}]}"}]
I am using the Azure OpenAI API with an internal endpoint. I investigated the code, and I think the issue is that the "user" field is missing from the request body.
Normally, when I use Azure OpenAI with LangChain, I call the following code and it works perfectly:
import os
from langchain_openai import AzureChatOpenAI

os.environ["AZURE_API_KEY"] = "access token"
os.environ["AZURE_API_BASE"] = "https://customized.internal_endpoint.com"
os.environ["AZURE_API_VERSION"] = "2024-07-01-preview"

llm = AzureChatOpenAI(
    model="gpt-4o-mini",
    model_kwargs=dict(
        user=f'{{"appkey": "internal key", "user": "internal user_id"}}'
    )
)
I tried adding "user" to "extra_args" in the code above, but it didn't work. I looked into the crawl4ai code and cannot find any way to pass the "user" field through to LiteLLM. Even in the LiteLLM library itself, I couldn't find a way to supply the "user" field.
Could anyone take a look at whether there is any way to pass the "user" field to LiteLLM? Alternatively, could I use the Azure OpenAI API or LangChain's AzureChatOpenAI directly, without going through LiteLLM?
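For reference, a minimal standalone check outside crawl4ai would look like the sketch below. It assumes LiteLLM forwards the OpenAI-compatible "user" parameter to the Azure endpoint; the endpoint, key, and user values are placeholders for my internal setup:

# Standalone sanity check, assuming LiteLLM passes the OpenAI-compatible
# "user" parameter through to Azure; endpoint/key/user values are placeholders.
import os
import litellm

os.environ["AZURE_API_KEY"] = "access token"
os.environ["AZURE_API_BASE"] = "https://customized.internal_endpoint.com"
os.environ["AZURE_API_VERSION"] = "2024-07-01-preview"

response = litellm.completion(
    model="azure/gpt-4o-mini",
    messages=[{"role": "user", "content": "ping"}],
    user='{"appkey": "internal key", "user": "internal user_id"}',
)
print(response.choices[0].message.content)

If this call succeeds, the question becomes how to get crawl4ai to attach the same "user" value to its LiteLLM requests.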
@AugustusLZJ Thanks for trying Crawl4ai. Please show me the value of the provider, and I will replicate it on my side and let you know.
@unclecode Thank you for taking a look. The value of provider is "azure/gpt-4o-mini".
I guess the "user" here is the "user" field of the "Request Body" described at the following link: https://learn.microsoft.com/en-us/azure/ai-services/openai/reference
@AugustusLZJ Sure, I will check it this weekend for sure.
@unclecode Any news on this issue? Thanks.
@AugustusLZJ So many changes have happened in the library since this error was reported, and no Azure-related LLM issues have been reported in the recent releases, so I am closing this issue for now. Try the new LLMConfig; if the issue still persists, open a new issue with the required details.
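For anyone landing here later, a rough sketch of the LLMConfig-based call might look like the following. The class and parameter names should be checked against the current crawl4ai docs, and the provider/token values are placeholders for an internal Azure deployment:

# Sketch of the newer LLMConfig-based API (crawl4ai >= 0.5); verify names
# against the current docs. Provider and api_token are placeholders.
import asyncio
from pydantic import BaseModel
from crawl4ai import AsyncWebCrawler, CacheMode, CrawlerRunConfig, LLMConfig
from crawl4ai.extraction_strategy import LLMExtractionStrategy

class OpenAIModelFee(BaseModel):
    model_name: str
    input_fee: str
    output_fee: str

async def main():
    strategy = LLMExtractionStrategy(
        llm_config=LLMConfig(provider="azure/gpt-4o-mini", api_token="access token"),
        schema=OpenAIModelFee.schema(),
        extraction_type="schema",
        instruction="Extract all model names along with their input and output token fees.",
    )
    async with AsyncWebCrawler(verbose=True) as crawler:
        result = await crawler.arun(
            url="https://openai.com/api/pricing/",
            config=CrawlerRunConfig(
                extraction_strategy=strategy,
                cache_mode=CacheMode.ENABLED,
            ),
        )
        print(result.extracted_content)

asyncio.run(main())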