[Bug]: Error Invoking LLM: "The response was filtered due to the prompt triggering Azure OpenAI's content management policy. Please modify your prompt and retry"
### Do you need to file an issue?
- [x] I have searched the existing issues and this bug is not already filed.
- [ ] My model is hosted on OpenAI or Azure. If not, please look at the "model providers" issue and don't file a new one here.
- [x] I believe this is a legitimate bug, not just a question. If this is a question, please use the Discussions area.
### Describe the bug
{ "type": "error", "data": "Error Invoking LLM", "stack": "Traceback (most recent call last):\n File "C:\Users\17334\anaconda3\envs\graphrag\lib\site-packages\fnllm\base\base_llm.py", line 144, in call\n return await self._decorated_target(prompt, **kwargs)\n File "C:\Users\17334\anaconda3\envs\graphrag\lib\site-packages\fnllm\base\services\json.py", line 78, in invoke\n return await delegate(prompt, **kwargs)\n File "C:\Users\17334\anaconda3\envs\graphrag\lib\site-packages\fnllm\base\services\cached.py", line 115, in invoke\n result = await delegate(prompt, **kwargs)\n File "C:\Users\17334\anaconda3\envs\graphrag\lib\site-packages\fnllm\base\services\rate_limiter.py", line 75, in invoke\n result = await delegate(prompt, **args)\n File "C:\Users\17334\anaconda3\envs\graphrag\lib\site-packages\fnllm\base\base_llm.py", line 126, in _decorator_target\n output = await self._execute_llm(prompt, kwargs)\n File "C:\Users\17334\anaconda3\envs\graphrag\lib\site-packages\fnllm\openai\llm\openai_text_chat_llm.py", line 166, in _execute_llm\n completion = await self._client.chat.completions.create(\n File "C:\Users\17334\anaconda3\envs\graphrag\lib\site-packages\openai\resources\chat\completions\completions.py", line 1927, in create\n return await self._post(\n File "C:\Users\17334\anaconda3\envs\graphrag\lib\site-packages\openai\_base_client.py", line 1767, in post\n return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)\n File "C:\Users\17334\anaconda3\envs\graphrag\lib\site-packages\openai\_base_client.py", line 1461, in request\n return await self._request(\n File "C:\Users\17334\anaconda3\envs\graphrag\lib\site-packages\openai\_base_client.py", line 1562, in _request\n raise self._make_status_error_from_response(err.response) from None\nopenai.BadRequestError: Error code: 400 - {'error': {'message': "The response was filtered due to the prompt triggering Azure OpenAI's content management policy. Please modify your prompt and retry. 
To learn more about our content filtering policies please read our documentation: https://go.microsoft.com/fwlink/?linkid=2198766 (request id: 20250310230949191144114y9XhQGWZ)", 'type': '', 'param': 'prompt', 'code': 'content_filter'}}\n", "source": "Error code: 400 - {'error': {'message': "The response was filtered due to the prompt triggering Azure OpenAI's content management policy. Please modify your prompt and retry. To learn more about our content filtering policies please read our documentation: https://go.microsoft.com/fwlink/?linkid=2198766 (request id: 20250310230949191144114y9XhQGWZ)", 'type': '', 'param': 'prompt', 'code': 'content_filter'}}", "details": { "prompt": "\nYou are a helpful assistant responsible for generating a comprehensive summary of the data provided below.\nGiven one or two entities, and a list of descriptions, all related to the same entity or group of entities.\nPlease concatenate all of these into a single, comprehensive description. Make sure to include information collected from all the descriptions.\nIf the provided descriptions are contradictory, please resolve the contradictions and provide a single, coherent summary.\nMake sure it is written in third person, and include the entity names so we have the full context.\n\n#######\n-Data-\nEntities: ["SEA OF MARMARA", "MUCILAGE EVENT"]\nDescription List: ["The Sea of Marmara is the site of mucilage formation, an ecological challenge", "The Sea of Marmara was the location of the mucilage event in October 2007\nThe Sea of Marmara was the location of the dense mucilage formation in October 2007", "The mucilage event occurred in the Sea of Marmara in October 2007, severely impacting its ecosystem", "The mucilage event occurred in the Sea of Marmara in the autumn of 2007 and caused significant environmental and economic impact", "The mucilage event was a major ecological phenomenon in the Sea of Marmara, driven by diatom and dinoflagellate activity"]\n#######\nOutput:\n", "kwargs": { 
"name": "summarize", "model_parameters": { "max_tokens": 500 } } } }
### Steps to reproduce
No response
### Expected Behavior
No response
### GraphRAG Config Used
```yaml
# Paste your config here
```
### Logs and screenshots
No response
### Additional Information
- GraphRAG Version:
- Operating System:
- Python Version:
- Related Issues: