
AzureOpenAIEmbeddingSkill not working: Web Api response status: 'BadRequest', Web Api response details: '{ "error": { "message": "Too many inputs. The max number of inputs is 1. We hope to increase the number of inputs per request soon. Please contact us through an Azure support request at: https://go.microsoft.com/fwlink/?linkid=2213926 for further questions.", "type": "invalid_request_error", "param": null, "code": null } }'

BijitDey opened this issue 9 months ago • 2 comments

I am unable to use the AzureOpenAIEmbeddingSkill in AI Search to embed (vectorize) a certain node of a JSON file (a file containing an array of JSON objects) and store it in an index search field, using an indexer defined as follows:

```json
{
  "name": "new-indexer",
  "description": null,
  "dataSourceName": "bijitgitrag",
  "skillsetName": "bijit-vector-skillset",
  "targetIndexName": "new-index",
  "disabled": null,
  "schedule": null,
  "parameters": {
    "batchSize": null,
    "maxFailedItems": null,
    "maxFailedItemsPerBatch": null,
    "base64EncodeKeys": null,
    "configuration": {
      "parsingMode": "jsonArray",
      "allowSkillsetToReadFileData": false
    }
  },
  "fieldMappings": [],
  "outputFieldMappings": [
    {
      "sourceFieldName": "/document/embedding/*",
      "targetFieldName": "create_vector"
    }
  ],
  "cache": null,
  "encryptionKey": null
}
```

and the corresponding skillset defined as follows:

```json
"skills": [
  {
    "@odata.type": "#Microsoft.Skills.Text.AzureOpenAIEmbeddingSkill",
    "name": "#1",
    "description": null,
    "context": "/document/create_description",
    "resourceUri": "https://abc.openai.azure.com",
    "apiKey": "##########",
    "deploymentId": "ada-text-embeddings-002",
    "inputs": [
      { "name": "text", "source": "/document/create_description" }
    ],
    "outputs": [
      { "name": "embedding" }
    ],
    "authIdentity": null
  }
]
```


BijitDey avatar Apr 29 '24 11:04 BijitDey

@BijitDey Thanks for your feedback! We will investigate and update as appropriate.

PesalaPavan avatar Apr 29 '24 15:04 PesalaPavan

Hi @BijitDey, from the error it looks like the problem is with your input definition, and the context doesn't look right either. I can't provide tech support on this channel, but are you breaking the document into chunks? You might need to use the split skill first.
I recommend posting to Stack Overflow. I see other posts related to the AzureOpenAIEmbeddingSkill, and while not specifically related to your issue, the poster's context and inputs look correct, so maybe this post will unblock you: https://stackoverflow.com/questions/78079660/azure-ai-search-azureopenaiembeddingskill-concurrency-conflict-during-operation
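As a minimal sketch of the chunk-then-embed pattern suggested above: a Text Split skill writes chunks to a collection, and the embedding skill runs once per chunk. The `pages` target name, split parameters, and skill names below are illustrative assumptions, not values from the original post:

```json
"skills": [
  {
    "@odata.type": "#Microsoft.Skills.Text.SplitSkill",
    "name": "#1",
    "description": "Split create_description into chunks before embedding",
    "context": "/document",
    "textSplitMode": "pages",
    "maximumPageLength": 2000,
    "inputs": [
      { "name": "text", "source": "/document/create_description" }
    ],
    "outputs": [
      { "name": "textItems", "targetName": "pages" }
    ]
  },
  {
    "@odata.type": "#Microsoft.Skills.Text.AzureOpenAIEmbeddingSkill",
    "name": "#2",
    "context": "/document/pages/*",
    "resourceUri": "https://abc.openai.azure.com",
    "apiKey": "##########",
    "deploymentId": "ada-text-embeddings-002",
    "inputs": [
      { "name": "text", "source": "/document/pages/*" }
    ],
    "outputs": [
      { "name": "embedding" }
    ]
  }
]
```

Because the embedding skill's context is the `*` path, it fires once per chunk and each request carries a single input, which avoids the "Too many inputs. The max number of inputs is 1" error.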

HeidiSteen avatar Apr 29 '24 17:04 HeidiSteen

> Hi @BijitDey, from the error, it looks like the problem is with your input definition and also the context doesn't look right. I can't provide tech support on this channel, but are you parceling the document into chunks? You might need to use the split skill first. I recommend posting to Stack Overflow. I see other posts related to AzureOpenAiEmbedding skill, and while not specifically related to your issue, it looks like the poster's context and inputs are correct, so maybe this post will unblock you: https://stackoverflow.com/questions/78079660/azure-ai-search-azureopenaiembeddingskill-concurrency-conflict-during-operation

Thank you @HeidiSteen. Can you be specific about what is wrong in the input definition and the context, so I can understand and make the required corrections?

BijitDey avatar Apr 30 '24 05:04 BijitDey

@BijitDey, I was wrong about the context. I thought I saw "create_description" in that path, but I now see that isn't the case. Another thing that caught my eye: if you're chunking documents, your skill input syntax needs an `*` in the path. Setting all that aside, I did a web search on your error, and it's coming from Azure OpenAI, although OpenAI seems to return a version of that error too.
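For illustration only, here is what an `*`-based path might look like if chunks were written to a collection (the `pages` collection name is an assumption, not from the original post). The skill's context and input both carry the `*`, and the indexer's output field mapping points at the per-chunk output path:

```json
{
  "context": "/document/pages/*",
  "inputs": [
    { "name": "text", "source": "/document/pages/*" }
  ],
  "outputs": [
    { "name": "embedding" }
  ]
}
```

In the indexer, the corresponding `outputFieldMappings` source would then be `"/document/pages/*/embedding"` rather than `"/document/embedding/*"`.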

There are some links at the bottom of the article that offer guidance if you run into token limits. Can you take a look at those for your next step?

#please-close

HeidiSteen avatar Apr 30 '24 14:04 HeidiSteen