open-parse
Adding support for Azure OpenAI.
Usage:

```python
embedding_client = processing.AzureOpenAIEmbeddings(
    api_key="xxxxxxxxxxxxxxxxxxxxxxxxxxx",
    api_endpoint="https://your-endpoint.openai.azure.com/",
    deployment="text-embedding-3-large",  # custom deployment names in Azure replace the 'model' param
    api_version="2024-02-15-preview",  # optional: defaults to 2024-02-15-preview
)

# For the SemanticIngestionPipeline, specifying "api_endpoint" redirects to AzureOpenAIEmbeddings
semantic_pipeline = processing.SemanticIngestionPipeline(
    api_key="xxxxxxxxxxxxxxxxxxxxxxxxxxx",
    api_endpoint="https://your-endpoint.openai.azure.com/",
    deployment="text-embedding-3-large",
    api_version="2024-02-15-preview",  # optional: defaults to 2024-02-15-preview
    min_tokens=64,
    max_tokens=1024,
)
```
Thanks for the PR - this would be a pretty large breaking change. If we're going to do that, I think the best approach is to pass an embedding client directly to the pipeline. That would allow us to support tons of different models long term.
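A minimal sketch of what that injection-based design could look like. All names here (`EmbeddingClient`, the `SemanticIngestionPipeline` constructor signature, `FakeEmbeddings`) are illustrative assumptions, not open-parse's actual API: the pipeline would accept any object satisfying a small embedding protocol, so Azure, OpenAI, or local models plug in without pipeline changes.

```python
from typing import Protocol


class EmbeddingClient(Protocol):
    """Hypothetical interface a pipeline could depend on."""

    def embed(self, texts: list[str]) -> list[list[float]]: ...


class FakeEmbeddings:
    """Stand-in client for illustration; a real one would call
    OpenAI, Azure OpenAI, or a local model under the hood."""

    def embed(self, texts: list[str]) -> list[list[float]]:
        # Toy "embedding": one dimension holding the text length.
        return [[float(len(t))] for t in texts]


class SemanticIngestionPipeline:
    """Sketch of a pipeline that takes a client instead of
    provider-specific parameters (api_key, api_endpoint, ...)."""

    def __init__(
        self,
        embedding_client: EmbeddingClient,
        min_tokens: int = 64,
        max_tokens: int = 1024,
    ):
        self.embedding_client = embedding_client
        self.min_tokens = min_tokens
        self.max_tokens = max_tokens

    def embed_chunks(self, chunks: list[str]) -> list[list[float]]:
        return self.embedding_client.embed(chunks)


pipeline = SemanticIngestionPipeline(embedding_client=FakeEmbeddings())
vectors = pipeline.embed_chunks(["hello", "world!"])
print(vectors)  # [[5.0], [6.0]]
```

Under this design, swapping providers only means constructing a different client; the pipeline's signature never changes.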