jupyter-ai
Use the `/learn` command under the Azure OpenAI configuration.
Problem
I have set the parameters for Azure OpenAI on the configuration page:

- Language model: Azure OpenAI
- Deployment name: gpt-35-turbo-16k
- Base API URL: https://xxxxxxx.openai.azure.com/
- API version: 2023-06-01-preview
- API key: xxxxxxxxxxxxxxxxxxxxxxx

I am able to chat, but I cannot use the `/learn` command: it prompts me to configure an embedding model. I selected the embedding model `OpenAI::text-embedding-ada-002`, but when I run `/learn` it reports that the API key is incorrect.
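My reading of the failure (an assumption, not verified against the Jupyter-AI code) is that the `OpenAI::text-embedding-ada-002` provider builds a plain OpenAI embeddings client, which talks to api.openai.com and authenticates with `OPENAI_API_KEY`, so the Azure endpoint and key are never used. A minimal standalone sketch of that behaviour with LangChain:

from langchain_openai import OpenAIEmbeddings

# The non-Azure embeddings client targets api.openai.com; an Azure OpenAI key
# is not valid there, which would produce the "incorrect api-key" error above.
embeddings = OpenAIEmbeddings(
    model="text-embedding-ada-002",
    openai_api_key="<azure key>",  # placeholder: only Azure credentials are available
)
# embeddings.embed_query("hello")  # rejected: authentication fails against api.openai.com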
Proposed Solution
- Add the configuration option for Azure's embedding model.
- Allow users to configure the base URL and API key in the settings interface. This would be beneficial: many third-party projects integrate OpenAI or Azure OpenAI (covering GPT-3.5, GPT-4, and the embedding models) simply by letting users set a base URL and an API key, so the same option in the Jupyter-AI configuration interface would cover these cases (see the sketch after this list).
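For concreteness, a minimal sketch of what the proposed Azure embedding configuration would need to carry through to the client. The parameter names below are taken from `langchain_openai.AzureOpenAIEmbeddings` and are an illustration, not an existing Jupyter-AI option:

from langchain_openai import AzureOpenAIEmbeddings

# The same values already used for the Azure chat model would configure embeddings.
embeddings = AzureOpenAIEmbeddings(
    azure_endpoint="https://xxxxxxx.openai.azure.com/",  # Base API URL
    api_key="xxxxxxxxxxxxxxxxxxxxxxx",                   # API key (AZURE_OPENAI_API_KEY)
    api_version="2023-06-01-preview",                    # API version
    azure_deployment="text-embedding-ada-002",           # embedding deployment name
)
# vectors = embeddings.embed_documents(["sample text for /learn"])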
Additional context
Thank you for opening your first issue in this project! Engagement like this is essential for open source projects! :hugs:
If you haven't done so already, check out Jupyter's Code of Conduct. Also, please try to follow the issue template as it helps other community members to contribute more effectively.
You can meet the other Jovyans by joining our Discourse forum. There is also an intro thread there where you can stop by and say Hi! :wave:
Welcome to the Jupyter community! :tada:
Also stumbling on to the same issue. Did you resolve it, @VectorZhao? Or is it still a feature request?
I'm in the same situation. Any suggestions or recommendations regarding this?
Context: I have Azure OpenAI configured as the language provider, but it currently lacks support for any embedding models. It would be beneficial to have an option to configure an embedding model specifically for Azure OpenAI.
Hi! Is there anything I can do (e.g., testing) to help facilitate this? Azure OpenAI is a large provider and the only applicable one for many users, and I would highly appreciate any effort toward enabling it in jupyter-ai! :)
I tried to add AzureOpenAIEmbeddings from langchain_openai in ./jupyter-ai/packages/jupyter-ai-magics/jupyter_ai_magics/partner_providers/openai.py:

from langchain_openai import AzureOpenAIEmbeddings

# BaseEmbeddingsProvider, EnvAuthStrategy, and TextField come from jupyter_ai_magics
# and need to be importable in this module.
class AzureOpenAIEmbeddingsProvider(BaseEmbeddingsProvider, AzureOpenAIEmbeddings):
    id = "azure"
    name = "AzureOpenAI"
    models = [
        "text-embedding-ada-002",
        "text-embedding-3-small",
        "text-embedding-3-large",
    ]
    model_id_key = "model"
    pypi_package_deps = ["langchain_openai"]
    auth_strategy = EnvAuthStrategy(
        name="AZURE_OPENAI_API_KEY", keyword_param="openai_api_key"
    )
    registry = True
    fields = [
        TextField(key="azure_endpoint", label="Base API URL (required)", format="text"),
        TextField(key="api_version", label="API version (required)", format="text"),
    ]
Together with the change above, I also added a line to ./jupyter-ai/packages/jupyter-ai-magics/pyproject.toml:
azure = "jupyter_ai_magics.partner_providers.openai:AzureOpenAIEmbeddingsProvider"
However, I couldn't get it to show up in the jupyter-ai UI after running jlpm build and jlpm dev. I am not familiar with extension development. I would really appreciate it if anyone could advise on how I can resolve this.
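One thing I have not ruled out (the entry-point group name below is my assumption, taken from the other embedding providers already registered in that pyproject.toml): whether the new entry point is actually registered at all. Since the provider is declared in pyproject.toml, it should only take effect after reinstalling the Python package (e.g. `pip install -e .` in jupyter-ai-magics) and restarting JupyterLab; a frontend jlpm build alone would not pick it up. A quick check:

from importlib.metadata import entry_points

# List the embedding providers discoverable via entry points (Python 3.10+ API).
for ep in entry_points(group="jupyter_ai.embeddings_model_providers"):
    print(ep.name, "->", ep.value)
# "azure -> jupyter_ai_magics.partner_providers.openai:AzureOpenAIEmbeddingsProvider"
# should appear here; if it does not, the edited pyproject.toml has not been (re)installed.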
@VectorZhao / @bjornarfjelldal / @hojunhao / @eazuman / @JasonWeill / @dlqqq Can you please review https://github.com/jupyterlab/jupyter-ai/pull/940 ?
Fixed by PR #940