Python : #6499 Initial Commit for Mistral Connector
Motivation and Context
- Why is this change required? Issue #6499 in the backlog requested a MistralAI connector to enable Mistral models with Semantic Kernel.
- What problem does it solve? Semantic Kernel is not yet integrated with MistralAI.
- What scenario does it contribute to? Using a connector other than HuggingFace, OpenAI, or AzureOpenAI. Users who want Mistral can now integrate it easily.
- If it fixes an open issue, please link to the issue here. #6499
Description
The changes are modeled on the open_ai connector; I tried to stay as close as possible to its structure. For the integration I installed the `mistralai` Python package in the repository.
I added the following classes:
- MistralAIChatPromptExecutionSettings --> Manages the prompt execution settings sent to MistralAI
- MistralAIChatCompletion --> Coordinates the classes and handles content parsing
- MistralAISettings --> Basic settings for working with the MistralAI client
To run the tests against these changes, please set MISTRALAI_API_KEY as an environment variable.
As a design decision, I moved the processing of function calls from the individual connectors into the ChatCompletionClientBase class.
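For example, in a POSIX shell (the variable name comes from the testing instruction above):

```shell
# Export the key so the integration tests can pick it up
export MISTRALAI_API_KEY="your-api-key"
```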
What is integrated so far:
- [x] Integrate Mistral AI chat models without streaming
- [x] Integrate Mistral AI chat models with streaming
- [x] Simple integration test covering streaming and non-streaming
- [x] Integrate MistralAI embedding models
- [x] Integrate MistralAI function calling
- [x] Extended testing, including unit tests and more integration tests
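To illustrate the streaming vs. non-streaming distinction the checklist refers to, here is a generic sketch with a stubbed response generator (not the connector's actual code): a streaming completion yields partial content chunks that the caller concatenates into the full message, while a non-streaming call returns the message in one piece.

```python
import asyncio
from typing import AsyncIterator


async def fake_stream() -> AsyncIterator[str]:
    """Stub standing in for a streaming chat response from the service."""
    for chunk in ["Hel", "lo, ", "world!"]:
        yield chunk


async def complete_streaming() -> str:
    # Aggregate the partial chunks into the final assistant message,
    # mirroring what a streaming chat-completion caller does.
    parts: list[str] = []
    async for chunk in fake_stream():
        parts.append(chunk)
    return "".join(parts)


print(asyncio.run(complete_streaming()))  # → Hello, world!
```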
Contribution Checklist
- [ ] The code builds clean without any errors or warnings
- [ ] The PR follows the SK Contribution Guidelines and the pre-submission formatting script raises no violations
- [ ] All unit tests pass, and I have added new tests where possible
- [ ] I didn't break anyone :smile:
Python 3.10 Test Coverage Report

| File | Stmts | Miss | Cover | Missing |
|---|---|---|---|---|
| semantic_kernel/connectors/ai/open_ai/services/open_ai_chat_completion_base.py | 168 | 70 | 58% | 100, 104, 124, 149–153, 178, 186, 188, 192, 210–215, 233–261, 264–275, 290–297, 308–316, 332–339, 360, 368, 374–377, 389–392, 423 |
| semantic_kernel/functions/kernel_function_from_prompt.py | 154 | 7 | 95% | 165–166, 180, 200, 218, 238, 321 |
| TOTAL | 6686 | 781 | 88% | |
Python 3.10 Unit Test Overview
| Tests | Skipped | Failures | Errors | Time |
|---|---|---|---|---|
| 1567 | 1 :zzz: | 0 :x: | 0 :fire: | 18.502s :stopwatch: |
@microsoft-github-policy-service agree
Can you please comment on the following work item so that I can assign it to you? Thank you.
Seeing #7035, it looks like @TaoChenOSU is also working on an implementation of function calling inside the connector. If we both implement shared function-calling code, it will probably result in merge conflicts.
I think I will close this and submit a new, lightweight PR for the Mistral connector that is easy to understand.
We can then open a new issue if we want to merge the function-calling flow of all connectors.