feat: add support for OpenAI
Thanks for the feedback! I'm working on it.
@fmaclen Just to be sure: should we get rid of all the 'connected' or 'disconnected' logic? For example, in the settings page, should the status badge be removed?
> should we get rid of all the 'connected' or 'disconnected' logic?

@GregoMac1 let's keep that functionality only in the Settings view; it's still useful to know if we're correctly connected to an Ollama server.
Remaining tasks:
- [x] Match types between both APIs (that should fix the linting errors).
- [x] Check if the OpenAI strategy works correctly with a real API key.
- [x] Add missing translations for `es`, `ja`, and `tr`.
- [x] Fix existing tests.
- [x] Add new tests.
The only bug I found was that after choosing an OpenAI model (i.e. gpt-3.5-turbo) and submitting a prompt, I was expecting it to show up in the Recently used models list. It did show up later after I left the Sessions page and returned to it.
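One way the recently-used list could update immediately is to prepend the chosen model when the prompt is submitted, rather than only when the Sessions page remounts. A minimal sketch; `updateRecentlyUsed` and the `ModelEntry` shape are hypothetical names for illustration, not Hollama's actual code:

```typescript
interface ModelEntry {
	name: string;
	api: 'ollama' | 'openai';
}

// Move the just-used model to the front of the recently-used list,
// removing any earlier occurrence so the list never holds duplicates.
function updateRecentlyUsed(list: ModelEntry[], used: ModelEntry): ModelEntry[] {
	const rest = list.filter((m) => m.name !== used.name);
	return [used, ...rest];
}
```

Calling something like this synchronously in the submit handler, before awaiting the completion, would let the list reflect the new model without leaving and re-entering the page.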
@GregoMac1 I'm still seeing this issue.
Here's a list of test cases:
- Fetching data from OpenAI with a correct API Key
- Trying to fetch data with an incorrect API key
- Trying to fetch data with a network connection error
- Can't send any fetch requests without an `apiKey` or `baseUrl` set
- Check the sorting/order of the models list is correct
- Once an OpenAI model is used it gets added to the "recently used" list
- An OpenAI model can be used in a session, and the model is saved to `localStorage` for that specific session
- In `FieldModelSelect`:
  - Check only the `gpt` filtered models are available
  - Check that Ollama models have an `ollama` badge
  - Check that OpenAI models have an `openai` badge
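For the sorting/order check above, a comparator sketch. Alphabetical ordering by model name is an assumption for illustration; the exact ordering rule Hollama uses isn't stated in this thread:

```typescript
// Return a new array of model names sorted alphabetically,
// leaving the input list untouched.
function sortModels(models: string[]): string[] {
	return [...models].sort((a, b) => a.localeCompare(b));
}
```

Under this rule, a live Ollama model like `codegemma:latest` would sort ahead of the `gpt-*` models, which matches the ordering seen in the failing trace further down.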
> The only bug I found was that after choosing an OpenAI model (i.e. gpt-3.5-turbo) and submitting a prompt, I was expecting it to show up in the Recently used models list. It did show up later after I left the Sessions page and returned to it.
@fmaclen This should be fixed now!
I've fixed the remaining bugs. I'm going to continue with the tests now.
Deploying hollama with Cloudflare Pages

| | |
| --- | --- |
| Latest commit: | `330658e` |
| Status: | ✅ Deploy successful! |
| Preview URL: | https://18201545.hollama.pages.dev |
| Branch Preview URL: | https://94-support-openai-api-format.hollama.pages.dev |
@GregoMac1 I'm seeing these 2 tests failing:

- `openai.test.ts:56:2 › OpenAI Integration › models list is sorted correctly`
- `session.test.ts:720:2 › Session › handles errors when fetching models`
I'm not entirely sure why this one is failing, ~~looks like a race condition~~ (it wasn't, here's why it fails): trace.zip
```
1) openai.test.ts:56:2 › OpenAI Integration › models list is sorted correctly ────────────

Test timeout of 5000ms exceeded.

Error: expect(locator).toHaveText(expected)

Locator: locator('div[role="option"]').first()
Expected string: "gpt-3.5-turbo"
Received string: " codegemma:latest 9B "
Call log:
  - expect.toHaveText with timeout 5000ms
  - waiting for locator('div[role="option"]').first()
  - locator resolved to <div role="option" id="yfbQhZM0We" aria-selected="fal…>…</div>
  - unexpected value " codegemma:latest 9B "
  - locator resolved to <div role="option" id="yfbQhZM0We" aria-selected="fal…>…</div>
  - unexpected value " codegemma:latest 9B "

  61 |
  62 |   const modelOptions = page.locator('div[role="option"]');
> 63 |   await expect(modelOptions.nth(0)).toHaveText('gpt-3.5-turbo');
     |                                     ^
  64 |   await expect(modelOptions.nth(1)).toHaveText('gpt-4');
  65 | });
  66 |

    at /Users/winston/projects/hollama/tests/openai.test.ts:63:37
```
This next test actually broke with the changes we made to Sessions: trace.zip
I suspect that when you were running the tests locally you didn't have an Ollama server running at the same time; if you do, you'll see the error I'm getting in the flash alert.
I think the issue is that we were mocking `**/api/tags` before, and now we need to mock `**/api/chat`.
```
2) session.test.ts:720:2 › Session › handles errors when fetching models ─────────────────

Test timeout of 5000ms exceeded.

Error: expect(locator).toBeVisible()

Locator: locator('ol[data-sonner-toaster] li').filter({ hasText: 'Can\'t connect to Ollama server' })
Expected: visible
Received: hidden
Call log:
  - expect.toBeVisible with timeout 5000ms
  - waiting for locator('ol[data-sonner-toaster] li').filter({ hasText: 'Can\'t connect to Ollama server' })

  741 |   await expect(
  742 |     page.locator('ol[data-sonner-toaster] li', { hasText: "Can't connect to Ollama server" })
> 743 |   ).toBeVisible();
      |     ^
  744 | });
  745 |
  746 | test('can navigate out of session during completion', async ({ page }) => {

    at /Users/winston/projects/hollama/tests/session.test.ts:743:5
```
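The mocking mismatch comes down to how glob-based route interception (e.g. Playwright's `page.route()`) works: a pattern only intercepts URLs it matches, so a `**/api/tags` mock never catches requests to `/api/chat`, and those hit the real network. The toy matcher below is an approximation for illustration, not Playwright's actual implementation:

```typescript
// Toy glob matcher: treat `**` as "any characters" and escape the rest,
// then test the URL against the resulting anchored RegExp.
function matchesGlob(pattern: string, url: string): boolean {
	const escaped = pattern
		.split('**')
		.map((p) => p.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'))
		.join('.*');
	return new RegExp('^' + escaped + '$').test(url);
}
```

With this model, `**/api/tags` matches a tags request but not a chat request, which is why the chat call would need its own mock.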
:tada: This PR is included in version 0.20.0 :tada:
The release is available on GitHub release
Your semantic-release bot :package::rocket: