[BUG] Docker image not running with Ollama
Is there an existing issue for this?
- [X] I have searched the existing issues
Current Behavior
I get the following error on the frontend: "Error: Bad Request Retry". In the Docker log I get:

```
▲ Next.js 14.2.5
  - Local:    http://7663c70366f7:3000
  - Network:  http://172.26.0.4:3000

✓ Starting...
✓ Ready in 64ms

m [AI_TypeValidationError]: Type validation failed: Value: {"response":"GPT-4o mini is an abbreviation for Google's Pathways API, which is a machine learning model designed to generate text based on user prompts. It is a smaller and more efficient version of the original GPT-4 model, offering faster response times and lower computational requirements while still providing high-quality text generation capabilities."}.
Error message: [
  {
    "expected": "'inquire' | 'proceed'",
    "received": "undefined",
    "code": "invalid_type",
    "path": [ "next" ],
    "message": "Required"
  }
]
    at l (/app/.next/server/chunks/859.js:118:25658)
    at p (/app/.next/server/chunks/859.js:118:25884)
    ... 3 lines matching cause stack trace ...
    at async /app/.next/server/chunks/557.js:74:2043 {
  cause: a [ZodError]: [
    {
      "expected": "'inquire' | 'proceed'",
      "received": "undefined",
      "code": "invalid_type",
      "path": [ "next" ],
      "message": "Required"
    }
  ]
      at get error [as error] (/app/.next/server/chunks/823.js:94:44194)
      at l (/app/.next/server/chunks/859.js:118:25684)
      at p (/app/.next/server/chunks/859.js:118:25884)
      at k (/app/.next/server/chunks/859.js:118:10708)
      at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
      at async j (/app/.next/server/chunks/557.js:1:35684)
      at async /app/.next/server/chunks/557.js:74:2043 {
    issues: [ [Object] ],
    addIssue: [Function (anonymous)],
    addIssues: [Function (anonymous)],
    errors: [ [Object] ]
  },
  value: {
    response: "GPT-4o mini is an abbreviation for Google's Pathways API, which is a machine learning model designed to generate text based on user prompts. It is a smaller and more efficient version of the original GPT-4 model, offering faster response times and lower computational requirements while still providing high-quality text generation capabilities."
  }
}
```
This error has been there in previous version as well. I was hoping it was solved by now.
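For context on what the trace above means, here is a minimal TypeScript sketch (function and variable names are hypothetical, not morphic's actual code) of why the validation fails: the schema requires a `next` field that is either `'inquire'` or `'proceed'`, but the Ollama model returned only a free-text `response` field, so `next` is `undefined` and validation throws.

```typescript
// Hypothetical stand-in for the schema check that produces the ZodError:
// `next` must be the literal 'inquire' or 'proceed'.
type NextAction = "inquire" | "proceed";

function validateNext(value: Record<string, unknown>): NextAction {
  const next = value["next"];
  if (next !== "inquire" && next !== "proceed") {
    // Mirrors the log: expected "'inquire' | 'proceed'", received "undefined"
    throw new Error(
      `invalid_type at path "next": expected 'inquire' | 'proceed', received ${typeof next}`
    );
  }
  return next;
}

// The shape Ollama actually produced -- a free-text reply with no `next` key:
const ollamaOutput = {
  response: "GPT-4o mini is an abbreviation for Google's Pathways API...",
};

try {
  validateNext(ollamaOutput);
} catch (e) {
  console.log((e as Error).message); // prints the invalid_type message
}
```

Stronger models reliably emit the structured `{ "next": ... }` object the schema expects; a model that answers in plain prose fails this check every time, which matches the behavior reported here.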
Expected Behavior
The query should return results instead of an error.
Steps To Reproduce
I run the shenlw/morphic:latest image, built 23 hours ago.
Environment
- OS: Linux Mint
- Browser: Chromium
Anything else?
No response
Hi, what do you mean by "Docker image not running with Ollama"? Can you clarify? Are you using the docker-compose.yml in the repo? I also see you are listening on Local: http://7663c70366f7:3000. What is 7663c70366f7? You want that to be localhost. Also, how are you accessing the frontend?
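For anyone hitting the same confusion: `7663c70366f7` is just the container's auto-generated hostname, which Next.js prints as its "Local" address. With the container port published, the frontend is reachable from the host at http://localhost:3000. A minimal sketch of the relevant compose fragment (the service name and contents are assumptions, not the repo's actual docker-compose.yml):

```yaml
# hypothetical fragment -- check the repo's docker-compose.yml for the real service definition
services:
  morphic:
    image: shenlw/morphic:latest
    ports:
      - "3000:3000"  # publish container port 3000 to the host -> http://localhost:3000
```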
It's similar to this comment: https://github.com/miurla/morphic/issues/318#issuecomment-2305979582
The Ollama provider is unstable. Try a different model, or try a different provider.