Yoshiki Miura
The likely cause was an inconsistency in the generation state. Disabling the `new` button during generation has probably fixed it: #223
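The fix in #223 boils down to preventing a second generation from starting while one is already in flight. A minimal sketch of that guard, with hypothetical names (`createChatController`, `isGenerating`) that are not taken from the actual PR:

```typescript
// Hypothetical guard: ignore "new" requests while a generation is in flight,
// so the generation state cannot become inconsistent.
function createChatController() {
  let isGenerating = false;
  return {
    // Returns false when a generation is already running (button disabled).
    startGeneration(): boolean {
      if (isGenerating) return false;
      isGenerating = true;
      return true;
    },
    finishGeneration(): void {
      isGenerating = false;
    },
  };
}
```

In the UI, the same flag would also drive the `disabled` prop of the `new` button, so the user cannot trigger the race in the first place.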
Is there an update? Today, there was also a price change for gemini-1.5-flash. https://ai.google.dev/pricing
It's similar to this comment: https://github.com/miurla/morphic/issues/318#issuecomment-2305979582 The Ollama provider is unstable. Try a different model, or switch to a different provider.
Since the changes are included in #320, this PR will be closed.
The deployment failed due to changes in the `bun.lockb` file. The `--frozen-lockfile` option expects no changes, but modifications were detected. To resolve this:

1. Run `bun install` locally to update...
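A sketch of the local commands that would resolve this, assuming the lockfile drift came from an edited `package.json` (the commit message is illustrative, not prescribed):

```shell
# Regenerate bun.lockb so it matches package.json,
# then commit it so CI's --frozen-lockfile check passes.
bun install
git add bun.lockb
git commit -m "chore: update bun.lockb"
git push
```

With the updated lockfile committed, `bun install --frozen-lockfile` in the deployment should find no pending changes and succeed.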
Have you checked the Vercel Runtime Logs? https://vercel.com/docs/observability/runtime-logs#view-runtime-logs
@irosyadi Thank you for sharing. Those error messages will also be output to Vercel's error logs. First, please check both the client and server error logs.
I agree. However, it depends on the team's resources, so it’s undecided at this stage.
Thank you for reporting this. We will check and improve this mode.
@ZainGithub12 > (Quality) GPT mode used. Could you please check whether the issue can still be reproduced in this mode, now that it has been reverted to the previous version?