ProviderInitError when using Perplexity model in OpenCode
Description
When selecting a Perplexity model (in this case Perplexity Sonar) and sending any message, a full-screen error appears. See the screenshot below.
OpenCode version
0.15.28
Steps to reproduce
- Configure the Perplexity API key with "opencode auth login"
- Start opencode
- Select Perplexity Sonar as the model
- Type anything in the chat (see the non-interactive reproduction sketched below).
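A quicker, non-interactive reproduction from the command line, assuming the --model flag mentioned later in this thread accepts provider/model IDs such as perplexity/sonar:

opencode auth login
opencode run hello --print-logs --model perplexity/sonar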
Screenshot and/or share link
Operating System
macOS 26.0.1
Terminal
Ghostty
This issue might be a duplicate of existing issues. Please check:
- #1635: Similar ProviderInitError with full-screen display when using a different provider (Anthropic)
- #2950: Identical ProviderInitError on Apple Silicon Mac with same macOS version (26.0.1), also occurs immediately when sending first message
- #352: Provider initialization issues with model selection problems
Feel free to ignore if none of these address your specific case.
Maybe the npm package listed in models.dev is incorrect for this model/provider.
I got the same error; here are the print-logs. I am on Windows 11.
opencode run hello --print-logs
INFO 2025-10-31T10:19:34 +118ms service=default version=0.15.29 args=["run","hello","--print-logs"] opencode
INFO 2025-10-31T10:19:34 +0ms service=project directory=C:\Users\kchu fromDirectory
INFO 2025-10-31T10:19:34 +25ms service=config path=C:\Users\kchu\.config\opencode\config.json loading
INFO 2025-10-31T10:19:34 +0ms service=config path=C:\Users\kchu\.config\opencode\opencode.json loading
INFO 2025-10-31T10:19:34 +0ms service=config path=C:\Users\kchu\.config\opencode\opencode.jsonc loading
INFO 2025-10-31T10:19:34 +23ms service=bun cmd=["C:\\Users\\kchu\\AppData\\Local\\Microsoft\\WinGet\\Packages\\SST.opencode_Microsoft.Winget.Source_8wekyb3d8bbwe\\opencode.exe","add","@opencode-ai/[email protected]","--exact"] cwd=C:\Users\kchu\.config\opencode running
INFO 2025-10-31T10:19:34 +10ms service=plugin [email protected] loading plugin
INFO 2025-10-31T10:19:34 +47ms service=plugin [email protected] loading plugin
INFO 2025-10-31T10:19:36 +1161ms service=bus type=* subscribing
INFO 2025-10-31T10:19:36 +1ms service=bus type=session.updated subscribing
INFO 2025-10-31T10:19:36 +0ms service=bus type=message.updated subscribing
INFO 2025-10-31T10:19:36 +0ms service=bus type=message.part.updated subscribing
INFO 2025-10-31T10:19:36 +0ms service=format init
INFO 2025-10-31T10:19:36 +0ms service=bus type=file.edited subscribing
INFO 2025-10-31T10:19:36 +4ms service=session id=ses_5c6385d25ffeIAl6KBVgJEfHaW version=0.15.29 projectID=global directory=C:\Users\kchu title=New session - 2025-10-31T10:19:36.027Z time={"created":1761905976027,"updated":1761905976027} created
INFO 2025-10-31T10:19:36 +0ms service=lsp serverIds=deno, typescript, vue, eslint, gopls, ruby-lsp, pyright, elixir-ls, zls, csharp, rust, clangd, svelte, astro, jdtls, lua-ls enabled LSP servers
INFO 2025-10-31T10:19:36 +46ms service=bun code=0 stdout=bun add v1.3.0 (b0a6feca)
installed @opencode-ai/[email protected]
[96.00ms] done
stderr=Saved lockfile
done
INFO 2025-10-31T10:19:36 +3ms service=bus type=session.created publishing
INFO 2025-10-31T10:19:36 +0ms service=bus type=session.updated publishing
INFO 2025-10-31T10:19:36 +18ms service=bus type=message.part.updated subscribing
INFO 2025-10-31T10:19:36 +0ms service=bus type=session.error subscribing
INFO 2025-10-31T10:19:36 +3ms service=session.prompt session=ses_5c6385d25ffeIAl6KBVgJEfHaW prompt
INFO 2025-10-31T10:19:36 +16ms service=bus type=message.updated publishing
INFO 2025-10-31T10:19:36 +14ms service=bus type=message.part.updated publishing
INFO 2025-10-31T10:19:36 +20ms service=bus type=session.updated publishing
INFO 2025-10-31T10:19:36 +5ms service=models.dev file={} refreshing
INFO 2025-10-31T10:19:36 +40ms service=provider init
INFO 2025-10-31T10:19:36 +55ms service=provider providerID=perplexity found
INFO 2025-10-31T10:19:36 +0ms service=provider providerID=opencode found
INFO 2025-10-31T10:19:36 +0ms service=provider providerID=perplexity modelID=sonar getModel
INFO 2025-10-31T10:19:36 +2ms service=provider status=started providerID=perplexity getSDK
INFO 2025-10-31T10:19:37 +1281ms service=provider status=completed duration=1281 providerID=perplexity getSDK
ERROR 2025-10-31T10:19:37 +11ms service=default providerID=perplexity name=ProviderInitError message=ProviderInitError cause=TypeError: fn2 is not a function. (In 'fn2({
name: provider.id,
...options2
})', 'fn2' is undefined) stack=ProviderInitError: ProviderInitError
at <anonymous> (src/provider/provider.ts:403:7)
at processTicksAndRejections (native:7:39) fatal
Error: Unexpected error, check log file at for more details
ProviderInitError: ProviderInitError
data: {
providerID: "perplexity",
},
at src/provider/provider.ts:403:7
at processTicksAndRejections (native:7:39)
433 |
434 | const combined = signals.length > 1 ? AbortSignal.any(signals) : signals[0]
435 |
436 | return fetch(input, {
437 | ...rest,
438 | signal: combined,
^
TypeError: fn2 is not a function. (In 'fn2({
name: provider.id,
...options2
})', 'fn2' is undefined)
at <anonymous> (src/provider/provider.ts:438:21)
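For context, the "fn2 is not a function" failure is consistent with the npm-package hypothesis above. The TypeScript sketch below is illustrative only, not opencode's actual implementation; the ProviderInfo shape and the factory export name are assumptions based on the log and stack trace. It shows the general pattern of importing a provider SDK from the npm package name listed in models.dev and calling its exported factory; if the listed package is wrong or missing, the factory resolves to undefined and calling it throws exactly this kind of TypeError.

// Illustrative sketch only (not opencode's source). The npm package name for a
// provider comes from models.dev; it is imported from the local cache and its
// exported factory is called to create the provider instance.
type ProviderInfo = { id: string; npm: string } // shape assumed from the logs above

async function initProvider(provider: ProviderInfo, options: Record<string, unknown>) {
  // e.g. "@ai-sdk/anthropic", resolved under ~/.cache/opencode/node_modules
  const mod = await import(provider.npm)

  // The factory export differs per SDK package; "createProvider" is a placeholder.
  // If models.dev lists the wrong package, or the package is missing from the
  // cache (as in the NixOS report further down), this resolves to undefined.
  const fn = (mod as any).createProvider ?? (mod as any).default

  if (typeof fn !== "function") {
    // Without this guard, calling fn(...) throws "fn is not a function",
    // which is what the ProviderInitError above wraps.
    throw new Error(`ProviderInitError: no provider factory exported by ${provider.npm}`)
  }

  // Mirrors the call site in the stack trace: fn({ name: provider.id, ...options })
  return fn({ name: provider.id, ...options })
}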
Same issue
I have the same problem with Anthropic.
@JIbald that definitely shouldn't happen. Can you run: opencode run hello --print-logs
Maybe add the --model anthropic/claude-sonnet-4-5 flag, depending on your default / last used model.
I am also facing the same issue, but only with Perplexity.
@rekram1-node thanks for the fast reply. I just found out that it might be due to dynamic linking, since I'm on NixOS. A related issue can be found here. The logs seem to support this:
...
INFO 2025-11-03T20:11:39 +1ms service=bus type=storage.write publishing
INFO 2025-11-03T20:11:39 +0ms service=bus type=message.updated publishing
INFO 2025-11-03T20:11:39 +0ms service=bus type=storage.write publishing
INFO 2025-11-03T20:11:39 +0ms service=bus type=message.part.updated publishing
INFO 2025-11-03T20:11:39 +1ms service=bus type=storage.write publishing
INFO 2025-11-03T20:11:39 +0ms service=bus type=session.updated publishing
INFO 2025-11-03T20:11:39 +0ms service=provider providerID=anthropic modelID=claude-sonnet-4-5-20250929 getModel
INFO 2025-11-03T20:11:39 +0ms service=provider status=started providerID=anthropic getSDK
INFO 2025-11-03T20:11:39 +1ms service=provider status=completed duration=1 providerID=anthropic getSDK
INFO 2025-11-03T20:11:39 +0ms service=app name=lsp shutdown
INFO 2025-11-03T20:11:39 +0ms service=app name=session shutdown
ERROR 2025-11-03T20:11:39 +5ms service=default providerID=anthropic name=ProviderInitError message=ProviderInitError cause=ResolveMessage: Cannot find module '/home/user/.cache/opencode/node_modules/@ai-sdk/anthropic' from '/$bunfs/root/opencode' fatal
Error: Unexpected error, check log file at for more details
Ah yeah, that looks like it could be the issue for you.
I will note that Perplexity may be a separate issue. I don't think the linked provider is able to work in opencode currently; I'll need to take a look.
Same issue.
Same on Mac. Installed via bun.
Same here. Hopefully it gets fixed soon, because I use this model for a custom search sub-agent :(
Okay, this should be fixed. You may want to run: rm ~/.cache/opencode/models.json, and then if you restart opencode it should work.
Thank you. Thank you very much for fixing it.
Works for me!
I just don't understand why it's only showing these models:
perplexity/sonar-reasoning, perplexity/sonar, perplexity/sonar-pro, perplexity/sonar-reasoning-pro
On the web service I have:
Sonar, GPT-5, Claude Sonnet 4.5, Gemini 2.5 Pro, Grok 4, Claude Opus 4.1 (Max), o3-pro (Max)
Is this another issue?
Thank you so much! It works now :D
@kamueone we can add the other models; there is an API that tracks them: https://github.com/sst/models.dev
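As an aside, here is a minimal sketch for checking which Perplexity models that API currently tracks. The endpoint URL (https://models.dev/api.json) and the JSON shape are assumptions based on how the models.dev data appears to be consumed in the logs above; check the linked repo for the authoritative schema.

// Hypothetical check against the models.dev API; endpoint and shape are assumed.
const res = await fetch("https://models.dev/api.json")
const data = (await res.json()) as Record<string, { models?: Record<string, unknown> }>

// Providers appear to be keyed by id, e.g. "perplexity"
const perplexity = data["perplexity"]
console.log(Object.keys(perplexity?.models ?? {}))
// expected to print something like: ["sonar", "sonar-pro", "sonar-reasoning", "sonar-reasoning-pro"]

If a model shown in the Perplexity web UI is missing there, it would presumably need to be added through a contribution to that repo.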
Thanks, I will try.