Update Perplexity Models in Settings
Good day,
It seems that the Perplexity settings in `mods --settings` are not up to date with the current Perplexity API. According to their docs, Perplexity has switched to Llama 3.1 models, but mods' settings still appear to be configured for Llama 3.
For instance, the configured context window is only 8K tokens, rather than the 128K context supported by Llama 3.1.
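For illustration, the Perplexity block in the mods YAML settings might be updated along these lines. This is only a sketch: the exact model names are taken from Perplexity's published Llama 3.1 identifiers at the time of writing, and the `max-input-chars` value is a rough character-count approximation of a 128K-token context, not an official figure.

```yaml
# Hypothetical sketch of an updated Perplexity section in mods' settings.
# Model names and limits should be checked against Perplexity's current docs.
apis:
  perplexity:
    base-url: https://api.perplexity.ai
    models:
      llama-3.1-sonar-small-128k-online:
        aliases: ["sonar-small"]
        max-input-chars: 400000   # assumption: ~128K tokens in characters
      llama-3.1-sonar-large-128k-online:
        aliases: ["sonar-large"]
        max-input-chars: 400000   # assumption: ~128K tokens in characters
```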
Please let me know if we can resolve this or if I'm misunderstanding something about how the settings are configured.