[Bug]: CLI exception when OpenAI key is NOT configured
What happened?
TLDR:
- From a fresh install, only configured a key for Anthropic
- On running a simple test prompt via the CLI client, application throws an exception
- Root cause appears to be the default of GPT-4 when no OpenAI key is configured
client.cli.utils.Standalone:streamMessage doesn't check whether the given model is usable
The constructor for Standalone sets the model to 'gpt-4-turbo-preview' if no model is passed via arguments and no default is configured. From the doc text, it sounds like an OpenAI key was a hard requirement, but now it's possible to use fabric with alternative providers.
Given that the class already fetches the available models, it should check whether self.model is among them and raise an exception with a clear message otherwise.
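For illustration, a minimal sketch of that check; the resolve_model helper and the DEFAULT_MODEL environment variable here are illustrative names, not fabric's actual API:

```python
import os

def resolve_model(args, available_models):
    """Pick the model from args/env and fail fast if it isn't actually usable."""
    # Illustrative fallback chain: CLI argument, then env default, then the
    # hard-coded OpenAI model the constructor currently assumes.
    model = getattr(args, "model", None) or os.environ.get(
        "DEFAULT_MODEL", "gpt-4-turbo-preview"
    )
    if model not in available_models:
        raise ValueError(
            f"Model '{model}' is not available with the configured provider keys. "
            f"Available models: {', '.join(sorted(available_models))}"
        )
    return model
```

With something like this, a fresh install with only an Anthropic key would fail with an actionable message instead of the 'NoneType' error below.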
P.S. It's also difficult to troubleshoot issues like this non-invasively, since exceptions are printed to stdout without any context or configurable verbosity. It would be better to use a logging library that can dump stack traces on request. Structured logging would also be nice to have.
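As a rough sketch of what that could look like with Python's standard logging module (configure_logging and run_stream are hypothetical wrappers, not existing fabric functions):

```python
import logging

logger = logging.getLogger("fabric.cli")

def configure_logging(verbose: bool = False) -> None:
    # Could be wired to a --verbose flag or an environment variable.
    logging.basicConfig(
        level=logging.DEBUG if verbose else logging.WARNING,
        format="%(asctime)s %(name)s %(levelname)s %(message)s",
    )

def run_stream(standalone, text: str) -> None:
    try:
        standalone.streamMessage(text)
    except Exception:
        # exc_info=True attaches the full traceback to the log record,
        # instead of printing a bare error message to stdout.
        logger.error("streamMessage failed", exc_info=True)
        raise
```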
Version check
- [X] Yes I was.
Relevant log output
$ echo "The difficulties of prompt engineering from scratch"| fabric -s -p write_micro_essay
Error: 'NoneType' object has no attribute 'chat'
'NoneType' object has no attribute 'chat'
Relevant screenshots (optional)
No response
#686 Adds structured logging with better stack traces, but doesn't address the default behavior.
@patonw I was getting the exact same errors. This could help: https://github.com/danielmiessler/fabric/issues/373#issuecomment-2087998667 or https://github.com/danielmiessler/fabric/issues/373#issuecomment-2219028988