gpt4all
Suggestion: Parameter to return a string as an answer to the terminal rather than entering chat mode
Not an issue so much as a suggestion. The command that interacts with the language model should accept a parameter that submits a single query directly, rather than entering interactive chat mode. For instance:
cd chat;./gpt4all-lora-quantized-OSX-intel -q "What is the diameter of Jupiter?"
In this case -q "[String]" tells the program that a direct answer has been requested; the model would then print:
The diameter of Jupiter is 139,822 kilometers (86,881 miles).
and then exit.
This would make it easier to build clients, such as a streamlined Python wrapper that could call: os.system('cd chat; ./gpt4all-lora-quantized-OSX-intel -q "What is the diameter of Jupiter?"') (note that the inner prompt needs different quoting from the outer string, which the original example got wrong).
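To illustrate, here is a minimal sketch of such a wrapper. It assumes the proposed -q flag exists (it does not today; that is the whole suggestion) and uses subprocess instead of os.system, which sidesteps shell quoting of the prompt entirely:

```python
import subprocess

# Path to the chat binary from this issue; adjust per platform.
BINARY = "./gpt4all-lora-quantized-OSX-intel"

def build_command(prompt):
    # Passing argv as a list means the prompt needs no shell quoting at all.
    # The -q flag is the hypothetical one-shot option proposed in this issue.
    return [BINARY, "-q", prompt]

def ask(prompt):
    """Run the model once with the given prompt and return its printed answer."""
    result = subprocess.run(
        build_command(prompt),
        cwd="chat",            # the binary lives in the chat directory
        capture_output=True,   # collect stdout instead of streaming to terminal
        text=True,
        check=True,            # raise if the binary exits with an error
    )
    return result.stdout.strip()
```

A caller would then simply do answer = ask("What is the diameter of Jupiter?"), with no cd, no nested quotes, and no interactive session to manage.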