LLMFarm
Can the model get information from the outside?
How can the model get information from the internet or some other source, like a PDF on the iPhone?
Can you elaborate on the idea and with an example?
Example/use case:
- reading a PDF and giving a summary of the paper
- searching the internet for some information
- providing the system with some data and letting it do some analysis
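For the PDF use case, the text extraction part is already possible on iOS with Apple's PDFKit. A minimal sketch (the `summarize` function is hypothetical, standing in for whatever prompt/inference API LLMFarm would expose):

```swift
import PDFKit

// Extract the full text of a PDF on-device using PDFKit.
func extractText(from url: URL) -> String? {
    guard let document = PDFDocument(url: url) else { return nil }
    // PDFDocument.string concatenates the text of all pages.
    return document.string
}

// Hypothetical glue: feed the extracted text into the model as a prompt.
// `runModel` is a placeholder for LLMFarm's actual inference call.
func summarize(pdfURL: URL, runModel: (String) -> String) -> String? {
    guard let text = extractText(from: pdfURL) else { return nil }
    let prompt = "Summarize the following paper:\n\n\(text)"
    return runModel(prompt)
}
```

Long papers would likely need to be chunked to fit the model's context window, but the on-device extraction step itself needs no network access.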
Here are some projects with similar functionality, but only on the Mac:
https://github.com/imartinez/privateGPT
https://openinterpreter.com
It would be really cool to have similar capabilities on iPhone or iPad.
I think I can add such features after I add support for multimodal models.