Open-Assistant
Update inference documentation
I recently tried to set up the local inference server to test different NLG models and found that the installation documentation was missing steps or contained incomplete information.
We should update the documentation so that the inference server can be set up and run as a standalone service, without requiring external context or additional documentation.