Szymon Ożóg
I would like to take on this issue. My plan is to provide an ipynb notebook containing a comparison of different models that includes: 1. Inference speed and memory usage 2....
The notebook and a readme were posted in this PR: https://github.com/LAION-AI/Open-Assistant/pull/176 If anyone has any feedback or ideas on how to expand this work, feel free to contact me.
Just did a fresh copy of the repository and ran pre-commit without touching anything - same problem. Placing full command + output:

```
G:\DL\OA\Open-Assistant>pre-commit run --all-files
trim trailing whitespace.................................................Passed
check...
```
`which` fails, but here is the output from `where`:

```
G:\DL\Open-Assistant>where npm
G:\Program Files\nodejs\npm
G:\Program Files\nodejs\npm.cmd
```
```
which npm
/mnt/g/Program Files/nodejs/npm
```
@Nil-Andreu Hi, help is always appreciated! If I create a function for accessing the Hugging Face API from 1., will you be able to write the REST client that handles the requests...
@Nil-Andreu Amazing! I've made a PR with the function for getting the Detoxify classification (https://github.com/LAION-AI/Open-Assistant/pull/362). You can start working on it as soon as it gets accepted.
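For context, a minimal sketch of what such a function can look like when it goes through the public Hugging Face Inference API. The model id `unitary/toxic-bert` (a Detoxify-style toxicity classifier), the endpoint URL, and the function names are assumptions for illustration, not taken from the actual PR:

```python
# Sketch: querying the Hugging Face Inference API for a toxicity
# classification. Stdlib only; the model id and endpoint are assumptions.
import json
import urllib.request

API_URL = "https://api-inference.huggingface.co/models/unitary/toxic-bert"


def build_request(text: str, api_token: str) -> urllib.request.Request:
    """Build the POST request for classifying one piece of text."""
    payload = json.dumps({"inputs": text}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,  # presence of data makes this a POST
        headers={
            "Authorization": f"Bearer {api_token}",
            "Content-Type": "application/json",
        },
    )


def classify(text: str, api_token: str) -> list:
    """Send the request and decode the JSON response (label/score pairs)."""
    with urllib.request.urlopen(build_request(text, api_token)) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Splitting request construction from the network call keeps the request shape testable without an API token.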
Regarding 2.: They have limited free access that runs on the CPU and is rate limited:

```
The free Inference API may be rate limited for heavy use cases....
```
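Given that rate limit, callers on the free tier would likely want retries with backoff. A minimal sketch, assuming the client surfaces rate limiting as an exception; all names here are hypothetical, not part of the project:

```python
# Sketch: exponential backoff for a rate-limited API call.
# RateLimitError and the retry parameters are illustrative assumptions.
import time


class RateLimitError(Exception):
    """Raised when the API answers with HTTP 429 (rate limited)."""


def with_backoff(call, max_retries=5, base_delay=1.0, sleep=time.sleep):
    """Run `call`; on RateLimitError wait base_delay * 2**attempt, then retry."""
    for attempt in range(max_retries):
        try:
            return call()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # give up after the final attempt
            sleep(base_delay * (2 ** attempt))
```

The `sleep` parameter is injected so tests can record delays instead of actually waiting.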
I would say yes, but I don't have much knowledge about our backend structure. @andreaskoepf, could you confirm?