pablogranolabar
Hi, thanks for making your work available. Are there any checkpoints available for DialogBERT to experiment with, or does it have to be trained from scratch using main.py? Can you...
Hi again, I'm curious which methods the paper's authors used to handle dialogue context in DialogBERT. Did you use context prepending of input tokens for that? And how many...
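(To make the question concrete: by "context prepending" I mean something like the sketch below, with a generic BERT tokenizer standing in. This is just my illustration, not necessarily what DialogBERT actually does.)

```python
from transformers import AutoTokenizer

# Illustrative only: join prior turns and the current utterance with [SEP],
# then truncate from the left so the most recent context is kept.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def prepend_context(history, utterance, max_len=512):
    text = tokenizer.sep_token.join(history + [utterance])
    ids = tokenizer.encode(text, add_special_tokens=True)
    return ids[-max_len:]  # keep the most recent max_len tokens

ids = prepend_context(["How are you?", "Fine, thanks."], "What about you?")
```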
Gradient checkpointing would be super helpful for training.
e.g. https://github.com/MiscellaneousStuff/openai-whisper-cpu
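Something along these lines: a minimal sketch using `torch.utils.checkpoint`, with the one-line Hugging Face helper noted in a comment (assuming the model supports it).

```python
import torch
from torch.utils.checkpoint import checkpoint

# Hugging Face models expose a one-liner, assuming the architecture supports it:
#   model.gradient_checkpointing_enable()

# Plain PyTorch: wrap an expensive sub-module so its activations are
# recomputed during backward instead of stored during forward.
class Block(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.ff = torch.nn.Sequential(
            torch.nn.Linear(512, 2048), torch.nn.GELU(), torch.nn.Linear(2048, 512)
        )

    def forward(self, x):
        # use_reentrant=False is the recommended mode in recent PyTorch versions
        return checkpoint(self.ff, x, use_reentrant=False)

x = torch.randn(8, 512, requires_grad=True)
Block()(x).sum().backward()
```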
Hi again Simon! I am trying to use neat-gym with some of the Gym toy text examples that are based on discrete action spaces, but I'm having some issues. From...
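For context, this is roughly what I'm attempting, sketched with plain neat-python and the older Gym reset/step API; the NEAT config is assumed to declare 16 inputs and 4 outputs to match FrozenLake.

```python
import gym
import numpy as np
import neat

def eval_genome(genome, config, episodes=3):
    net = neat.nn.FeedForwardNetwork.create(genome, config)
    env = gym.make("FrozenLake-v1")  # discrete observations and actions
    total = 0.0
    for _ in range(episodes):
        obs = env.reset()
        done = False
        while not done:
            one_hot = np.eye(env.observation_space.n)[obs]  # encode integer state
            action = int(np.argmax(net.activate(one_hot)))  # pick discrete action
            obs, reward, done, _ = env.step(action)
            total += reward
    return total / episodes
```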
Hi Simon, thanks for releasing this project! I am interested in implementing NEAT for function approximation, by sampling data points from two "competing" IoT sensors and then using a simple...
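Roughly, I'm imagining a fitness function like the sketch below (negative mean-squared error over sampled points); the two-sensor sampling is stubbed out with placeholder data.

```python
import neat

# Hypothetical stand-ins for readings sampled from the two competing sensors.
samples = [(0.0, 0.1), (0.25, 0.4), (0.5, 0.45), (0.75, 0.8), (1.0, 0.95)]

def eval_genomes(genomes, config):
    for genome_id, genome in genomes:
        net = neat.nn.FeedForwardNetwork.create(genome, config)
        error = 0.0
        for x, y in samples:
            pred = net.activate([x])[0]
            error += (pred - y) ** 2
        # neat-python maximizes fitness, so use negative MSE
        genome.fitness = -error / len(samples)
```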
After reading the paper, am I right that the approach here is Open MPI? If so, would CPU parallelism be an option, or would network-bound IPC become the limiting factor? I...
e.g. https://www.together.xyz/blog/releasing-v1-of-gpt-jt-powered-by-open-source-ai
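For what it's worth, the kind of CPU-side parallelism I have in mind is a scatter/gather of evaluations across worker ranks, sketched below with mpi4py and placeholder work items; whether network-bound IPC dominates would presumably depend on payload size.

```python
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Hypothetical work items (e.g., genome IDs); root splits them across ranks.
chunks = None
if rank == 0:
    work = list(range(100))
    chunks = [work[i::size] for i in range(size)]
local = comm.scatter(chunks, root=0)

# Stand-in for an expensive per-item evaluation.
local_results = [(item, item * item) for item in local]

results = comm.gather(local_results, root=0)
if rank == 0:
    flat = [pair for chunk in results for pair in chunk]
    print(len(flat), "evaluations collected")
```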
Results in ~11 GB of weights vs. 16 GB; now implemented in PyTorch (via Transformers) as load_in_8bit=True: https://huggingface.co/hivemind/gpt-j-6B-8bit
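For anyone landing here, loading in 8-bit via that flag looks roughly like this; it requires bitsandbytes, accelerate, and a CUDA GPU, and the base GPT-J checkpoint is used purely for illustration.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# load_in_8bit=True quantizes the weights on load via bitsandbytes;
# device_map="auto" places layers across available devices via accelerate.
model = AutoModelForCausalLM.from_pretrained(
    "EleutherAI/gpt-j-6B",
    device_map="auto",
    load_in_8bit=True,
)
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")

inputs = tokenizer("The meaning of life is", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```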
### Feature Use Case

Implement OpenAI Whisper ASR for SOTA STT and wakeword triggers.

### Feature Proposal

OpenAI recently released Whisper, a SOTA ASR model. Recent developments on Whisper include...
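For reference, basic transcription with the openai-whisper package looks like this; the model size and audio path are placeholders.

```python
import whisper

# Placeholder model size and audio path; larger models trade speed for accuracy.
model = whisper.load_model("base")
result = model.transcribe("audio.wav")
print(result["text"])
```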