Integrate Replicate with Hugging Face Transformers
Similar to PyTorch Lightning, Hugging Face abstracts away the training loop and provides a Trainer API with callbacks.
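For reference, the integration could be a small `TrainerCallback` subclass. Here's a minimal sketch, assuming the keepsake package's `init()`/`checkpoint()`/`stop()` API; the `KeepsakeCallback` name and the choice of hooks are illustrative, not an actual integration:

```python
import keepsake
from transformers import TrainerCallback


class KeepsakeCallback(TrainerCallback):
    """Illustrative sketch: report Trainer progress to a Keepsake experiment."""

    def on_train_begin(self, args, state, control, **kwargs):
        # Record the training hyperparameters once, when training starts.
        # (args.to_dict() may include values that need sanitizing first.)
        self.experiment = keepsake.init(params=args.to_dict())

    def on_log(self, args, state, control, logs=None, **kwargs):
        # Save a checkpoint with the latest metrics each time the Trainer
        # logs, i.e. every `logging_steps` steps.
        self.experiment.checkpoint(step=state.global_step, metrics=logs or {})

    def on_train_end(self, args, state, control, **kwargs):
        # Mark the experiment as stopped so uploads finish cleanly.
        self.experiment.stop()
```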
I will probably be able to take this on in the next few weeks.
How are you thinking about keeping callbacks in this repo (for both this one and the previous ones, Keras and PyTorch Lightning) vs. contributing them to those libraries' repos, like TensorBoard and W&B do? Keep them in here until a stable v1 release?
Nice, great idea! 🙌
We're going to contribute the PyTorch Lightning integration back to PyTorch Lightning. We included it in our repository as a shortcut, because the integration needed changes to their callback API.
Looks like Hugging Face also includes callbacks for services in their library, so it probably makes sense for this one to live there. I wonder whether we should include it now or wait until our API is a bit more stable. I can imagine us being able to build in backwards compatibility if we ever change something.
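It would slot in the same way as the service callbacks they already ship: passed to the Trainer's `callbacks` argument. A hypothetical usage sketch, where `model` and `train_dataset` stand in for a real model and tokenized dataset, and `KeepsakeCallback` is the illustrative class from the snippet above:

```python
from transformers import Trainer, TrainingArguments

trainer = Trainer(
    model=model,                      # placeholder: any transformers model
    args=TrainingArguments(output_dir="out", logging_steps=50),
    train_dataset=train_dataset,      # placeholder: a tokenized dataset
    callbacks=[KeepsakeCallback()],   # hypothetical callback from the sketch above
)
trainer.train()
```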
As a place to develop it, I wonder if it makes sense to create a separate package? `replicate-huggingface`, or whatever. The intention would be to get it into Hugging Face, and the advantage of it being separate is that it isn't something we have to remove from Replicate at some point, which is always messy. We can document it or link to it from https://replicate.ai so it gets eyeballs on it.
Also open to including it in Replicate proper, but a separate package might be a neater solution if we intend to contribute it upstream to Hugging Face at some point...
@srush Do you have any opinions about this, or know someone at Hugging Face who might? :)
I will check!