
Question? How can this be applied to a basic GPT transformer text generator?

Open creativeautomaton opened this issue 2 years ago • 4 comments

Do you know how to apply this to a GPT-like model? I assumed a from-scratch PyTorch model.

creativeautomaton avatar Feb 15 '23 17:02 creativeautomaton

FF is very limited right now, but I think it is something one can work on.
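For anyone landing on this thread, a minimal single-layer sketch of the Forward-Forward idea in PyTorch may help frame what applying it to a GPT-style model would require. This is an illustrative sketch loosely following Hinton's paper, not code from this repo; the class name, threshold, and learning rate below are assumptions chosen for demonstration:

```python
import torch
import torch.nn as nn

class FFLayer(nn.Module):
    """One locally trained Forward-Forward layer (illustrative sketch).

    "Goodness" is the sum of squared ReLU activations. The layer is
    trained to push goodness above a threshold for positive data and
    below it for negative data, with no backprop between layers.
    """
    def __init__(self, in_dim, out_dim, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.threshold = threshold  # illustrative value
        self.opt = torch.optim.Adam(self.linear.parameters(), lr=lr)

    def forward(self, x):
        # Normalize so only the direction of the input carries
        # information forward, not its goodness from the layer below.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return torch.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        g_pos = self.forward(x_pos).pow(2).sum(dim=1)
        g_neg = self.forward(x_neg).pow(2).sum(dim=1)
        # Softplus loss: high goodness for positives, low for negatives.
        loss = torch.log1p(torch.exp(torch.cat([
            self.threshold - g_pos,
            g_neg - self.threshold,
        ]))).mean()
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        # Detach outputs so training stays purely local to this layer.
        return (self.forward(x_pos).detach(),
                self.forward(x_neg).detach(),
                loss.item())
```

Layers trained this way are stacked by feeding the detached outputs of one layer as the positive/negative inputs of the next. The open question in this thread is how to define good positive/negative pairs for autoregressive text, which is where the GPT case gets hard.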

Allaye avatar Feb 26 '24 13:02 Allaye

It would be very beneficial for fine-tuning, especially for edge-device models like those in the STM32 model zoo, i.e. TensorFlow Lite and ONNX models for microcontrollers.

Last year I was interested in this and wrote to [email protected], a postgrad who published a few sample repos using Forward-Forward based on the examples in Geoffrey Hinton's paper. He had working MNIST code, but Hadia's work was a bit too complex for me, I think, and I could not get a functioning example using Forward-Forward in a GPT-type NN.

But this would be a great subject of research: it could allow for super-low-resource ML and open up analog computation not just for inference but maybe even for training.

I see it as a kind of digital/analog, biology-inspired way to train machines. But I digress.

creativeautomaton avatar Feb 26 '24 18:02 creativeautomaton

Agreed, the potential of FF is immense, spanning from edge devices to controllers. My team and I are currently researching FF with a focus on computer vision. Could you please direct me to Hadia's work so that I can take a look?

Allaye avatar Feb 26 '24 18:02 Allaye

Yes, sorry. Here is the repo I experimented with: https://github.com/ghadialhajj/FF_unsupervised

creativeautomaton avatar Feb 26 '24 18:02 creativeautomaton