tinytorch
A really tiny autograd engine
The newest ML framework that you probably don't need; this is really an autograd engine backed by numpy.
tinytorch.py shall always remain under 1000 lines. If not, we will revert the commit.
$$ f(x) = x^3 + x $$
import tinytorch as tt  # 👀

def f(x):
    return x**3 + x

x = tt.tensor((tt.arange(700) - 400) / 100, requires_grad=True)
z = f(x)
z.sum().backward()
print(x.grad)
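Quick sanity check: the analytic derivative is f'(x) = 3x^2 + 1, so x.grad should match it elementwise. A rough sketch continuing the snippet above (converting x.grad with np.asarray is an assumption about the tensor internals):

import numpy as np

# Analytic derivative of f(x) = x^3 + x is f'(x) = 3x^2 + 1.
xs = (np.arange(700) - 400) / 100
expected = 3 * xs**2 + 1

# Assumes x.grad converts to a numpy array via np.asarray.
assert np.allclose(np.asarray(x.grad), expected)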
What can you do with it?
Automatic differentiation, yep
import tinytorch as tt

def f(x, y):
    return x**2 + x*y + (y**3 + y) ** 0.5

x = tt.rand((5, 5), requires_grad=True)
y = tt.rand((5, 5), requires_grad=True)
z = f(x, y)
z.sum().backward()
print(x.grad)
print(y.grad)
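Same story with partial derivatives: df/dx = 2x + y and df/dy = x + (3y^2 + 1) / (2 * sqrt(y^3 + y)). A sketch checking the grads against those closed forms (the .data attribute and np.asarray conversion are assumptions about the internals):

import numpy as np

# Analytic partials of f(x, y) = x^2 + x*y + (y^3 + y)^0.5:
#   df/dx = 2x + y
#   df/dy = x + (3y^2 + 1) / (2 * sqrt(y^3 + y))
xn = np.asarray(x.data)  # .data as the raw array is an assumption
yn = np.asarray(y.data)
assert np.allclose(np.asarray(x.grad), 2 * xn + yn)
assert np.allclose(np.asarray(y.grad), xn + (3 * yn**2 + 1) / (2 * np.sqrt(yn**3 + yn)))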
Train MNIST, no problemo
python mnist.py
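mnist.py has the full script; the loop inside is the usual forward / backward / update dance. A minimal sketch of that pattern using only the API shown above (broadcasting a scalar tensor, the .data attribute for in-place updates, and resetting .grad to None are all assumptions, not tinytorch's documented API):

import tinytorch as tt

# Toy problem: recover w = 3 from data generated as y = 3 * x.
x = tt.rand((100,))
y = x * 3.0
w = tt.tensor(0.0, requires_grad=True)

lr = 0.01
for step in range(50):
    loss = ((x * w - y) ** 2).sum()  # forward pass
    loss.backward()                  # backward pass fills w.grad
    w.data = w.data - lr * w.grad    # plain SGD step; .data update is an assumption
    w.grad = None                    # reset the grad before the next step (assumption)
print(w)  # should approach 3.0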
GPT?? you bet (yes LLM fr fr)
GPU=1 python gpt.py
note: numpy is too slow to train an LLM, so you need to install jax (we just use it as a faster numpy)
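The swap works because jax.numpy mirrors most of the numpy API, so the backend can be chosen once at import time. A sketch of the pattern (the GPU env var follows the usage above; tinytorch's actual import logic may differ):

import os

# Choose the array backend once; everything downstream just uses `np`.
if os.environ.get("GPU") == "1":
    import jax.numpy as np  # jax.numpy mirrors most of the numpy API
else:
    import numpy as np

a = np.arange(10.0)
print((a ** 2).sum())  # identical code runs on either backend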
Visualization
If you want to see your computation graph, run visulize.py
requirements
pip install graphviz
sudo apt-get install -y graphviz  # IDK what to do for Windows, I use WSL
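For a rough idea of what visulize.py does under the hood, here is a micrograd-style graph walk rendered with the graphviz package (the _prev and _op attribute names are assumptions about the tensor internals, not tinytorch's actual fields):

from graphviz import Digraph

def draw(root):
    """Render the computation graph rooted at `root` to graph.svg."""
    dot = Digraph(format="svg")
    seen = set()

    def visit(node):
        if id(node) in seen:
            return
        seen.add(id(node))
        dot.node(str(id(node)), getattr(node, "_op", "") or "tensor")
        for child in getattr(node, "_prev", ()):  # assumed parent-pointer attribute
            dot.edge(str(id(child)), str(id(node)))
            visit(child)

    visit(root)
    dot.render("graph", cleanup=True)  # writes graph.svg next to the script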
why this exists
Because I was bored
DEV BLOG
- Part 1: pythonstuff/build-tensors
- Part 2: pythonstuff/backward-pass
- Part 3: pythonstuff/refactor-&-cleanup
powerlevel
1.0 - karpathy micrograd (really simple, not much you can do with it)
3.14 - tinytorch (simple and you can do a lot of things with it) <= ❤️
69 - tinygrad (no longer simple, you can do a lot more)
∞ - pytorch (goat library that makes gpu go burrr)
contribution guideline
- be nice
- performance optimization / more examples welcome
- document your sources, if any
- keep tinytorch.py under 1000 lines
Buy me Chai/Coffee
License
MIT