vim-autograd

Automatic differentiation library written in pure Vim script.

vim-autograd provides a foundation for automatic differentiation using the Define-by-Run approach adopted by frameworks such as Chainer and PyTorch. Since it is written entirely in pure Vim script, it has no external dependencies.

This library makes it possible to build next-generation plugins that perform numerical computation on multidimensional arrays or deep learning based on gradient descent.
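
Because the graph is recorded while the code runs, ordinary Vim script control flow can shape it. The following is a minimal sketch of this Define-by-Run behavior, using only the API demonstrated in the Usage section below (autograd#tensor(), the .m() multiply method, backward(), and .grad.data); the s:scale helper and its use_triple flag are hypothetical names introduced here for illustration.

function! s:scale(x, use_triple) abort
  " Define-by-Run: the branch actually taken at runtime decides
  " which multiplication is recorded in the graph.
  if a:use_triple
    " y = 3x, so dy/dx = 3
    return a:x.m(3)
  endif
  " y = 2x, so dy/dx = 2
  return a:x.m(2)
endfunction

function! s:branch_example() abort
  let x = autograd#tensor(1.0)
  let y = s:scale(x, 1)
  call y.backward()
  " Expected to print [3.0] under these assumptions: d(3x)/dx = 3.
  echo x.grad.data
endfunction

call s:branch_example()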

Installation

Vim script

If you are using vim-plug, you can install it as follows.

Plug 'pit-ray/vim-autograd'

Vim9 script

If you want to use the more efficient Vim9 script implementation, install the experimental vim9 branch.

Plug 'pit-ray/vim-autograd', {'branch': 'vim9'}

Usage

A computational graph is constructed by applying the provided differentiable functions to a Tensor object, and the gradient is calculated by backpropagating from the output.

function! s:f(x) abort
  " y = x^5 - 2x^3
  " .p(n) raises the tensor to the n-th power; .m(k) multiplies it by k.
  let y = autograd#sub(a:x.p(5), a:x.p(3).m(2))
  return y
endfunction

function! s:example() abort
  let x = autograd#tensor(2.0)
  let y = s:f(x)

  " Backpropagate from the output; the gradient is stored in x.grad.
  call y.backward()
  echo x.grad.data
endfunction

call s:example()

Output

[56.0]
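
This matches the analytic derivative: dy/dx = 5x^4 - 6x^2, which at x = 2 evaluates to 5*16 - 6*4 = 56.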

The computational graph is generated automatically as these functions are applied.
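
Since backward() leaves the gradient in x.grad, a simple gradient-descent loop can be built on the same API. Below is a minimal sketch, not one of the library's documented examples: it reuses s:f from above, assumes only autograd#tensor(), backward(), and x.grad.data, and recreates the tensor on every step because this README does not document a gradient-reset API; the learning rate and step count are arbitrary.

function! s:gradient_descent() abort
  let lr = 0.01
  let xval = 2.0
  for i in range(100)
    " Recreate the tensor so the previous graph and its
    " gradient are discarded each step.
    let x = autograd#tensor(xval)
    let y = s:f(x)
    call y.backward()
    " Step against the gradient: x <- x - lr * dy/dx
    let xval -= lr * x.grad.data[0]
  endfor
  " xval approaches the local minimum of y = x^5 - 2x^3,
  " which lies at x = sqrt(6/5), roughly 1.095.
  echo xval
endfunction

call s:gradient_descent()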

Examples

  • Basic differentiation and computational graph visualization
  • Higher-order differentiation using double-backprop
  • Classification using deep learning

Related posts

  • https://zenn.dev/pitray/articles/482e89ddff329c

License

This library is provided under the MIT License.

Author

  • pit-ray