Horace He
std decomposition is likely incorrect since it calls into var.

```
@register_decomposition(aten.var.correction)
def var_decomposition(
    x: Tensor, dims: Optional[List[int]], correction: int = 0, keepdim: bool = False
):
    if dims is None:
        dims...
```
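For reference, a minimal pure-Python sketch (function names hypothetical, not the aten decomposition itself) of what a `correction`-aware variance looks like, with `std` decomposed as its square root, which is the pattern being questioned above:

```python
import math

def var(xs, correction=0):
    # variance with a generalized Bessel correction: divisor is N - correction
    n = len(xs)
    mean = sum(xs) / n
    return sum((x - mean) ** 2 for x in xs) / (n - correction)

def std(xs, correction=0):
    # std decomposed as sqrt(var) -- any error in var propagates into std
    return math.sqrt(var(xs, correction))

# correction=1 recovers the familiar sample standard deviation
print(std([1.0, 2.0, 3.0, 4.0], correction=1))
```

With `correction=1` this matches `statistics.stdev`, which is one easy sanity check for a decomposition like this.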
Segmentation usage:

```
fcn_resnet50: 273
fcn_resnet101: 598
deeplabv3_resnet50: 328
deeplabv3_resnet101: 845
```

Choosing to add `deeplabv3_resnet101` and `fcn_resnet50` (for some diversity in backbones).

```
fasterrcnn_resnet50_fpn: 1362
retinanet_resnet50_fpn: 43
maskrcnn_resnet50_fpn: 1207...
```
https://github.com/dragen1860/MAML-Pytorch/issues/59
An inner optimization loop arises when the functional autograd API is used in the forward pass and the user then differentiates through it. Uses include [MAML](https://github.com/dragen1860/MAML-Pytorch), [Energy Based Models](https://github.com/swyoon/pytorch-energy-based-model), and...
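To make the pattern concrete, here is a dependency-free sketch (all names hypothetical, hand-written derivatives standing in for the autograd API) of differentiating *through* one inner SGD step, MAML-style: the outer gradient must chain through the inner update:

```python
def inner_loss(theta, a):
    return (theta - a) ** 2            # per-task inner loss

def inner_grad(theta, a):
    return 2.0 * (theta - a)           # d/dtheta of (theta - a)^2

def adapted_param(theta, a, lr):
    # one inner SGD step: theta' = theta - lr * grad
    return theta - lr * inner_grad(theta, a)

def outer_loss(theta, a, b, lr):
    # outer objective evaluated at the adapted parameter
    return (adapted_param(theta, a, lr) - b) ** 2

def outer_grad(theta, a, b, lr):
    # chain rule through the inner step:
    # dL_outer/dtheta = 2 * (theta' - b) * dtheta'/dtheta, with dtheta'/dtheta = 1 - 2*lr
    return 2.0 * (adapted_param(theta, a, lr) - b) * (1.0 - 2.0 * lr)

theta, a, b, lr = 0.5, 2.0, 1.0, 0.1
eps = 1e-6
numeric = (outer_loss(theta + eps, a, b, lr) - outer_loss(theta - eps, a, b, lr)) / (2 * eps)
analytic = outer_grad(theta, a, b, lr)
print(analytic, numeric)  # the two should agree to ~1e-6
```

In PyTorch this chaining is what `create_graph=True` in the inner gradient computation buys you; the sketch just shows why the inner update has to stay on the graph.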
It's a pretty common file format for people writing compilers in OCaml, and the deprecated OCaml plugin has support for it.
Stack from [ghstack](https://github.com/ezyang/ghstack) (oldest at bottom): * __->__ #93059 cc @mlazos @soumith @voznesenskym @yanboliang @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @chunyuan-w @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @peterbell10 @desertfire
Stack from [ghstack](https://github.com/ezyang/ghstack) (oldest at bottom): * __->__ #93039 cc @mlazos @soumith @voznesenskym @yanboliang @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @chunyuan-w @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @desertfire
In this case I'm guessing that for fp8 you might not need a scale parameter for the weights, since each weight has its own scaling factor. I haven't done any...
Copied from https://github.com/VSCodeVim/Vim/issues/1032

### What did you do?

I tried to use 'normal @a

### What did you expect to happen?

Apply macro into lines selected

### What happened instead?

...