sdraper-CS
I am also working on a Mac, and hit both this and, subsequently, some of the other reported installation problems. First this one: replace '-shared' with '-dylib'. After...
Also hitting this. It's related to the magic around `flatten_parameters`, which apparently changed in PyTorch 1.0. I have not yet had time to look into this in detail, but...
[EDIT - corrected non-train-time behavior (oops!)] Here's my (very) hacky solution, which I **think** is working ok (and should work with both PyTorch 1.0 and earlier versions). It does a...
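(The code itself is cut off above, so here is only a rough sketch of the general shape of this kind of workaround, not sdraper-CS's actual fix. The class name and the `weight_hh_l0` example are illustrative; the `*_raw` naming just follows what is discussed later in the thread.)

```python
import torch.nn as nn
import torch.nn.functional as F


class WeightDropSketch(nn.Module):
    """Illustrative DropConnect-style wrapper around an RNN's weights."""

    def __init__(self, module, weight_names, dropout=0.5):
        super().__init__()
        self.module = module
        self.weight_names = weight_names
        self.dropout = dropout
        for name in weight_names:
            w = getattr(module, name)
            # Keep the real parameter under a '*_raw' name (so the optimizer
            # still sees it) and remove the original, so we can re-assign a
            # plain tensor to that attribute on every forward pass.
            del module._parameters[name]
            module.register_parameter(name + '_raw', nn.Parameter(w.data))

    def _set_weights(self):
        for name in self.weight_names:
            raw = getattr(self.module, name + '_raw')
            # training=self.training is what keeps eval-time behavior correct
            # (the "non-train-time behavior" correction mentioned above).
            w = F.dropout(raw, p=self.dropout, training=self.training)
            # Assign a plain tensor, not a Parameter: PyTorch 1.0 no longer
            # tolerates overwriting a registered Parameter this way, and
            # gradients still flow back into the '*_raw' parameter.
            setattr(self.module, name, w)

    def forward(self, *args, **kwargs):
        self._set_weights()
        return self.module(*args, **kwargs)
```

Usage would be something like `rnn = WeightDropSketch(nn.LSTM(10, 20), ['weight_hh_l0'], dropout=0.5)`; the optimizer then trains `weight_hh_l0_raw`, while the wrapped RNN always sees a freshly dropped-out plain tensor.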
> @sdraper-CS I am very curious as to what this magic actually does and why it is needed. Could you elaborate on that?

The issue is that the changes in...
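(For context, `flatten_parameters()` is the standard PyTorch call that repacks an RNN's weights into the single contiguous buffer cuDNN expects. A minimal, plain-PyTorch illustration, separate from the fix itself:)

```python
import torch.nn as nn

lstm = nn.LSTM(input_size=10, hidden_size=20, num_layers=2)

# cuDNN keeps all RNN weights in one contiguous block of memory; after the
# individual weight tensors have been replaced or copied, this call repacks
# them so the cuDNN kernels don't warn about non-contiguous weights.
lstm.flatten_parameters()
```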
@DavidNemeskey I am now pretty confident that the approach is working correctly. I have retrained an NER model based on the Lample paper from 2017 with my modified version of...
@DavidNemeskey That's odd. I'm not sure why the `_raw` parameters would not be in the optimizer (as you say, anything that `model.parameters()` enumerates should be in the optimized set), however...
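(A quick way to check this is to compare `model.named_parameters()` against what the optimizer actually holds. This is only a sketch: `model` below is a placeholder `nn.LSTM`; in practice it would be the wrapped model.)

```python
import torch
import torch.nn as nn

# Placeholder model; substitute the real (wrapped) model here.
model = nn.LSTM(10, 20)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Collect the identities of every tensor the optimizer will update.
optimizer_param_ids = {id(p) for g in optimizer.param_groups for p in g['params']}

# Any '*_raw' parameter should show up here as "in optimizer".
for name, p in model.named_parameters():
    status = 'in optimizer' if id(p) in optimizer_param_ids else 'MISSING from optimizer'
    print(name, '->', status)
```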
@DavidNemeskey That really doesn't make sense to me! Stepping through in the debugger, I *AM* getting the `_raw` variants in the optimizer params (for both SGD and Adam), and it...