deepmind-research

Enformer loaded from checkpoint does not work because of "missing positional argument: is_training"

Open frstyang opened this issue 1 year ago • 3 comments

I ran the checkpoint-loading portion of the enformer-training.ipynb provided (I believe) by Kyle Taylor and @alimuldal, but the model cannot run a forward pass. It raises `TypeError: __call__() missing 1 required positional argument: 'is_training'` even though `is_training=False` is explicitly passed as a keyword argument. Is there a workaround or fix for this? I would like to access the `.trunk` attribute to compute internal embeddings of sequences, but I hit the same error when I try to call that as well.
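For reference, this is roughly what the failing call looks like in my setup (a sketch of the notebook's checkpoint-loading cells; the hyperparameter values and checkpoint path here are placeholders):

```python
import tensorflow as tf
from enformer import Enformer  # deepmind-research/enformer/enformer.py on the path

# Build the Sonnet module and restore weights from the training checkpoint.
# (Constructor arguments and checkpoint path are illustrative.)
model = Enformer(channels=1536, num_heads=8, num_transformer_layers=11)
checkpoint = tf.train.Checkpoint(module=model)
checkpoint.restore(tf.train.latest_checkpoint('checkpoints/')).expect_partial()

# One-hot encoded input sequence of shape (batch, sequence_length, 4).
sequence = tf.zeros((1, 196_608, 4), dtype=tf.float32)

# This is where the error appears, even with is_training passed by keyword:
#   TypeError: __call__() missing 1 required positional argument: 'is_training'
outputs = model(sequence, is_training=False)
```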

[screenshot: traceback ending in the TypeError above]

frstyang avatar Feb 26 '23 06:02 frstyang

I ran into the same problem. Did you figure it out? Thanks!

yal054 avatar Apr 12 '23 06:04 yal054

It was a hack, but I edited `enformer.py` to change every occurrence of `is_training: bool` to `is_training: bool = False`, and after that I was able to extract embeddings in the notebook.
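Concretely, something like this (a sketch, reusing the `model` and `sequence` from the snippet in my first post; the exact signatures and trunk output shape may differ slightly from what is in `enformer.py`):

```python
# In enformer.py, every occurrence of
#     is_training: bool
# gets a default value, i.e.
#     is_training: bool = False
# for example in Enformer.__call__(self, inputs, is_training: bool = False).

# With that edit in place, both the forward pass and the trunk call run.
# The trunk output should be roughly (batch, 896, 2 * channels) for the
# default Enformer configuration.
outputs = model(sequence, is_training=False)
embeddings = model.trunk(sequence, is_training=False)
```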

frstyang avatar Apr 12 '23 17:04 frstyang

Thanks a lot! That saved me a lot of time.

yal054 avatar Apr 12 '23 17:04 yal054