
It looks like ACT does not update state

Open spolu opened this issue 6 years ago • 5 comments

The `state` passed to `fn` does not seem to be updated by ACT's masks; only `previous_state` is?

https://github.com/andreamad8/Universal-Transformer-Pytorch/blob/master/models/UTransformer.py#L280

As such, dynamic halting seems to kick in only once all halting probabilities go above the threshold?

spolu avatar Jul 25 '19 12:07 spolu

Probably caused by https://github.com/andreamad8/Universal-Transformer-Pytorch/blob/master/models/UTransformer.py#L288 ?

spolu avatar Jul 25 '19 12:07 spolu

Hi @spolu,

thanks for spotting this problem. I will check it asap.

andreamad8 avatar Aug 06 '19 16:08 andreamad8

Hi, any updates? Thanks very much

shizhediao avatar Nov 10 '19 09:11 shizhediao

I think the implementation is correct.

In the paper, the while loop is initialized with the variables

(state, step, halting_probability, remainders, n_updates, previous_state)

and after the loop body it returns

(transformed_state, step, halting_probability, remainders, n_updates, new_state)

So `new_state` becomes `previous_state` in the next iteration, and line 283 updates the state (it doesn't matter whether you call it `transformed_state` or `state`).
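To make the bookkeeping concrete, here is a rough NumPy sketch of one ACT step as described in the Universal Transformer paper; it is not the repo's exact code, and all names (`act_step`, `p`, `update_weights`, etc.) are illustrative:

```python
import numpy as np

def act_step(state, previous_state, halting_probability, remainders,
             n_updates, p, threshold=0.99):
    """One ACT halting step (simplified sketch, not the repo's code).

    state:               (batch, seq, dim) freshly transformed state
    previous_state:      (batch, seq, dim) weighted accumulator so far
    halting_probability: (batch, seq) cumulative halting probability
    p:                   (batch, seq) halting probability emitted this step
    """
    # Positions still running before this step.
    running = (halting_probability < threshold).astype(float)
    # Positions that cross the threshold at this step.
    new_halted = ((halting_probability + p * running) > threshold).astype(float) * running
    # Positions that remain running after this step.
    still_running = ((halting_probability + p * running) <= threshold).astype(float) * running

    halting_probability = halting_probability + p * still_running
    # Halting positions get the leftover probability mass (the "remainder").
    remainders = remainders + new_halted * (1.0 - halting_probability)
    halting_probability = halting_probability + new_halted * remainders
    n_updates = n_updates + still_running + new_halted

    # Per-position mixing weight: p while running, the remainder at halt,
    # and 0 for positions that halted on an earlier step.
    update_weights = (p * still_running + new_halted * remainders)[..., None]

    # This is the state update in question: the new transformed state is
    # mixed into the accumulator, which becomes previous_state next iteration.
    new_state = state * update_weights + previous_state * (1.0 - update_weights)
    return halting_probability, remainders, n_updates, new_state
```

Note that already-halted positions get `update_weights = 0`, so their accumulated state is carried through unchanged even while other positions keep updating.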

zhiqihuang avatar Jan 19 '22 03:01 zhiqihuang

I am not convinced that the code in the paper is correct, as it does not match Graves's original ACT implementation. See https://github.com/MostafaDehghani/Thesis/issues/1

EliasHasle avatar Apr 19 '23 12:04 EliasHasle