Fixed issue #83:84: error for missing attention state
The problem is in the _compute_attention() function, at line 52. The computations inside the attention wrapper require the attention state, but attention_state was not passed in when the function was called. As a result, the missing value is next_attention_state, which was never produced and handed on to the AttentionWrapper.
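To illustrate the fix, here is a minimal, framework-free sketch of the pattern: _compute_attention() now receives attention_state, threads it through the attention mechanism, and returns next_attention_state so the wrapper can carry it forward. The names _compute_attention and attention_state come from the description above; the dummy mechanism and the body of the function are hypothetical stand-ins, not the real implementation.

```python
def _compute_attention(attention_mechanism, cell_output, attention_state):
    """Sketch of the corrected call: attention_state is passed in and
    next_attention_state is returned, instead of being dropped."""
    # The mechanism consumes the previous attention_state and yields
    # alignments plus the updated state. Before the fix, attention_state
    # was never supplied, so next_attention_state went missing.
    alignments, next_attention_state = attention_mechanism(
        cell_output, state=attention_state)
    # Toy context computation: alignment-weighted sum of the outputs.
    attention = sum(a * v for a, v in zip(alignments, cell_output))
    return attention, alignments, next_attention_state


def dummy_mechanism(cell_output, state):
    """Hypothetical mechanism: uniform alignments, state bumped by one."""
    n = len(cell_output)
    alignments = [1.0 / n] * n
    return alignments, state + 1


# The wrapper would now keep next_state and pass it into the next step.
attention, alignments, next_state = _compute_attention(
    dummy_mechanism, [1.0, 2.0, 3.0], attention_state=0)
```

The point of the sketch is only the data flow: the state goes in, the updated state comes out, and the caller (the AttentionWrapper) is responsible for storing it between steps.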
Hi, kindly take a look at the fixes and let me know if any improvements are required.

Regards