
seq2seq.BahdanauAttention raises TypeError: probability_fn is not an instance of str

Open · BrandonStudio opened this issue 2 years ago · 9 comments

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): both Windows and Colab
  • TensorFlow version and how it was installed (source or binary): stable latest, pip
  • TensorFlow-Addons version and how it was installed (source or binary): stable latest
  • Python version: 3.10 / 3.9
  • Is GPU used? (yes/no): yes

Describe the bug

Calling BahdanauAttention with the default probability_fn value ("softmax") raises a TypeError.

I tried to debug this and found that probability_fn was already a function at the time the type was checked.

Code to reproduce the issue

import tensorflow_addons as tfa

# probability_fn is left at its default ("softmax"), yet a TypeError is raised
tfa.seq2seq.BahdanauAttention(1, 1, 1, normalize=False, name='BahdanauAttention')

Other info / logs

BrandonStudio · Mar 19 '23 07:03

Hello, have you solved this issue? I also get the same error when initializing tfa.seq2seq.LuongAttention.
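
A minimal call that goes down the same default probability_fn path would be something like this (the exact arguments are only an assumption, not my original code):

import tensorflow_addons as tfa

# probability_fn again defaults to "softmax", and initialization fails with the
# same type-check error as BahdanauAttention above.
tfa.seq2seq.LuongAttention(units=1)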

We-here · Mar 20 '23 11:03

Do you have a very minimal gist to run to reproduce this?

bhack · Mar 20 '23 11:03

@We-here Unfortunately no, I have not found a way to bypass the type check.

BrandonStudio · Mar 20 '23 11:03

@bhack I think the code above is enough. The exception is thrown when the class instance is initialized, not afterwards.

BrandonStudio · Mar 20 '23 11:03

Yes, because _process_probability_fn transforms the str back into the function, so by the time the type check sees it, probability_fn is no longer a str.
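
Concretely, a minimal standalone sketch of that pattern (not the actual tensorflow-addons source, and assuming a typeguard release that also re-checks annotated parameters when they are reassigned inside the function):

from typeguard import typechecked


class FakeAttention:
    @typechecked
    def __init__(self, units: int, probability_fn: str = "softmax"):
        # Stand-in for _process_probability_fn: the str name is resolved to a
        # callable and rebound to the same parameter name.
        probability_fn = {"softmax": lambda score: score}[probability_fn]
        self.probability_fn = probability_fn


# Under the assumption above, this fails with an error along the lines of
# "probability_fn is not an instance of str", because the parameter annotated
# as str now holds a function.
FakeAttention(1)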

Can you send a PR adding this case to the specific test file below, to help reproduce it?

https://github.com/tensorflow/addons/blob/master/tensorflow_addons/seq2seq/tests/attention_wrapper_test.py

@seanpmorgan
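
A sketch of what such a test could look like (the test name and placement are placeholders, not an actual PR):

import tensorflow_addons as tfa


def test_bahdanau_attention_default_probability_fn():
    # Constructing with the default probability_fn="softmax" should not raise.
    tfa.seq2seq.BahdanauAttention(units=1)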

bhack · Mar 20 '23 12:03

Also seeing this issue on Google Colab.

  • Python version: 3.9
  • Tensorflow version: 2.11.0
  • Tensorflow-addons version: 0.19.0
  • Is GPU used? (yes/no): yes

HybridNeos · Mar 20 '23 23:03

@BrandonStudio I skipped the type checking by commenting out @typechecked on AttentionMechanism and its derived classes.
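
Roughly, on a minimal stand-in the edit amounts to this (FakeAttention is only an illustration, not the real tensorflow-addons class):

class FakeAttention:
    # @typechecked  # commented out, so typeguard no longer instruments __init__
    def __init__(self, units: int, probability_fn: str = "softmax"):
        # The reassignment that previously tripped the check now goes through untouched.
        probability_fn = {"softmax": lambda score: score}[probability_fn]
        self.probability_fn = probability_fn


FakeAttention(1)  # constructs without raising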

DangMinh24 · Apr 02 '23 18:04

@DangMinh24 Did you just modify the library source code?

BrandonStudio · Apr 03 '23 00:04

@BrandonStudio Yeah, I modified the tensorflow-addons code in my local environment.

DangMinh24 · Apr 03 '23 22:04