c_AMPs-prediction
Issue with the Attention model
Hello Dr. Ma @mayuefine, I get the following error when running the Attention model:

Traceback (most recent call last):
  File "/media/hjg-r940/user/shanxinxin/antipep/test/c_AMPs-prediction-master/script/prediction_attention.py", line 9, in <module>
    model = load_model('../Models/att.h5', custom_objects={'Attention_layer': Attention_layer})
  File "/media/hjg-r940/user/shanxinxin/software/anaconda3/envs/tensorflow/lib/python3.10/site-packages/keras/utils/traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/media/hjg-r940/user/shanxinxin/antipep/test/c_AMPs-prediction-master/script/Attention.py", line 25, in build
    self.W = self.add_weight((input_shape[-1], input_shape[-1],),
TypeError: Exception encountered when calling layer "module_wrapper" (type ModuleWrapper).

Layer.add_weight() got multiple values for argument 'name'

Call arguments received by layer "module_wrapper" (type ModuleWrapper):
  • args=('tf.Tensor(shape=(None, 57, 64), dtype=float32)',)
  • kwargs={'training': 'None'}

It looks like an input-file problem, but I converted the input with your .pl script, so I don't understand why this error still occurs. Could you also share your email address? Thank you 🙏
Reply: I ran into this problem as well. It is a version issue: the Keras that ships with the older Python 3.6 environment takes shape as the first positional argument of .add_weight(), whereas the Keras in a newer Python 3.11 environment takes name first, so the shape tuple passed positionally collides with the name= keyword. The fix is to add shape= in front of that argument in the calls on lines 22 and 28 of Attention.py.
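For illustration, here is a minimal sketch of what the corrected build() might look like. Only the call on line 25 is visible in the traceback, so the weight names, initializers, and the call() logic below are assumptions, not the repository's actual code:

```python
# Hypothetical excerpt of Attention.py showing the add_weight fix.
import tensorflow as tf
from tensorflow.keras.layers import Layer


class Attention_layer(Layer):
    def build(self, input_shape):
        # Old call (works only when shape is the first positional argument):
        #   self.W = self.add_weight((input_shape[-1], input_shape[-1],),
        #                            initializer='glorot_uniform', name='W')
        # Passing shape= and name= as keywords works in both old and new Keras:
        self.W = self.add_weight(shape=(input_shape[-1], input_shape[-1]),
                                 initializer='glorot_uniform',
                                 name='W')
        self.b = self.add_weight(shape=(input_shape[-1],),
                                 initializer='zeros',
                                 name='b')
        super().build(input_shape)

    def call(self, inputs):
        # Placeholder attention scoring; the real layer's logic lives in the repo.
        scores = tf.nn.softmax(tf.tensordot(inputs, self.W, axes=1) + self.b, axis=1)
        return tf.reduce_sum(scores * inputs, axis=1)
```

Because both arguments are passed by keyword, the call no longer depends on the positional order of add_weight(), which is what changed between Keras versions.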