sanwei111

Results: 40 comments by sanwei111

Hello, I met the same issue. Have you fixed it? Could you share your methods with me?

> > Hello, in the file transformer-multibranch-v2, the class `TransformerEncoderLayer` contains the following code:
> >
> > ```python
> > if args.encoder_branch_type is None:  # default=None????
> >     self.self_attn = MultiheadAttention(
> >         self.embed_dim, args.encoder_attention_heads,
> > ```
> >
> > ...
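As a rough reading of that snippet: the layer falls back to plain multi-head self-attention whenever `args.encoder_branch_type` is unset. Below is a minimal sketch of that control flow, using `torch.nn.MultiheadAttention` as a stand-in for fairseq's implementation (the attribute names follow the quote; everything else is my assumption):

```python
import torch.nn as nn

class TransformerEncoderLayerSketch(nn.Module):
    """Sketch of the branch selection quoted above (not the repo's code)."""

    def __init__(self, args):
        super().__init__()
        self.embed_dim = args.encoder_embed_dim
        if args.encoder_branch_type is None:
            # Single-branch fallback: ordinary multi-head self-attention,
            # used when no branch types are given on the command line.
            self.self_attn = nn.MultiheadAttention(
                self.embed_dim,
                args.encoder_attention_heads,
                dropout=args.attention_dropout,
            )
        else:
            # Otherwise one module per entry in args.encoder_branch_type
            # (e.g. attention + dynamic conv) would be built here;
            # omitted because that part is repo-specific.
            raise NotImplementedError("multi-branch construction omitted")
```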

> > Thanks, what's the meaning of `[attn:1:32:4, dynamic:default:32:4]`? Could you show some details about the list?
> >
> > As I mentioned in my last reply, `args.encoder_branch_type` should not be a boolean...
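For anyone else puzzled by that list: each entry reads as a colon-separated branch spec. A minimal parsing sketch, assuming a `type:kernel:dim:heads` layout (the field meanings are my guess from the example values, not confirmed against the lite-transformer source):

```python
# Guessed field layout for specs like "attn:1:32:4" / "dynamic:default:32:4":
#   type   -- branch kind ("attn" = self-attention, "dynamic" = dynamic conv)
#   kernel -- kernel size for conv branches ("default" = layer default)
#   dim    -- embedding dimension handled by this branch
#   heads  -- number of heads in this branch
def parse_branch_spec(spec: str) -> dict:
    branch_type, kernel, dim, heads = spec.split(":")
    return {
        "type": branch_type,
        "kernel": None if kernel == "default" else int(kernel),
        "dim": int(dim),
        "heads": int(heads),
    }

branches = [parse_branch_spec(s) for s in ["attn:1:32:4", "dynamic:default:32:4"]]
# -> [{'type': 'attn', 'kernel': 1, 'dim': 32, 'heads': 4},
#     {'type': 'dynamic', 'kernel': None, 'dim': 32, 'heads': 4}]
```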

> Same question, ...

Of course.

> Great work!
> The input is (batch_size, seq_len, channels). Is it right that seq_len is a fixed length? Why?

I have the same question as you, and I wonder...
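On the seq_len point: nothing in standard attention forces a fixed sequence length; a fixed maximum usually only comes from learned positional embeddings or from padding every batch to one length for efficiency. A small illustration of variable lengths working (my own example, not from the repo):

```python
import torch
import torch.nn as nn

# Plain multi-head attention accepts any seq_len; batches that mix lengths
# are handled by padding to the batch maximum plus a key_padding_mask.
attn = nn.MultiheadAttention(embed_dim=32, num_heads=4, batch_first=True)

for seq_len in (10, 17, 25):            # three different lengths, same module
    x = torch.randn(2, seq_len, 32)     # (batch_size, seq_len, channels)
    out, _ = attn(x, x, x)
    print(seq_len, tuple(out.shape))    # (2, seq_len, 32) each time
```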

Hello, I have the same question as you. How did you finally fix it?

> Is a plain QA dataset enough? Shouldn't a prompt format be provided, for example the triple form: prompt, input, output?
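If it helps, one common shape for such triples (an Alpaca-style layout; whether this repo expects exactly these keys is an assumption on my part):

```python
# Hypothetical prompt/input/output triple (Alpaca-style; the exact keys the
# repo expects are an assumption, not confirmed).
example = {
    "prompt": "Answer the question using the given passage.",
    "input": "Passage: <passage text> Question: <question text>",
    "output": "<answer text>",
}
```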

> Exactly the same situation here. May I ask if you have solved it?

Hello, may I ask how you solved it?

Excuse me, can you please explain the class `RelPartialLearnableMultiHeadAttn(RelMultiHeadAttn)`? I don't think the class matches the formulas in the paper very well.
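For anyone comparing the code against the paper: the score that class implements is the relative-attention decomposition from Transformer-XL (Dai et al., 2019),

$$
\mathbf{A}^{\mathrm{rel}}_{i,j}
= \underbrace{\mathbf{E}_{x_i}^{\top}\mathbf{W}_q^{\top}\mathbf{W}_{k,E}\,\mathbf{E}_{x_j}}_{(a)}
+ \underbrace{\mathbf{E}_{x_i}^{\top}\mathbf{W}_q^{\top}\mathbf{W}_{k,R}\,\mathbf{R}_{i-j}}_{(b)}
+ \underbrace{u^{\top}\mathbf{W}_{k,E}\,\mathbf{E}_{x_j}}_{(c)}
+ \underbrace{v^{\top}\mathbf{W}_{k,R}\,\mathbf{R}_{i-j}}_{(d)}
$$

where $u$ and $v$ are the learned global biases. In the code, $(a)+(c)$ and $(b)+(d)$ are typically computed as two batched matmuls (query plus bias against content keys, query plus bias against relative-position keys), and the dependence on the offset $i-j$ is realized with the `_rel_shift` trick rather than indexing $\mathbf{R}_{i-j}$ directly, which is likely why the class does not match the formulas line by line at first glance.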