Using it directly also didn't raise any errors:

from transformers import AutoTokenizer, RoFormerModel
import torch

tokenizer = AutoTokenizer.from_pretrained("junnyu/roformer_v2_chinese_char_base")
model = RoFormerModel.from_pretrained("junnyu/roformer_v2_chinese_char_base")
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
outputs = model(**inputs)
last_hidden_states = outputs.last_hidden_state
last_hidden_states

Can it be used this way?...
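As a quick sanity check (a sketch added here, not part of the original question), one might inspect the shape of the returned tensor; last_hidden_state is expected to be (batch_size, sequence_length, hidden_size), with hidden_size depending on the checkpoint:

# Sanity-check sketch: load the same checkpoint as above and confirm the forward
# pass returns hidden states of the expected rank.
import torch
from transformers import AutoTokenizer, RoFormerModel

tokenizer = AutoTokenizer.from_pretrained("junnyu/roformer_v2_chinese_char_base")
model = RoFormerModel.from_pretrained("junnyu/roformer_v2_chinese_char_base")

with torch.no_grad():  # inference only, no gradients needed
    inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
    outputs = model(**inputs)

# Shape is (batch_size, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)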
https://chr1swallace.github.io/coloc/articles/a06_SuSiE.html In the article, in the example provided by the author, there are two rows in the result of susie.res$summary:
nsnps hit1 hit2 PP.H0.abf PP.H1.abf PP.H2.abf PP.H3.abf
1: 500 s105 s105...