im-not-tom
I'm getting the same issue with an RTX 3060 and LLaMa-Storytelling-30B-4Bit-128g.

> KeyError: 'model.layers.37.self_attn.rotary_emb.cos_cached'

How did you guess the number of pre_layer-ed layers?
Hello and sorry to bother you. I made an attempt at doing this and would like to ask for correction, as I probably gravely misunderstood something. Without dumping my entire code,...
@clementine-of-whitewind Interesting, I haven't noticed that. Which params?
Okay. It seems to work with strings for me as well, but storing them that way was certainly a bug, which should now be fixed in this PR branch.
@SillyLossy I don't mind at all.