rulm
convert_to_native.py 70b support
Is there a reason convert_to_native.py does not support converting the 70B model?
Are n_heads = 64 and dim = 8192 the correct values for LLaMA v2 70B?
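For reference, a minimal sketch of the 70B hyperparameters as published in the official LLaMA 2 `params.json` (the dict name here is my own, and whether convert_to_native.py can consume these values is an open question). Note that 70B uses grouped-query attention (n_kv_heads = 8), which is the usual reason conversion scripts written for the 7B/13B checkpoints fail on it:

```python
# Hyperparameters of LLaMA 2 70B, per the params.json shipped with the
# Meta weights (vocab_size is listed as -1 there; 32000 is the tokenizer size).
llama2_70b = {
    "dim": 8192,
    "n_layers": 80,
    "n_heads": 64,
    "n_kv_heads": 8,            # grouped-query attention: fewer K/V heads than query heads
    "vocab_size": 32000,
    "multiple_of": 4096,
    "ffn_dim_multiplier": 1.3,
    "norm_eps": 1e-5,
}

head_dim = llama2_70b["dim"] // llama2_70b["n_heads"]

# A converter that assumes n_kv_heads == n_heads will mis-shape the K/V
# projections: for 70B they have n_kv_heads * head_dim output rows, not dim.
kv_proj_rows = llama2_70b["n_kv_heads"] * head_dim
print(head_dim, kv_proj_rows)
```

So n_heads = 64 and dim = 8192 are correct by themselves, but a working 70B conversion also has to account for n_kv_heads = 8 when reshaping the attention weights.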