Liu Jun

Results: 8 issues by Liu Jun

The customized model is not in your "Supported Models" list. Can it still benefit from DeepSpeed Chat?

I tested Flash Attention vs. HF GPT-2 with a PyTorch Lightning wrapper, but it is slower than transformers.GPT2LMHeadModel with the same config parameters. Not sure where I am going wrong? ![image](https://github.com/lucidrains/x-transformers/assets/902005/15b363a0-9ca4-426c-aa81-7b0617d44162) Purple...
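A minimal timing sketch of this kind of comparison, assuming an x-transformers Decoder with `attn_flash=True` against a GPT-2-small-sized `GPT2LMHeadModel`; the dimensions, batch size, and benchmarking loop are illustrative assumptions, not the original setup:

```python
# Timing sketch only. Assumptions: an x-transformers version that supports
# attn_flash=True (needs PyTorch 2.x SDPA) and GPT-2-small sized configs on
# both sides, measured on random token ids.
import time
import torch
from transformers import GPT2Config, GPT2LMHeadModel
from x_transformers import TransformerWrapper, Decoder

device = "cuda" if torch.cuda.is_available() else "cpu"
vocab, seq_len, batch = 50257, 1024, 4

hf_model = GPT2LMHeadModel(GPT2Config(n_embd=768, n_layer=12, n_head=12)).to(device).eval()
xt_model = TransformerWrapper(
    num_tokens=vocab,
    max_seq_len=seq_len,
    attn_layers=Decoder(dim=768, depth=12, heads=12, attn_flash=True),  # flash attention path
).to(device).eval()

tokens = torch.randint(0, vocab, (batch, seq_len), device=device)

def bench(fn, warmup=3, iters=10):
    # Warm up, then average over several CUDA-synchronized iterations.
    for _ in range(warmup):
        fn()
    if device == "cuda":
        torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(iters):
        fn()
    if device == "cuda":
        torch.cuda.synchronize()
    return (time.perf_counter() - start) / iters

with torch.no_grad():
    print("HF GPT-2 forward:      ", bench(lambda: hf_model(tokens).logits))
    print("x-transformers (flash):", bench(lambda: xt_model(tokens)))
```

Timing inside a Lightning training loop also picks up data loading and logging overhead, so an isolated forward-pass comparison like this helps narrow down where the slowdown actually comes from.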

I can't find InferenceEngine in energonai/engine.py, so `from energonai.engine import InferenceEngine` fails.
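A quick diagnostic sketch, assuming energonai itself is installed and importable, to see what that module actually exports in the installed version:

```python
# Diagnostic only: list the public names exported by energonai.engine
# to check whether InferenceEngine exists under this installed version.
import energonai.engine as engine

print([name for name in dir(engine) if not name.startswith("_")])
```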

@MogicianXD I want to use it for NLP research. No user data is needed.

In charref(): `if name[0] in ["x", "X"]: c = int(name[1:], 16) else: c = int(name)`. If name is something like "xabcdeabcdeabcdeabcde", chr(c) will throw an OverflowError.
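A minimal repro of the overflow, following only the quoted branch logic (the surrounding charref() context is assumed):

```python
# Reproduces the reported failure: a long hex character reference overflows chr().
name = "xabcdeabcdeabcdeabcde"

if name[0] in ["x", "X"]:
    c = int(name[1:], 16)  # about 2**80, far beyond the Unicode code point range
else:
    c = int(name)

chr(c)  # OverflowError: Python int too large to convert to C int
```

Wrapping the chr(c) call in a try/except, or rejecting values above 0x10FFFF before converting, would avoid the crash.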

For example, training a 1B llama2-architecture model from scratch.
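A rough sketch of what such a from-scratch ~1B llama2-style model could look like with Hugging Face transformers; the hyperparameters are assumptions for illustration, not a reference 1B recipe:

```python
# Build a llama2-style model from a fresh config, i.e. randomly initialized
# for training from scratch. Sizes below are illustrative assumptions.
from transformers import LlamaConfig, LlamaForCausalLM

config = LlamaConfig(
    vocab_size=32000,
    hidden_size=2048,
    intermediate_size=5504,
    num_hidden_layers=22,
    num_attention_heads=16,
    max_position_embeddings=4096,
)
model = LlamaForCausalLM(config)  # no pretrained weights loaded

print(f"{sum(p.numel() for p in model.parameters()) / 1e9:.2f}B parameters")
```

hidden_size, intermediate_size, and num_hidden_layers are the main knobs for hitting a target parameter count around 1B.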

help wanted