Recover from the `transformers` 4.34 refactor
This PR is for #154.
I believe `PreTrainedTokenizer.__init__` (invoked via `super().__init__(**kwargs)`) attempts to use `get_vocab`, which relies on the LLM, but `self._llm = llm` has not been set yet at that point, so it cannot access the LLM.
So I moved the `super().__init__(**kwargs)` call down below the assignment, and it works now.
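
For reference, a minimal sketch of the ordering fix (the class name and the `get_vocab` body are illustrative placeholders, not the actual ctransformers code):

```python
from transformers import PreTrainedTokenizer


class CTransformersTokenizer(PreTrainedTokenizer):  # illustrative name
    def __init__(self, llm, **kwargs):
        # Assign the backing LLM *before* calling the parent constructor:
        # since transformers 4.34, PreTrainedTokenizer.__init__ itself
        # calls get_vocab(), which needs self._llm.
        self._llm = llm
        super().__init__(**kwargs)  # moved below the assignment

    def get_vocab(self):
        # Placeholder delegation to the underlying LLM; this is the call
        # that fails if self._llm has not been set yet.
        return {self._llm.detokenize([i]): i for i in range(self._llm.vocab_size)}
```

Before 4.34 this ordering did not matter, because `get_vocab` was only called after construction.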
*force-pushed to change author