LMOps
No module named 'transformers.models.qwen_parallel.utils_qwen'
When running
python3 tools/convert_mp.py \
--input_path meta-llama/Llama-2-7b-hf \
--source_mp_size 1 \
--target_mp_size 4 \
--model_type llama2 # choose from opt and llama
I get the following error:
Traceback (most recent call last):
File "/home/LMOps/minillm/tools/convert_mp.py", line 6, in <module>
from transformers import (
File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
File "/home/LMOps/minillm/transformers/src/transformers/utils/import_utils.py", line 1344, in __getattr__
value = getattr(module, name)
File "/home/LMOps/minillm/transformers/src/transformers/utils/import_utils.py", line 1343, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/home/LMOps/minillm/transformers/src/transformers/utils/import_utils.py", line 1355, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.models.qwen_parallel.utils_qwen because of the following error (look up to see its traceback):
No module named 'transformers.models.qwen_parallel.utils_qwen'
There is no utils_qwen.py file in minillm/transformers/models/qwen_parallel.
You can comment out lines 11–20:
https://github.com/microsoft/LMOps/blob/daf972124f0699af18acee85473fece80fb405c2/minillm/tools/convert_mp.py#L11-L20
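As an alternative to deleting the lines, you could guard the import so the script still runs for OPT/LLaMA conversion when the Qwen module is absent. This is only a sketch, not code from the repo: the module path is taken from the traceback above, and the `HAS_QWEN` flag is a hypothetical name.

```python
# Sketch: guard the Qwen-parallel import in convert_mp.py so the script still
# works for OPT / LLaMA when utils_qwen is missing from the bundled fork.
# HAS_QWEN is a hypothetical flag; check it before any Qwen-specific branch.
try:
    from transformers.models.qwen_parallel.utils_qwen import *  # noqa: F401,F403
    HAS_QWEN = True
except (ImportError, RuntimeError):
    # transformers' lazy-import machinery re-raises missing modules
    # as RuntimeError, so catch both.
    HAS_QWEN = False
```

With this guard, converting a Qwen model can raise a clear message (e.g. "utils_qwen not found") instead of failing at import time for every model type.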
Well, if I want to use a Qwen model, where can I find this file?
I don’t know either, I’m not the original author.