Issue importing models in Jupyter notebooks: 'No module named transformers.models.ipynb_checkpoints'
System Info
The following error comes up: `ModuleNotFoundError: No module named 'transformers.models.ipynb_checkpoints'`
Packages: ipykernel==6.29.5
- `transformers` version: 4.52.4
- Platform: Linux-5.10.226-214.880.amzn2.x86_64-x86_64-with-glibc2.39
- Python version: 3.10.16
- Huggingface_hub version: 0.31.4
- Safetensors version: 0.5.3
- Accelerate version: 1.7.0
- Accelerate config: not found
- DeepSpeed version: not installed
- PyTorch version (GPU?): 2.6.0+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: 
- Using GPU in script?: 
- GPU type: Tesla T4
I'm running the code in a Jupyter notebook inside JupyterLab.
The code I'm running is as follows:
```python
from transformers import AutoTokenizer, AutoModel, BitsAndBytesConfig

tokenizer = AutoTokenizer.from_pretrained('Qwen/Qwen3-Embedding-4B', padding_side='left')
model = AutoModel.from_pretrained('Qwen/Qwen3-Embedding-4B')
```
Please let me know if there's any other information that'd be useful.
The whole error that comes up is:
```
ModuleNotFoundError                       Traceback (most recent call last)
File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2045, in _LazyModule.__getattr__(self, name)
   2044 try:
-> 2045     module = self._get_module(self._class_to_module[name])
   2046     value = getattr(module, name)

File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2075, in _LazyModule._get_module(self, module_name)
   2074 except Exception as e:
-> 2075     raise e

File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2073, in _LazyModule._get_module(self, module_name)
   2072 try:
-> 2073     return importlib.import_module("." + module_name, self.__name__)
   2074 except Exception as e:

File /usr/local/lib/python3.10/importlib/__init__.py:126, in import_module(name, package)
    125     level += 1
--> 126 return _bootstrap._gcd_import(name[level:], package, level)

[... intermediate importlib frames truncated ...]

ModuleNotFoundError: No module named 'transformers.models.ipynb_checkpoints'

The above exception was the direct cause of the following exception:

ModuleNotFoundError                       Traceback (most recent call last)
Cell In[2], line 2
      1 tokenizer = AutoTokenizer.from_pretrained('Qwen/Qwen3-Embedding-4B', padding_side='left')
----> 2 model = AutoModel.from_pretrained('Qwen/Qwen3-Embedding-4B')
      3 # ,
      4 # quantization_config = bnb_config,
      5 # device_map = "auto")
      6
      7 # We recommend enabling flash_attention_2 for better acceleration and memory saving.
      8 # model = AutoModel.from_pretrained('Qwen/Qwen3-Embedding-8B', attn_implementation="flash_attention_2", torch_dtype=torch.float16).cuda()

File ~/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:568, in _BaseAutoModelClass.from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    564     return model_class.from_pretrained(
    565         pretrained_model_name_or_path, *model_args, config=config, **hub_kwargs, **kwargs
    566     )
    567 elif type(config) in cls._model_mapping.keys():
--> 568     model_class = _get_model_class(config, cls._model_mapping)
    569     if model_class.config_class == config.sub_configs.get("text_config", None):
    570         config = config.get_text_config()

File ~/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:388, in _get_model_class(config, model_mapping)
    387 def _get_model_class(config, model_mapping):
--> 388     supported_models = model_mapping[type(config)]
    389     if not isinstance(supported_models, (list, tuple)):
    390         return supported_models

File ~/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:774, in _LazyAutoMapping.__getitem__(self, key)
    772 if model_type in self._model_mapping:
    773     model_name = self._model_mapping[model_type]
--> 774     return self._load_attr_from_module(model_type, model_name)
    776 # Maybe there was several model types associated with this config.
    777 model_types = [k for k, v in self._config_mapping.items() if v == key.__name__]

File ~/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:788, in _LazyAutoMapping._load_attr_from_module(self, model_type, attr)
    786 if module_name not in self._modules:
    787     self._modules[module_name] = importlib.import_module(f".{module_name}", "transformers.models")
--> 788 return getattribute_from_module(self._modules[module_name], attr)

File ~/.local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:700, in getattribute_from_module(module, attr)
    698 if isinstance(attr, tuple):
    699     return tuple(getattribute_from_module(module, a) for a in attr)
--> 700 if hasattr(module, attr):
    701     return getattr(module, attr)
    702 # Some of the mappings have entries model_type -> object of another model type. In that case we try to grab the
    703 # object at the top level.

File ~/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py:2048, in _LazyModule.__getattr__(self, name)
   2046     value = getattr(module, name)
   2047 except (ModuleNotFoundError, RuntimeError) as e:
-> 2048     raise ModuleNotFoundError(
   2049         f"Could not import module '{name}'. Are this object's requirements defined correctly?"
   2050     ) from e
   2052 elif name in self._modules:
   2053     try:

ModuleNotFoundError: Could not import module 'Qwen3Model'. Are this object's requirements defined correctly?
```
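Reading the traceback: `AutoModel.from_pretrained` resolves `Qwen3Model` lazily, `_LazyAutoMapping._load_attr_from_module` imports the model's submodule under `transformers.models`, and somewhere along that chain Python ends up trying to import `transformers.models.ipynb_checkpoints`, which matches the name Jupyter uses for its checkpoint folders. My guess (not verified) is that a stray `.ipynb_checkpoints` directory got created inside the installed package, e.g. by browsing or editing its files from JupyterLab. A quick check, sketched below; the path handling is my assumption, not something from the transformers docs:

```python
# Sketch: look for a stray Jupyter checkpoint folder inside the installed
# transformers package, which would explain the module name in the error.
# Assumption: the installed package lives wherever transformers.__file__ points.
import os
import transformers

models_dir = os.path.join(os.path.dirname(transformers.__file__), "models")
for entry in sorted(os.listdir(models_dir)):
    if "ipynb_checkpoints" in entry:
        print("suspicious entry:", os.path.join(models_dir, entry))
```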
Who can help?
@ArthurZucker This may be relevant to you, apologies if not.
Information
- [ ] The official example scripts
- [ ] My own modified scripts
Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
Reproduction
```python
from transformers import AutoTokenizer, AutoModel, BitsAndBytesConfig

tokenizer = AutoTokenizer.from_pretrained('Qwen/Qwen3-Embedding-4B', padding_side='left')
model = AutoModel.from_pretrained('Qwen/Qwen3-Embedding-4B')
```
Expected behavior
The model should load without errors.
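If a stray checkpoint folder does turn out to be the cause, here is an untested cleanup sketch; the directory names are assumptions based on the error message, and `pip install --force-reinstall transformers` should have the same effect:

```python
# Untested sketch: remove the stray checkpoint directory so the lazy import
# machinery no longer treats it as a model module, then restart the kernel.
import os
import shutil
import transformers

models_dir = os.path.join(os.path.dirname(transformers.__file__), "models")
for name in (".ipynb_checkpoints", "ipynb_checkpoints"):
    stray = os.path.join(models_dir, name)
    if os.path.isdir(stray):
        shutil.rmtree(stray)
        print("removed", stray)
```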