
Matroyshka_reranker requirements not working

Open Shamepoo opened this issue 6 months ago • 3 comments

Requirements Installation Issue

System Information

  • Operating System: Ubuntu 20.04
  • Python Version: 3.10.12
  • Virtual Environment: Yes

Installation Method

# Please paste the exact command you used
# in the FlagEmbedding repository root:
uv pip install -e .
# in FlagEmbedding/research/Matroyshka_reranker:
uv pip install -r requirements.txt

Error Message

Traceback (most recent call last):
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/app.py", line 1, in <module>
    from rank_model import MatroyshkaReranker
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/rank_model.py", line 6, in <module>
    from peft import PeftModel
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/reranker/lib/python3.10/site-packages/peft/__init__.py", line 17, in <module>
    from .auto import (
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/reranker/lib/python3.10/site-packages/peft/auto.py", line 32, in <module>
    from .peft_model import (
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/reranker/lib/python3.10/site-packages/peft/peft_model.py", line 37, in <module>
    from transformers import Cache, DynamicCache, EncoderDecoderCache, PreTrainedModel
ImportError: cannot import name 'EncoderDecoderCache' from 'transformers' (/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/reranker/lib/python3.10/site-packages/transformers/__init__.py)
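This import error usually means the installed transformers is older than what the installed peft expects, since `EncoderDecoderCache` only exists in relatively recent transformers releases. A small diagnostic sketch (not part of FlagEmbedding; just a way to confirm the mismatch) checks whether the installed transformers actually exposes the names peft tries to import:

```python
# Diagnostic sketch: check whether the installed transformers exposes the
# symbols that peft's peft_model.py imports at module load time.
import importlib
import importlib.metadata


def has_symbol(module_name: str, symbol: str) -> bool:
    """True if `symbol` can be imported from `module_name`."""
    try:
        mod = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(mod, symbol)


try:
    print("transformers", importlib.metadata.version("transformers"))
except importlib.metadata.PackageNotFoundError:
    print("transformers not installed")

for name in ("Cache", "DynamicCache", "EncoderDecoderCache", "PreTrainedModel"):
    print(f"{name}: {has_symbol('transformers', name)}")
```

If `EncoderDecoderCache` comes back `False`, the installed peft is too new for the pinned transformers; either upgrade transformers or downgrade peft so the pair matches.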

Additional Notes

Possibly relevant packages: transformers, peft, flash-attn

When I used peft==0.10.0, that issue went away, but then I got:

Traceback (most recent call last):
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/app.py", line 1, in <module>
    from rank_model import MatroyshkaReranker
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/rank_model.py", line 10, in <module>
    from FlagEmbedding.abc.inference import AbsReranker
  File "/data/limite/repos/open-source/FlagEmbedding/FlagEmbedding/__init__.py", line 2, in <module>
    from .inference import *
  File "/data/limite/repos/open-source/FlagEmbedding/FlagEmbedding/inference/__init__.py", line 2, in <module>
    from .auto_reranker import FlagAutoReranker
  File "/data/limite/repos/open-source/FlagEmbedding/FlagEmbedding/inference/auto_reranker.py", line 5, in <module>
    from FlagEmbedding.inference.reranker.model_mapping import (
  File "/data/limite/repos/open-source/FlagEmbedding/FlagEmbedding/inference/reranker/__init__.py", line 1, in <module>
    from .decoder_only import FlagLLMReranker, LayerWiseFlagLLMReranker, LightWeightFlagLLMReranker
  File "/data/limite/repos/open-source/FlagEmbedding/FlagEmbedding/inference/reranker/decoder_only/__init__.py", line 3, in <module>
    from .lightweight import LightweightLLMReranker as LightWeightFlagLLMReranker
  File "/data/limite/repos/open-source/FlagEmbedding/FlagEmbedding/inference/reranker/decoder_only/lightweight.py", line 13, in <module>
    from .models.gemma_model import CostWiseGemmaForCausalLM
  File "/data/limite/repos/open-source/FlagEmbedding/FlagEmbedding/inference/reranker/decoder_only/models/gemma_model.py", line 54, in <module>
    from .gemma_config import CostWiseGemmaConfig
  File "/data/limite/repos/open-source/FlagEmbedding/FlagEmbedding/inference/reranker/decoder_only/models/gemma_config.py", line 24, in <module>
    from transformers.models.gemma2.configuration_gemma2 import Gemma2Config
ModuleNotFoundError: No module named 'transformers.models.gemma2'
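This second error points the opposite way: `transformers.models.gemma2` only ships with newer transformers releases, so the downgrade that satisfied peft==0.10.0 removed it. A small probe (an illustrative sketch, not from the repo) makes the check explicit:

```python
# Illustrative probe: FlagEmbedding's gemma_config.py imports
# transformers.models.gemma2, which only newer transformers releases include.
import importlib.util


def module_available(name: str) -> bool:
    """True if a (possibly dotted) module name resolves to an importable module."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # A missing parent package raises instead of returning None.
        return False


print("gemma2 available:", module_available("transformers.models.gemma2"))
```

Taken together with the first traceback, the two errors bracket the problem: the environment needs a transformers version new enough for gemma2 but paired with a compatible peft, rather than the old peft pin.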

Shamepoo avatar May 27 '25 03:05 Shamepoo

Thank you for providing the information. We have updated the GitHub repository, and you can resolve the issue by running pip install again.

545999961 avatar May 28 '25 07:05 545999961

> Thank you for providing the information. We have updated the GitHub repository, and you can resolve the issue by running pip install again.

Traceback (most recent call last):
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/app.py", line 7, in <module>
    reranker = MatroyshkaReranker(
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/rank_model.py", line 96, in __init__
    self.tokenizer = AutoTokenizer.from_pretrained(
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/reranker/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 880, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/reranker/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2110, in from_pretrained
    return cls._from_pretrained(
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/reranker/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2336, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/reranker/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama_fast.py", line 156, in __init__
    super().__init__(
  File "/data/limite/repos/open-source/FlagEmbedding/research/Matroyshka_reranker/inference/reranker/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 114, in __init__
    fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
Exception: data did not match any variant of untagged enum ModelWrapper at line 268062 column 3
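For reference, this exception is raised by the Rust tokenizers library when it cannot parse a tokenizer.json serialized by a newer version than the one installed (my reading of the error; the thread does not confirm it). A hypothetical diagnostic for inspecting the file's declared format version:

```python
# Hypothetical diagnostic: tokenizer.json is parsed by the Rust `tokenizers`
# library, and "data did not match any variant of untagged enum ModelWrapper"
# typically means the file was written by a newer serializer than the
# installed parser understands.
import json


def tokenizer_json_version(path: str) -> str:
    """Return the top-level "version" field of a tokenizer.json file."""
    with open(path, encoding="utf-8") as f:
        return json.load(f).get("version", "unknown")
```

Comparing this against the installed `tokenizers` version (e.g. via `pip show tokenizers`) shows the mismatch; upgrading `tokenizers` alongside `transformers` usually clears it.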

Thanks for your response. I got this after updating the env.

Shamepoo avatar May 28 '25 07:05 Shamepoo

Hello, we apologize for the issue above. We have updated the code again; please pull the latest changes and try again.

545999961 avatar May 28 '25 08:05 545999961