
How do I make it run? Should I host BAAI/bge-small-en-v1.5 locally?

Open LeoYoungChina opened this issue 1 year ago • 3 comments

```
make run
Running the app...
Starting backend server...
Waiting for the backend to start...
/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/pydantic/_internal/fields.py:151: UserWarning: Field "model_list" has conflict with protected namespace "model".

You may be able to resolve this warning by setting `model_config['protected_namespaces'] = ()`.
  warnings.warn(
[the same warning repeats for "model_name", "model_group_alias", "model_info", and "model_id"]
ERROR:root:
  File "/home/legend/anaconda3/envs/devin/bin/uvicorn", line 8, in <module>
    sys.exit(main())
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/uvicorn/main.py", line 409, in main
    run(
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/uvicorn/main.py", line 575, in run
    server.run()
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/uvicorn/server.py", line 65, in run
    return asyncio.run(self.serve(sockets=sockets))
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/uvicorn/server.py", line 69, in serve
    await self._serve(sockets)
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/uvicorn/server.py", line 76, in _serve
    config.load()
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/uvicorn/config.py", line 433, in load
    self.loaded_app = import_from_string(self.app)
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/uvicorn/importer.py", line 19, in import_from_string
    module = importlib.import_module(module_str)
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1204, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1176, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1147, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/legend/OpenDevin/opendevin/server/listen.py", line 12, in <module>
    import agenthub  # noqa F401 (we import this to get the agents registered)
  File "/home/legend/OpenDevin/agenthub/__init__.py", line 5, in <module>
    from . import monologue_agent  # noqa: E402
  File "/home/legend/OpenDevin/agenthub/monologue_agent/__init__.py", line 2, in <module>
    from .agent import MonologueAgent
  File "/home/legend/OpenDevin/agenthub/monologue_agent/agent.py", line 29, in <module>
    from agenthub.monologue_agent.utils.memory import LongTermMemory
  File "/home/legend/OpenDevin/agenthub/monologue_agent/utils/memory.py", line 38, in <module>
    embed_model = HuggingFaceEmbedding(
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/llama_index/embeddings/huggingface/base.py", line 86, in __init__
    self._model = SentenceTransformer(
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/sentence_transformers/SentenceTransformer.py", line 199, in __init__
    modules = self._load_auto_model(
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/sentence_transformers/SentenceTransformer.py", line 1134, in _load_auto_model
    transformer_model = Transformer(
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/sentence_transformers/models/Transformer.py", line 35, in __init__
    config = AutoConfig.from_pretrained(model_name_or_path, **model_args, cache_dir=cache_dir)
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/transformers/models/auto/configuration_auto.py", line 1138, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/transformers/configuration_utils.py", line 631, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/transformers/configuration_utils.py", line 686, in _get_config_dict
    resolved_config_file = cached_file(
  File "/home/legend/anaconda3/envs/devin/lib/python3.11/site-packages/transformers/utils/hub.py", line 441, in cached_file
    raise EnvironmentError(

ERROR:root:<class 'OSError'>: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like BAAI/bge-small-en-v1.5 is not the path to a directory containing a file named config.json. Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
```
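As an aside, the pydantic `UserWarning`s at the top of the log are harmless, and the message itself names the fix: clear the protected `model_` namespace on the affected models. A minimal sketch of that fix, assuming pydantic v2 (`ModelSpec` here is an illustrative model, not OpenDevin code):

```python
import warnings

from pydantic import BaseModel, ConfigDict

with warnings.catch_warnings():
    warnings.simplefilter("error")  # fail loudly if the warning still fires

    class ModelSpec(BaseModel):
        # Without this line, fields named model_* trigger
        # 'Field "..." has conflict with protected namespace "model"' warnings.
        model_config = ConfigDict(protected_namespaces=())

        model_name: str
        model_info: dict = {}
```

With `protected_namespaces=()` the class defines cleanly, and `ModelSpec(model_name="BAAI/bge-small-en-v1.5")` constructs without any warning.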

LeoYoungChina avatar Apr 09 '24 08:04 LeoYoungChina

error is : We couldn't connect to 'https://huggingface.co/' to load this file, couldn't find it in the cached files and it looks like BAAI/bge-small-en-v1.5 is not the path to a directory containing a file named config.json. Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
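If the model files are already in your local Hugging Face cache, one option (per the offline-mode guide the error message links to) is to force offline mode instead of patching any code. Note this only helps once the files are cached; it won't download anything:

```shell
# Tell transformers / huggingface_hub to use only locally cached files
# and never touch the network:
export TRANSFORMERS_OFFLINE=1
export HF_HUB_OFFLINE=1

# then start the app as usual:
# make run
```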

LeoYoungChina avatar Apr 09 '24 09:04 LeoYoungChina

+1. I actually have problems reaching HF from my corporate network; it could happen to others as well.

andreaspoldi avatar Apr 09 '24 10:04 andreaspoldi

Take a look at this comment https://github.com/OpenDevin/OpenDevin/issues/573#issuecomment-2031923106

Ah, if you can't reach Hugging Face with wget, the contents of that file are this:

```json
{
  "word_embedding_dimension": 384,
  "pooling_mode_cls_token": true,
  "pooling_mode_mean_tokens": false,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false
}
```

Save it as config.json.
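If wget is blocked entirely, you can write the file from Python with only the standard library. In this sketch, `models/bge-small-en-v1.5/1_Pooling` is just an example location: sentence-transformers models keep this pooling config in a `1_Pooling/` subfolder of the model directory, but adjust the path to wherever you store your local copy.

```python
import json
from pathlib import Path

# Hypothetical local directory for an offline copy of BAAI/bge-small-en-v1.5.
model_dir = Path("models/bge-small-en-v1.5/1_Pooling")
model_dir.mkdir(parents=True, exist_ok=True)

# Contents from the comment above (the sentence-transformers Pooling config).
pooling_config = {
    "word_embedding_dimension": 384,
    "pooling_mode_cls_token": True,
    "pooling_mode_mean_tokens": False,
    "pooling_mode_max_tokens": False,
    "pooling_mode_mean_sqrt_len_tokens": False,
}

config_path = model_dir / "config.json"
config_path.write_text(json.dumps(pooling_config, indent=2))

# Sanity-check that the file round-trips.
loaded = json.loads(config_path.read_text())
assert loaded["word_embedding_dimension"] == 384
```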

enyst avatar Apr 09 '24 10:04 enyst

@enyst should we just hard-code that file into the repo somehow?

rbren avatar Apr 09 '24 18:04 rbren

Looks like this is an internal network issue. But it's worth simplifying this Hugging Face download if we can.

rbren avatar Apr 09 '24 19:04 rbren

> Take a look at this comment #573 (comment)
>
> Ah, if you can't reach Hugging Face with wget, the contents of that file are this:
>
> ```json
> {
>   "word_embedding_dimension": 384,
>   "pooling_mode_cls_token": true,
>   "pooling_mode_mean_tokens": false,
>   "pooling_mode_max_tokens": false,
>   "pooling_mode_mean_sqrt_len_tokens": false
> }
> ```
>
> Save it as config.json.

Which folder of the OpenDevin project should I put this config.json file in?

LeoYoungChina avatar Apr 10 '24 06:04 LeoYoungChina

Modify `agenthub/monologue_agent/utils/memory.py`: replace

```python
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

embed_model = HuggingFaceEmbedding(model_name="BAAI/bge-small-en-v1.5")
```

with

```python
from llama_index.core.embeddings import resolve_embed_model

embed_model = resolve_embed_model("local:{{embedding_path}}")
```

(with `{{embedding_path}}` replaced by the path to your local copy of the model), and `make run` works.
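Before pointing `resolve_embed_model` at a directory, it can help to check that the directory actually contains the model's `config.json` (otherwise you get the same OSError as above). A small stdlib-only sketch; `local_embed_spec` is a hypothetical helper, not part of OpenDevin:

```python
from pathlib import Path


def local_embed_spec(model_dir: str) -> str:
    """Build the "local:<path>" spec string for resolve_embed_model,
    after verifying the directory contains a config.json."""
    path = Path(model_dir)
    if not (path / "config.json").is_file():
        raise FileNotFoundError(
            f"{path} does not look like a model directory (no config.json)"
        )
    return f"local:{path}"


# Usage sketch (assumes the model files are already downloaded):
#   from llama_index.core.embeddings import resolve_embed_model
#   embed_model = resolve_embed_model(local_embed_spec("models/bge-small-en-v1.5"))
```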

xanthus-tan avatar Apr 12 '24 09:04 xanthus-tan

> ```python
> from llama_index.core.embeddings import resolve_embed_model
>
> embed_model = resolve_embed_model("local:{{embedding_path}}")
> ```

thx man~!!! up & running

LeoYoungChina avatar Apr 13 '24 16:04 LeoYoungChina