ValueError: shapes (768,) and (1536,) not aligned: 768 (dim 0) != 1536 (dim 0)
When I try to query with a Hugging Face embedding model I receive an error message:
```python
from langchain.embeddings.huggingface import HuggingFaceEmbeddings
from llama_index import GPTSimpleVectorIndex, LangchainEmbedding

embed_model = LangchainEmbedding(HuggingFaceEmbeddings())

# load index
new_index = GPTSimpleVectorIndex.load_from_disk(
    'indexgpt.json',
    embed_model=embed_model,
)

context_str = "Can you give code examples - How to customize embedding??"

# query with embed_model specified
response = new_index.query(
    context_str,
    mode="default",
    verbose=True,
)
print(response)
```
Full error text:
```
response = new_index.query(
  File "/Users/fabiosviatowski/Documents/CODIGO_GERAL/ServiceEnv/lib/python3.9/site-packages/llama_index/indices/base.py", line 417, in query
    return query_runner.query(query_str, self._index_struct)
  File "/Users/fabiosviatowski/Documents/CODIGO_GERAL/ServiceEnv/lib/python3.9/site-packages/llama_index/indices/query/query_runner.py", line 149, in query
    return query_obj.query(query_bundle)
  File "/Users/fabiosviatowski/Documents/CODIGO_GERAL/ServiceEnv/lib/python3.9/site-packages/llama_index/token_counter/token_counter.py", line 84, in wrapped_llm_predict
    f_return_val = f(_self, *args, **kwargs)
  File "/Users/fabiosviatowski/Documents/CODIGO_GERAL/ServiceEnv/lib/python3.9/site-packages/llama_index/indices/query/base.py", line 383, in query
    response = self._query(query_bundle)
  File "/Users/fabiosviatowski/Documents/CODIGO_GERAL/ServiceEnv/lib/python3.9/site-packages/llama_index/indices/query/base.py", line 353, in _query
    tuples = self.get_nodes_and_similarities_for_response(query_bundle)
  File "/Users/fabiosviatowski/Documents/CODIGO_GERAL/ServiceEnv/lib/python3.9/site-packages/llama_index/indices/query/base.py", line 277, in get_nodes_and_similarities_for_response
    nodes = self._get_nodes_for_response(
  File "/Users/fabiosviatowski/Documents/CODIGO_GERAL/ServiceEnv/lib/python3.9/site-packages/llama_index/indices/query/vector_store/base.py", line 49, in _get_nodes_for_response
    query_result = self._vector_store.query(
  File "/Users/fabiosviatowski/Documents/CODIGO_GERAL/ServiceEnv/lib/python3.9/site-packages/llama_index/vector_stores/simple.py", line 104, in query
    top_similarities, top_ids = get_top_k_embeddings(
  File "/Users/fabiosviatowski/Documents/CODIGO_GERAL/ServiceEnv/lib/python3.9/site-packages/llama_index/indices/query/embedding_utils.py", line 25, in get_top_k_embeddings
    similarity = similarity_fn(query_embedding, emb)
  File "/Users/fabiosviatowski/Documents/CODIGO_GERAL/ServiceEnv/lib/python3.9/site-packages/llama_index/embeddings/base.py", line 43, in similarity
    product = np.dot(embedding1, embedding2)
  File "<__array_function__ internals>", line 200, in dot
ValueError: shapes (768,) and (1536,) not aligned: 768 (dim 0) != 1536 (dim 0)
```
@fabioaurelios123 I see you are loading a vector index from disk. Are the embeddings in that saved index also calculated with Hugging Face?
I think there is a conflict here: the Hugging Face embeddings (768 dimensions) are a different length than the OpenAI embeddings (1536 dimensions), so the similarity calculation fails. Try recreating the index with the Hugging Face embed_model, as sketched below.
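A minimal sketch of that rebuild, assuming the original source documents are still available in a local `data/` folder (that path and the reader choice are illustrative, and the API shown matches the pre-0.5 llama_index version visible in the traceback):

```python
from langchain.embeddings.huggingface import HuggingFaceEmbeddings
from llama_index import GPTSimpleVectorIndex, LangchainEmbedding, SimpleDirectoryReader

# same 768-dim Hugging Face embedding model that will be used at query time
embed_model = LangchainEmbedding(HuggingFaceEmbeddings())

# rebuild the index from the source documents with the Hugging Face embed_model,
# so the stored node embeddings are 768-dim instead of OpenAI's 1536-dim
documents = SimpleDirectoryReader('data').load_data()  # 'data' is a placeholder path
index = GPTSimpleVectorIndex(documents, embed_model=embed_model)
index.save_to_disk('indexgpt.json')

# from now on, always load and query with the same embed_model
new_index = GPTSimpleVectorIndex.load_from_disk('indexgpt.json', embed_model=embed_model)
response = new_index.query("Can you give code examples - How to customize embedding??")
print(response)
```

The key point is that the same embedding model has to be used both when the index is built and when it is queried; otherwise the dot product compares vectors of different lengths, which is exactly the `shapes (768,) and (1536,) not aligned` error above.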
Yes, exactly what @logan-markewich said! @fabioaurelios123 feel free to reopen the issue / create other issues, but closing this for now.
Right.