xpilasneo4j
Using the GenAI stack from Docker, with my Ollama set up on **Windows**, I tried to run the stack and got this message ``` genai-stack-pull-model-1 | pulling ollama model...
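For reference, a minimal sketch of the connectivity check I ran before starting the stack, assuming a default native Windows install of Ollama on port 11434 (from inside a container the URL would instead be http://host.docker.internal:11434; both values are assumptions, not part of the stack itself):
```
# Hypothetical check: is the native Windows Ollama reachable and what models
# does it already have locally, before the stack tries to pull one?
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # assumption: default native install

with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
    tags = json.load(resp)

print([m["name"] for m in tags.get("models", [])])
```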
I tried to run the docker compose on AWS and got an error due to Google credentials. Is there a plan to make the demo work on any cloud?
I created an S3 bucket in the Field-Engineering-Pro-Services AWS account, but using the AWS access and secret keys I can't connect the website to my bucket s3://nasa-lessons-learned-files
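For what it's worth, here is a minimal sketch of the check I would expect to pass if the keys are valid for that bucket (assuming boto3 and the same access/secret key pair the website uses; the key values below are placeholders):
```
# Hypothetical credential check against the bucket, using boto3 directly.
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="<AWS_ACCESS_KEY_ID>",          # placeholder: same keys as the website
    aws_secret_access_key="<AWS_SECRET_ACCESS_KEY>",  # placeholder
)

# If the keys and bucket policy are fine, this lists a few objects;
# otherwise it raises a ClientError (AccessDenied, InvalidAccessKeyId, ...).
resp = s3.list_objects_v2(Bucket="nasa-lessons-learned-files", MaxKeys=5)
for obj in resp.get("Contents", []):
    print(obj["Key"])
```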
I am writing some code for various LLMs; it works with AzureOpenAI and OpenAI but not with VertexAI. Here is my code ``` res_pipeline = await define_and_run_pipeline(llm, file_path) await llm.async_client.close() return res_pipeline...
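As a side note, the only provider-specific part of that snippet is the `async_client.close()` call. A hedged workaround is to guard it, since not every LLM wrapper necessarily exposes that attribute (this is an assumption about the failure, not a confirmed cause):
```
# Hypothetical guard: only close an async client if the LLM wrapper has one.
# define_and_run_pipeline is the coroutine from the snippet above.
async def run_pipeline_for(llm, file_path):
    res_pipeline = await define_and_run_pipeline(llm, file_path)
    async_client = getattr(llm, "async_client", None)
    if async_client is not None:
        await async_client.close()
    return res_pipeline
```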
Similar to https://github.com/neo4j/neo4j-graphrag-python/issues/376, I get an exception about the return type when loading data using VertexAI: `neo4j_graphrag.exceptions.SchemaExtractionError: LLM response is not valid JSON.` I fixed it by adding in line...
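For context, the change is along these lines (a sketch only, assuming the `VertexAILLM` wrapper accepts a `generation_config` and that asking Gemini for `application/json` output is enough for the schema extractor to parse the response):
```
# Hypothetical configuration: ask the Vertex AI model to return JSON so the
# schema extraction step can parse its response.
from neo4j_graphrag.llm import VertexAILLM
from vertexai.generative_models import GenerationConfig

generation_config = GenerationConfig(
    temperature=0.0,
    response_mime_type="application/json",  # assumption: forces valid JSON output
)
llm = VertexAILLM(
    model_name="gemini-1.5-flash-001",      # assumption: any Gemini model name
    generation_config=generation_config,
)
```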
It seems it created a LOCAL_DATE and that type isn't recognized. I reran it on the same document and it worked. Not urgent, but worth checking ``` ----------- Start C:\Users\XavierPilas\Documents\GitHub\work-rag-graphrag\Small10k\extracted_content-2\page_0065.txt Traceback...
When trying to pass the documented parameter _dimensions_ to the constructor, I get an error ``` --------------------------------------------------------------------------- TypeError Traceback (most recent call last) Cell In[25], line 1 ----> 1 embeddings2048=AzureOpenAIEmbeddings(model=config.get('Neo4j','azure_open_ai_emb_model'),...
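For comparison, with the plain Azure OpenAI SDK the `dimensions` argument goes on the embeddings call rather than on the client constructor, which is the behaviour I expected the wrapper to mirror (a sketch only; endpoint, key and model names are placeholders):
```
# Hypothetical comparison using the openai SDK directly: `dimensions` is an
# argument of embeddings.create, not of the client constructor.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<resource>.openai.azure.com",  # placeholder
    api_key="<AZURE_OPENAI_API_KEY>",                       # placeholder
    api_version="2024-02-01",
)

resp = client.embeddings.create(
    model="text-embedding-3-large",  # assumption: a model that supports `dimensions`
    input="hello world",
    dimensions=2048,
)
print(len(resp.data[0].embedding))  # expected: 2048
```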