nicho2
I want to use Ollama with the nomic-embed-text model for embeddings from AnythingLLM.
OK, I understand what to do now. I was misled by the command given on the Ollama site: ollama run nomic-embed-text
It's not necessary to run it (just pull the model and :
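A minimal sketch of the flow described above, assuming a local Ollama install: an embedding model only needs to be pulled, not run interactively, before a client such as AnythingLLM can call it.

```shell
# Download the embedding model; no "ollama run" session is needed for embeddings
ollama pull nomic-embed-text
```

After the pull, pointing AnythingLLM's embedding provider at the local Ollama endpoint should be enough for it to use the model.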
Same thing with zeno_visualize.py:

    model_args = re.sub(
        "/|=|:",
        "__",
        json.load(
            open(Path(args.data_path, model, "results.json"), encoding="utf-8")
        )["config"]["model_args"],
    )