cortex
Quickstart setup step 4 (not working)
Discussed in https://github.com/janhq/cortex/discussions/235
Originally posted by zaynpatel on December 4, 2023

I'm currently running the following command, copied from the documentation with one change: a different localhost address.
```sh
curl http://localhost:3000/inferences/llamacpp/loadmodel \
  -H 'Content-Type: application/json' \
  -d '{
    "llama_model_path": "/model/model/llama-2-7b-model.gguf",
    "ctx_len": 512,
    "ngl": 100
  }'
```
The server responds with a 404, indicating that `/inferences/llamacpp/loadmodel` is not an available route:

```json
{"url":"/inferences/llamacpp/loadmodel","statusCode":404,"statusMessage":"Cannot find any route matching /inferences/llamacpp/loadmodel.","message":"Cannot find any route matching /inferences/llamacpp/loadmodel."}
```
How should I proceed, and how can I test which load-model routes the server actually exposes?
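One way to tell a wrong-path error apart from a failed model load while probing candidate routes is to inspect the JSON error body: a routing 404 like the one above echoes back the unmatched path in `url` and says "Cannot find any route matching ...", whereas a route that exists but fails would return a different status or message. A minimal sketch that parses the response body quoted above (the field names come from that response, not from any cortex documentation):

```python
import json

# The 404 body returned by the server, copied verbatim from the response above.
body = (
    '{"url":"/inferences/llamacpp/loadmodel","statusCode":404,'
    '"statusMessage":"Cannot find any route matching '
    '/inferences/llamacpp/loadmodel.",'
    '"message":"Cannot find any route matching '
    '/inferences/llamacpp/loadmodel."}'
)

err = json.loads(body)

# A routing 404 names the unmatched path; a model-load failure would not
# report "Cannot find any route matching ..." for the request URL.
is_routing_404 = (
    err["statusCode"] == 404
    and err["message"].startswith("Cannot find any route matching")
)
print(err["url"], is_routing_404)
```

A response like this usually means either the path differs in the server version you are running, or something other than the expected inference server is bound to port 3000, so it is worth confirming which process is listening there before trying other paths.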