Continue can't recognize the JSON returned by my local server
I downloaded CodeLlama-7B and configured Continue's config.json like this:
{
  "title": "LocalServer",
  "provider": "openai",
  "model": "codellama-7b-Instruct",
  "apiBase": "http://localhost:8000/v1/"
}
Then I ran llamacpp_mock_api.py. CodeLlama runs correctly on my machine: it receives the POST JSON from Continue and generates the LLM response as expected. But when I return the JSON, Continue can't recognize the format and shows an empty result. How can I find out what JSON format Continue expects? I see the code prepends "onesix" to the JSON, and I can't find any definition of the JSON format in Continue's docs. Is it possible that the Continue plugin updated the format? The current JSON-generating code is:
"onesix" + jsonify({"choices": [{"delta": {"role": "assistant", "content": response}}]}).get_data(as_text=True)
How can I generate JSON that Continue will display correctly?
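For reference, my understanding is that the standard OpenAI streaming API (which the "openai" provider presumably speaks) frames each chunk as a server-sent event: a line prefixed with "data: ", then the JSON chunk object, then a blank line, with a final "data: [DONE]" sentinel. Here is a sketch of what I would try instead of the "onesix" prefix; the function names and the "chatcmpl-mock" id are my own, not from Continue's docs:

```python
import json

def format_sse_chunk(content: str) -> str:
    # One streaming chunk in the OpenAI chat-completions format,
    # wrapped in SSE framing ("data: " prefix + blank line).
    payload = {
        "id": "chatcmpl-mock",  # hypothetical id, not required by the spec to be unique here
        "object": "chat.completion.chunk",
        "choices": [
            {
                "index": 0,
                "delta": {"role": "assistant", "content": content},
                "finish_reason": None,
            }
        ],
    }
    return "data: " + json.dumps(payload) + "\n\n"

def end_of_stream() -> str:
    # The OpenAI streaming API terminates the stream with a literal [DONE] sentinel.
    return "data: [DONE]\n\n"
```

In the mock server, the Flask route would then yield `format_sse_chunk(response)` followed by `end_of_stream()` from a streamed response body. Is this the format the current Continue plugin expects?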