The research step throws an error: parsed json with extra tokens: {}
I have verified my Tavily API key and it is valid.
Same error here, and I cannot proceed. What's the solution?
Sorry, I don't have time to look into this at the moment. If you make any progress, please update here. Thanks.
The issue is that the model outputs extra text after the valid JSON. In my case the system prompt was, I think, too complex for a 7B model to follow reliably; prompting directly for the specific structure did the trick.
Also, this function: https://github.com/bytedance/deer-flow/blob/3ed70e11d512718298a307c4c96673071a9dd1ae/web/src/core/utils/json.ts#L1-L17
will throw that error any time the model outputs extra text after the JSON. I replaced it with:
import { parse } from "best-effort-json-parser"; // the lenient parser the original json.ts imports

export function parseJSON<T>(json: string | null | undefined, fallback: T): T {
  if (!json) return fallback;
  try {
    // Strip a surrounding markdown code fence, if any.
    const raw = json
      .trim()
      .replace(/^```json\s*/, "")
      .replace(/^```\s*/, "")
      .replace(/\s*```$/, "");
    // Extract only the first JSON-like object/array, dropping trailing chatter.
    const match = raw.match(/{[\s\S]*}|\[[\s\S]*\]/); // crude JSON matcher
    const jsonFragment = match ? match[0] : raw;
    const result = parse(jsonFragment) as T;
    // Only count it as a success if we actually got structured data.
    if (typeof result === "object" && result !== null) {
      return result;
    }
    return fallback;
  } catch {
    return fallback;
  }
}
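For a quick standalone check of the extraction logic, here is a self-contained sketch that swaps in plain `JSON.parse` for the package's lenient `parse`; the function name and example input are made up for illustration:

```typescript
// Standalone sketch of the tolerant parse: strip markdown fences, grab the
// first JSON-looking object or array, and fall back on any failure.
// Uses JSON.parse instead of the lenient parser the repo file imports.
function parseJSONSketch<T>(json: string | null | undefined, fallback: T): T {
  if (!json) return fallback;
  try {
    const raw = json
      .trim()
      .replace(/^```json\s*/, "")
      .replace(/^```\s*/, "")
      .replace(/\s*```$/, "");
    const match = raw.match(/{[\s\S]*}|\[[\s\S]*\]/); // crude JSON matcher
    const result = JSON.parse(match ? match[0] : raw) as T;
    return typeof result === "object" && result !== null ? result : fallback;
  } catch {
    return fallback;
  }
}

// A fenced JSON object followed by extra model chatter still parses:
const plan = parseJSONSketch<{ locale?: string }>(
  '```json\n{"locale":"en-US"}\n```\nSure, here is the plan!',
  {},
);
console.log(plan.locale); // "en-US"
```

Note that the greedy regex grabs from the first `{` to the last `}`, so trailing chatter that itself contains a `}` would still confuse it; the best-effort parser in the real file is more forgiving about that.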
So is there a solution for this? Why do we get this error while it seems fine for others?
Using a large enough model that follows the instructions accurately, I think.
This happens whenever I switch to any OpenAI model: it hangs at the current position.
But when I removed web_search_tool, it worked fine. I also ran the web_search_tool script on its own and it still worked fine. I guess it's an environment problem, but I have already configured the proxy. A bit confused...
I am using a Mac mini with an M4. What should I do to make this work? Please advise from the beginning. Thanks, all.
I am not sure whether the issue discussed in this thread is related to the malformed JSON produced by the planner node, but forcing the sampler to choose tokens that are consistent with the desired schema might help here as well. See a potential fix: https://github.com/bytedance/deer-flow/issues/151#issuecomment-2969152465
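For OpenAI-compatible backends, one way to get schema-consistent tokens is the structured-outputs `response_format`. Below is a hedged sketch of just the request payload; `research_plan` and its fields are made-up stand-ins for the planner's real output schema:

```typescript
// Sketch: a response_format payload that constrains an OpenAI chat completion
// to a JSON schema, so the sampler cannot emit tokens outside the schema.
// The schema below is an illustrative stand-in, not deer-flow's actual one.
const plannerResponseFormat = {
  type: "json_schema",
  json_schema: {
    name: "research_plan",
    strict: true,
    schema: {
      type: "object",
      properties: {
        locale: { type: "string" },
        steps: { type: "array", items: { type: "string" } },
      },
      required: ["locale", "steps"],
      additionalProperties: false,
    },
  },
} as const;

// Passed as `response_format` in the request, e.g.:
// await client.chat.completions.create({ model, messages, response_format: plannerResponseFormat });
console.log(plannerResponseFormat.json_schema.name); // "research_plan"
```

With `strict: true`, the model's output is guaranteed to parse as an object matching the schema, which would make the extra-tokens error from `parseJSON` moot for those backends.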