bug: Ollama yields tiny context and pure hallucination.
Version
v1.34.3
Describe the bug
Cody normally works fine. However, I recently started using Ollama to run local models, in particular DeepSeek Coder V2 Lite (q4), and with it the context window is ridiculously short and the output is pure hallucination.
It only lets me include the first 172 lines of one file. A word counter says that's 597 words. Even accounting for some long "words" (like a couple of regex statements), this is clearly no more than a couple thousand tokens. DeepSeek Coder V2 Lite supports a 128k-token context. Are you not setting the context size in the API, and just using the default?
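For what it's worth, this would be consistent with requests going out without an explicit context length: Ollama then falls back to its default `num_ctx` (2048 tokens at the time of writing), which matches the "couple thousand tokens" ceiling above. As a workaround sketch, the context window can be raised by deriving a model with a Modelfile (the model tag below is an assumption; substitute whatever `ollama list` actually shows):

```
# Modelfile — derive a 128k-context variant of the local model
# (model tag is an assumption; check `ollama list` for the exact name)
FROM deepseek-coder-v2:16b-lite-instruct-q4_0
PARAMETER num_ctx 131072
```

Build it with `ollama create deepseek-coder-v2-128k -f Modelfile` and point Cody at the new tag. The same setting can also be passed per request via `"options": {"num_ctx": 131072}` in the body of `POST /api/generate`.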
Unlike with e.g. Claude, what I get back has absolutely nothing to do with the provided context. Presented with the first 100 lines of a Python mapping program and the instruction "Add debug", it writes (translations in brackets are mine):
```
deepseek coder v2:latest <|begin▁of▁sentence|># 1. 题目 [Problem] Given a binary tree, find its minimum depth. The minimum depth is the number of nodes along the shortest path from the root node down to the nearest leaf node. Note: A leaf is a node with no children.
Example:
Given binary tree [3,9,20,null,null,15,7],
    3
   / \
  9  20
    /  \
   15   7
return its minimum depth = 2.
2. 解答 [Solution]
2.1 思路 [Approach: This problem asks for the minimum depth of a binary tree. We can solve it with recursion. For each node, we check the depths of its left and right subtrees and take the smaller value plus 1 (the current node counts as one level). If a node has no left or right subtree, we consider only the side that has a subtree.]
2.2 代码实现 [Code implementation]
# Definition for a binary tree node.
class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def minDepth(root: TreeNode) -> int:
    if not root:
        return 0
    # If the node is a leaf, its depth is 1
    if not root.left and not root.right:
        return 1
    # Initialize the minimum depth to a large number
    min_depth = float('inf')
```
It should go without saying that this has nothing to do with the provided code (which does not involve binary trees and is not in Chinese), nor with the instruction to add debug output.
Expected behavior
Cody should allow use of the model's full 128k context window, and the outputs should at least remotely pertain to the task.
Additional context
No response