oh-my-openagent/docs/troubleshooting/ollama-streaming-issue.md
Doyoon Kwon 895f366a11 docs: add Ollama streaming NDJSON issue guide and workaround (#1197)
2026-01-28 19:01:33 +09:00


Ollama Streaming Issue - JSON Parse Error

Problem

When using Ollama as a provider with oh-my-opencode agents, you may encounter:

JSON Parse error: Unexpected EOF

This occurs when agents attempt tool calls (e.g., explore agent using mcp_grep_search).

Root Cause

Ollama returns NDJSON (newline-delimited JSON) when stream: true is used in API requests:

{"message":{"tool_calls":[{"function":{"name":"read","arguments":{"filePath":"README.md"}}}]}, "done":false}
{"message":{"content":""}, "done":true}

Claude Code SDK expects a single JSON object, not multiple NDJSON lines, causing the parse error.
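The mismatch is easy to reproduce directly: `JSON.parse` accepts a single object but throws on the two-line NDJSON sample above, because a second object follows the newline where only whitespace is allowed. (A minimal sketch; the exact error text varies by JavaScript engine.)

```typescript
// A single JSON object parses cleanly:
const single = '{"message":{"content":""},"done":true}';
console.log(JSON.parse(single).done); // true

// The same parser throws on NDJSON, because a second JSON object
// follows the newline where only whitespace is allowed:
const ndjson =
  '{"message":{"tool_calls":[{"function":{"name":"read","arguments":{"filePath":"README.md"}}}]},"done":false}\n' +
  '{"message":{"content":""},"done":true}';

let threw = false;
try {
  JSON.parse(ndjson);
} catch (e) {
  threw = true; // SyntaxError, surfaced by the SDK as "JSON Parse error"
}
console.log(threw); // true
```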

Why This Happens

  • Ollama API: Returns streaming responses as NDJSON by design
  • Claude Code SDK: Doesn't properly handle NDJSON responses for tool calls
  • oh-my-opencode: Passes through the SDK's behavior (can't fix at this layer)

Solutions

Option 1: Disable Streaming (Recommended)

Configure your Ollama provider to use stream: false:

{
  "provider": "ollama",
  "model": "qwen3-coder",
  "stream": false
}

Pros:

  • Works immediately
  • No code changes needed
  • Simple configuration

Cons:

  • Slower perceived latency (the full response arrives at once instead of streaming)
  • Less interactive feedback

Option 2: Use Non-Tool Agents Only

If you need streaming, avoid agents that use tools:

  • Safe: Simple text generation, non-tool tasks
  • Problematic: Any agent with tool calls (explore, librarian, etc.)

Option 3: Wait for SDK Fix (Long-term)

The proper fix requires Claude Code SDK to:

  1. Detect NDJSON responses
  2. Parse each line separately
  3. Merge tool_calls from multiple lines
  4. Return a single merged response
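Step 1 (detection) can be sketched as a small helper. This is a hypothetical implementation, not the SDK's actual code: it treats a body as NDJSON when it contains two or more newline-separated lines that each parse as standalone JSON.

```typescript
// Hypothetical detection helper: a body is NDJSON when it has two or
// more non-empty lines that each parse as a standalone JSON value.
function isNDJSONResponse(body: string): boolean {
  const lines = body.split('\n').filter((line) => line.trim().length > 0);
  if (lines.length < 2) return false;
  return lines.every((line) => {
    try {
      JSON.parse(line);
      return true;
    } catch {
      return false;
    }
  });
}

console.log(isNDJSONResponse('{"done":false}\n{"done":true}')); // true
console.log(isNDJSONResponse('{"done":true}')); // false
```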

Tracking: https://github.com/code-yeongyu/oh-my-opencode/issues/1124

Workaround Implementation

Until the SDK is fixed, here's how to implement NDJSON parsing (for SDK maintainers):

interface OllamaToolCall {
  function: { name: string; arguments: Record&lt;string, unknown&gt; };
}

interface MergedMessage {
  content: string;
  tool_calls: OllamaToolCall[];
}

function parseOllamaStreamResponse(response: string): MergedMessage {
  const lines = response.split('\n').filter((line) => line.trim());
  const merged: MergedMessage = { content: '', tool_calls: [] };

  for (const line of lines) {
    try {
      const json = JSON.parse(line);
      if (json.message?.tool_calls) {
        merged.tool_calls.push(...json.message.tool_calls);
      }
      if (json.message?.content) {
        // Streaming chunks carry partial content: concatenate, don't overwrite
        merged.content += json.message.content;
      }
    } catch {
      // Skip malformed lines rather than failing the whole response
      console.warn('Skipping malformed NDJSON line:', line);
    }
  }

  return merged;
}

Testing

To verify the fix works:

# Test with curl (should work with stream: false)
curl -s http://localhost:11434/api/chat \
  -d '{
    "model": "qwen3-coder",
    "messages": [{"role": "user", "content": "Read file README.md"}],
    "stream": false,
    "tools": [{"type": "function", "function": {"name": "read", "description": "Read a file", "parameters": {"type": "object", "properties": {"filePath": {"type": "string"}}, "required": ["filePath"]}}}]
  }'
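The same request can be issued from TypeScript. The body below simply mirrors the curl command above (endpoint, model name, and tool schema are the same assumptions); the network call is commented out so the snippet stands alone without a local Ollama server.

```typescript
// Request body mirroring the curl test; with stream: false,
// Ollama returns one JSON object instead of NDJSON.
const chatRequest = {
  model: 'qwen3-coder',
  messages: [{ role: 'user', content: 'Read file README.md' }],
  stream: false,
  tools: [
    {
      type: 'function',
      function: {
        name: 'read',
        description: 'Read a file',
        parameters: {
          type: 'object',
          properties: { filePath: { type: 'string' } },
          required: ['filePath'],
        },
      },
    },
  ],
};

// With a local Ollama running on the default port:
// const res = await fetch('http://localhost:11434/api/chat', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(chatRequest),
// });
// const data = await res.json(); // a single object, parseable in one JSON.parse

console.log(chatRequest.stream); // false
```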

Getting Help

If you encounter this issue:

  1. Check your Ollama provider configuration
  2. Set stream: false as a workaround
  3. Report any additional errors to the issue tracker
  4. Provide your configuration (without secrets) for debugging