
fix(openai): support vLLM >=0.19.1 delta.reasoning field for streaming reasoning content#2191

Open
Zelys-DFKH wants to merge 2 commits into strands-agents:main from Zelys-DFKH:fix/openai-vllm-reasoning-field

Conversation

@Zelys-DFKH
Contributor

Description

vLLM 0.19.1 renamed the streaming reasoning field from delta.reasoning_content to
delta.reasoning. The SDK checked only reasoning_content, so reasoning output was
silently dropped for users on newer vLLM backends.

OpenAIModel._stream() now checks both field names: reasoning_content first (backward
compat with DeepSeek and older vLLM), then reasoning. Whichever is set gets forwarded.
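
The dual-field check can be sketched roughly as follows (illustrative only; the helper name and exact falsiness handling are assumptions, not the SDK's actual code):

```python
def extract_reasoning(delta):
    """Return streaming reasoning text from either field name.

    reasoning_content is checked first for backward compatibility with
    DeepSeek and vLLM < 0.19.1; vLLM >= 0.19.1 populates delta.reasoning.
    """
    reasoning = getattr(delta, "reasoning_content", None)
    if reasoning is None:
        # Fall back to the vLLM >= 0.19.1 field name.
        reasoning = getattr(delta, "reasoning", None)
    return reasoning
```

When both fields are present, reasoning_content wins, which preserves the old behavior for backends that still send it.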

The existing OpenAI mock deltas also needed reasoning=None set explicitly:
unittest.mock.Mock auto-creates any accessed attribute as a truthy child Mock, so
without it the new fallback would have fired on every pre-existing test. Three new
tests cover the delta.reasoning path, the priority behavior when both fields are
present, and the existing delta.reasoning_content path via the updated mocks.
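
A minimal demonstration of why the explicit reasoning=None is needed on the mocks:

```python
from unittest.mock import Mock

# Without an explicit value, Mock auto-creates `reasoning` on first
# access as a new child Mock, and Mock instances are truthy:
delta = Mock(reasoning_content=None)
assert bool(delta.reasoning)  # auto-created attribute would trip the fallback

# Pinning reasoning=None makes the mock behave like a real delta
# object whose reasoning field is absent:
delta = Mock(reasoning_content="step 1...", reasoning=None)
assert delta.reasoning is None
```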

Related Issues

#2182

Documentation PR

No documentation changes needed.

Type of Change

Bug fix

Testing

hatch run prepare passed: 2647 passed, 4 skipped.

  • I ran hatch run prepare

Checklist

  • I have read the CONTRIBUTING document
  • I have added any necessary tests that prove my fix is effective or my feature works
  • I have updated the documentation accordingly
  • I have added an appropriate example to the documentation to outline the feature, or no new docs are needed
  • My changes generate no new warnings
  • Any dependent changes have been merged and published

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

Zelys-DFKH and others added 2 commits April 22, 2026 17:12
…g reasoning content

vLLM 0.19.1 renamed the streaming reasoning field from delta.reasoning_content
to delta.reasoning. The SDK only checked reasoning_content, silently dropping
reasoning output from vLLM backends on the new field name.

Check both field names (reasoning_content first for backward compat with
DeepSeek/older vLLM, then reasoning), so reasoning content is forwarded
regardless of which field the backend populates.

Fixes strands-agents#2182

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
…llback

When both delta.reasoning_content and delta.reasoning are present,
reasoning_content takes priority (backward compat). Assert this explicitly.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>