fix(openai): support vLLM >=0.19.1 delta.reasoning field for streaming reasoning content#2191
Open
Zelys-DFKH wants to merge 2 commits into strands-agents:main from
Conversation
…g reasoning content vLLM 0.19.1 renamed the streaming reasoning field from delta.reasoning_content to delta.reasoning. The SDK only checked reasoning_content, silently dropping reasoning output from vLLM backends on the new field name. Check both field names (reasoning_content first for backward compat with DeepSeek/older vLLM, then reasoning), so reasoning content is forwarded regardless of which field the backend populates. Fixes strands-agents#2182 Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
…llback When both delta.reasoning_content and delta.reasoning are present, reasoning_content takes priority (backward compat). Assert this explicitly. Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
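The fallback described in these commits can be sketched as follows. This is an illustrative helper, not the SDK's actual `OpenAIModel._stream()` code; `extract_reasoning` and the `SimpleNamespace` stand-ins for streamed deltas are hypothetical names used only for this example.

```python
from types import SimpleNamespace


def extract_reasoning(delta):
    """Return streamed reasoning text, checking the legacy field first.

    `reasoning_content` (DeepSeek, vLLM < 0.19.1) takes priority over
    `reasoning` (vLLM >= 0.19.1) for backward compatibility.
    """
    # getattr with a None default tolerates deltas that lack either field.
    text = getattr(delta, "reasoning_content", None)
    if text is None:
        text = getattr(delta, "reasoning", None)
    return text


# SimpleNamespace stands in for a streamed chat-completion delta.
old_vllm = SimpleNamespace(reasoning_content="thinking...")
new_vllm = SimpleNamespace(reasoning="thinking...")
both_set = SimpleNamespace(reasoning_content="legacy", reasoning="new")

assert extract_reasoning(old_vllm) == "thinking..."
assert extract_reasoning(new_vllm) == "thinking..."
assert extract_reasoning(both_set) == "legacy"  # legacy field wins
```

An explicit `is None` check (rather than `or`) is used here so that an empty-string chunk on the legacy field would still suppress the fallback, matching the "reasoning_content takes priority" assertion above.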
Description
vLLM 0.19.1 renamed the streaming reasoning field from `delta.reasoning_content` to `delta.reasoning`. The SDK only checked `reasoning_content`, so reasoning output was silently dropped for users on newer vLLM backends.

`OpenAIModel._stream()` now checks both field names: `reasoning_content` first (backward compat with DeepSeek and older vLLM), then `reasoning`. Whichever is set gets forwarded.

The existing OpenAI mock deltas also needed `reasoning=None` set explicitly. `unittest.mock.Mock` auto-creates any accessed attribute as a truthy value, so without it the fallback would have fired on all the old tests. Three new tests cover the `delta.reasoning` path, the priority behavior when both fields are present, and the existing `delta.reasoning_content` path via the updated mocks.
Related Issues
#2182
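The mock pitfall noted in the description can be demonstrated in isolation. This is a minimal sketch of the `unittest.mock.Mock` behavior, not the SDK's actual test fixtures:

```python
from unittest.mock import Mock

# Mock auto-creates any accessed attribute as a (truthy) child Mock, so a
# delta mock that only pins `reasoning_content` still reports a "present"
# `reasoning` attribute, which would trigger the new fallback path.
naive = Mock(reasoning_content=None)
assert bool(naive.reasoning)  # auto-created attribute is truthy

# Pinning the new field to None as well restores the intended behavior:
# the mock now looks like a delta with no reasoning in either field.
pinned = Mock(reasoning_content=None, reasoning=None)
assert pinned.reasoning is None
```

Passing `spec=` to `Mock` is another common way to avoid this class of false positive, since it restricts the mock to the attributes of a real delta object.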
Documentation PR
No documentation changes needed.
Type of Change
Bug fix
Testing
`hatch run prepare` passed: 2647 passed, 4 skipped.
Checklist
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.