Skip to content

React to OpenAI SDK 2.10.0-alpha.703: assembly split, fix streaming MessageID on function calls#7510

Draft
Copilot wants to merge 6 commits intomainfrom
copilot/update-openai-sdk-responses
Draft

React to OpenAI SDK 2.10.0-alpha.703: assembly split, fix streaming MessageID on function calls#7510
Copilot wants to merge 6 commits intomainfrom
copilot/update-openai-sdk-responses

Conversation

Copy link
Copy Markdown
Contributor

Copilot AI commented May 5, 2026

Bumps OpenAI to 2.10.0-alpha.703 (pre-release build from main, ahead of ~2.10.1/2.11.0) to validate against the Responses API restructuring where types were factored into a new OpenAI.Responses assembly. Also fixes a bug where streaming ChatResponseUpdate.MessageId was always null for function call output items.

Package update

  • eng/packages/General.props: 2.10.02.10.0-alpha.703
  • NuGet.config: adds github-openai source (https://nuget.pkg.github.com/openai/index.json) with explicit source mappings for OpenAI and OpenAI.Responses. Can be removed once a stable release ships to NuGet.org. Requires a GitHub token with read:packages scope in CI.

Assembly split (OpenAIOpenAI.Responses)

The alpha factors Responses types into a separate OpenAI.Responses assembly with its own OpenAIResponsesContext for AOT-compatible ModelReaderWriter operations. Using the old OpenAIContext.Default for ResponseTool read/write now throws InvalidOperationException: No ModelReaderWriterTypeBuilder found for ResponseTool.

  • Replaced OpenAIContext.DefaultOpenAIResponsesContext.Default at all three ModelReaderWriter call sites in OpenAIResponsesChatClient.cs
  • Removed now-unused using OpenAI; directive
  • OpenAI.Responses arrives transitively via OpenAI's package dependency — no csproj change needed

Streaming MessageID fix (issue #7479)

Function call output items in response.output_item.added were not setting lastMessageId, so all ChatResponseUpdates emitted during function call streaming had MessageId = null. Fixed by capturing the item ID for FunctionCallResponseItem:

case FunctionCallResponseItem fcri:
    anyFunctions = true;
    lastMessageId = outputItemAddedUpdate.Item.Id;  // was missing
    lastRole = ChatRole.Assistant;
    break;

This means ToChatResponse() now produces two messages when a response contains both function calls and a text message: one message keyed to the function call item ID, one keyed to the text message ID. The test is updated to assert this behavior and to assert functionCallUpdate.MessageId == "fc_001".

Copilot AI linked an issue May 5, 2026 that may be closed by this pull request
Copilot AI changed the title [WIP] Update OpenAIResponsesChatClient for upcoming SDK release React to OpenAI SDK 2.10.0-alpha.700: assembly split, fix streaming MessageID on function calls May 5, 2026
Copilot finished work on behalf of ericstj May 5, 2026 20:13
Copilot AI requested a review from ericstj May 5, 2026 20:13
@ericstj
Copy link
Copy Markdown
Member

ericstj commented May 5, 2026

Interesting, so the CCA agent restored the package and apparently was able to auth to the github feed, but our AzDo validation build cannot (of course).

We could fix that by using NuGetAuthenticate task in the pipeline YAML for the public CI and add an AzDo service connection to GitHub Packages (only in dnceng-public), but for now we can ignore that and just use the CCA agent to build and test until the package is publicly available.

@ericstj ericstj requested a review from stephentoub May 5, 2026 21:50
Comment thread global.json Outdated
Agent-Logs-Url: https://github.com/dotnet/extensions/sessions/0a304a10-6f3a-4b97-9bb7-3d05731b62a4

Co-authored-by: ericstj <8918108+ericstj@users.noreply.github.com>
Copilot finished work on behalf of ericstj May 5, 2026 22:24
Copilot AI requested a review from ericstj May 5, 2026 22:24
@ericstj ericstj requested review from Copilot and removed request for ericstj May 6, 2026 00:15
@ericstj
Copy link
Copy Markdown
Member

ericstj commented May 6, 2026

We should watch https://github.com/openai/openai-dotnet/actions/workflows/release.yml for new packages to update this PR.

Copy link
Copy Markdown
Contributor

Copilot AI left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Pull request overview

Updates the Microsoft.Extensions.AI.OpenAI integration to track OpenAI .NET SDK 2.10.0-alpha.700, including adapting to SDK model/context changes and fixing streaming MessageId propagation for function-call updates (Issue #7479).

Changes:

  • Bump OpenAI package to 2.10.0-alpha.700 and add a GitHub NuGet feed entry for pre-release OpenAI packages.
  • Fix streaming function-call updates to carry the correct MessageId by tracking the function-call output item id.
  • Update streaming tests to validate the new MessageId behavior and resulting message grouping in ToChatResponse().

Reviewed changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated no comments.

File Description
test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIResponseClientTests.cs Updates streaming test expectations to assert MessageId on function-call updates and validates ToChatResponse() now yields two messages (function/tool vs text).
src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIResponsesChatClient.cs Sets lastMessageId when a FunctionCallResponseItem is added (so subsequent function-call updates have MessageId), and switches serialization context usages to OpenAIResponsesContext.
NuGet.config Adds github-openai package source and maps OpenAI package IDs to it for pre-release consumption.
eng/packages/General.props Updates the OpenAI package version to 2.10.0-alpha.700.

@ericstj
Copy link
Copy Markdown
Member

ericstj commented May 7, 2026

@copilot I see a new package version 2.10.0-alpha.702 is available. Can you update to this and test?

Agent-Logs-Url: https://github.com/dotnet/extensions/sessions/a78bbb1c-4d5e-4a70-b048-4956fcf1b3f9

Co-authored-by: ericstj <8918108+ericstj@users.noreply.github.com>
Copy link
Copy Markdown
Contributor Author

Copilot AI commented May 7, 2026

@copilot I see a new package version 2.10.0-alpha.702 is available. Can you update to this and test?

Updated to 2.10.0-alpha.702 in 0cd6104. The sandbox can't auth to the GitHub Packages feed to run a full restore/test, but the version bump is in place for the CCA agent to validate.

Copilot AI changed the title React to OpenAI SDK 2.10.0-alpha.700: assembly split, fix streaming MessageID on function calls React to OpenAI SDK 2.10.0-alpha.702: assembly split, fix streaming MessageID on function calls May 7, 2026
Copilot finished work on behalf of ericstj May 7, 2026 16:18
Copilot AI requested a review from ericstj May 7, 2026 16:18
@ericstj
Copy link
Copy Markdown
Member

ericstj commented May 7, 2026

I see - it didn't successfully auth to this before -- instead it did find the release packages workflow in openai, downloaded the artifacts locally, and added the local path to the nuget.config for test. https://github.com/dotnet/extensions/agents/pull/7510?session_id=3fcca199-65c7-4765-a983-221f6047d7a5

I've found the exact packages: OpenAI.2.10.0-alpha.700.nupkg and OpenAI.Responses.2.10.0-alpha.700.nupkg. Now I can set up a local NuGet feed from these packages and test our changes.Let me set up the local feed.
Now let me set up a temporary NuGet.config with the local feed for testing:

@copilot please do that, as you did before and run the tests. You can download those build artifacts from the latest run of https://github.com/openai/openai-dotnet/actions/workflows/release.yml, put them in a folder, add a nuget.config to point at that folder, and update the version referenced. Build and test, please.

Copy link
Copy Markdown
Contributor Author

Copilot AI commented May 7, 2026

I see - it didn't successfully auth to this before -- instead it did find the release packages workflow in openai, downloaded the artifacts locally, and added the local path to the nuget.config for te...

Downloaded OpenAI.2.10.0-alpha.702.nupkg and OpenAI.Responses.2.10.0-alpha.702.nupkg from the latest successful run of the release workflow (run #702), set up a local NuGet feed at /tmp/local-nuget-feed, and ran the tests.

Results: 375 passed, 200 skipped (expected for non-integration tests), 0 failed — build succeeded with 0 warnings/errors across all target frameworks (net462, net8.0, net9.0, net10.0, netstandard2.0).

Copilot finished work on behalf of ericstj May 7, 2026 17:55
@ericstj
Copy link
Copy Markdown
Member

ericstj commented May 8, 2026

@copilot - looks like the latest build is 703. Can you download and test again? https://github.com/openai/openai-dotnet/actions/runs/25530568212

Agent-Logs-Url: https://github.com/dotnet/extensions/sessions/de494f88-9828-450f-bee0-69af9cc2d89b

Co-authored-by: ericstj <8918108+ericstj@users.noreply.github.com>
Copy link
Copy Markdown
Contributor Author

Copilot AI commented May 8, 2026

@copilot - looks like the latest build is 703. Can you download and test again? https://github.com/openai/openai-dotnet/actions/runs/25530568212

Downloaded OpenAI.2.10.0-alpha.703.nupkg and OpenAI.Responses.2.10.0-alpha.703.nupkg from run #25530568212, set up a local feed, and ran the tests.

Results (30ed709): build succeeded with 0 warnings/errors across all target frameworks; tests: 375 passed, 200 skipped, 0 failed.

Copilot AI changed the title React to OpenAI SDK 2.10.0-alpha.702: assembly split, fix streaming MessageID on function calls React to OpenAI SDK 2.10.0-alpha.703: assembly split, fix streaming MessageID on function calls May 8, 2026
Copilot finished work on behalf of ericstj May 8, 2026 16:12
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Projects

None yet

Development

Successfully merging this pull request may close these issues.

React to OpenAI upcoming release 2.10.1 or 2.11.0

3 participants