React to OpenAI SDK 2.10.0-alpha.703: assembly split, fix streaming MessageID on function calls #7510
Conversation
Agent-Logs-Url: https://github.com/dotnet/extensions/sessions/3fcca199-65c7-4765-a983-221f6047d7a5 Co-authored-by: ericstj <8918108+ericstj@users.noreply.github.com>
Interesting, so the CCA agent restored the package and apparently was able to auth to the GitHub feed, but our AzDo validation build cannot (of course). We could fix that by using
Agent-Logs-Url: https://github.com/dotnet/extensions/sessions/0a304a10-6f3a-4b97-9bb7-3d05731b62a4 Co-authored-by: ericstj <8918108+ericstj@users.noreply.github.com>
We should watch https://github.com/openai/openai-dotnet/actions/workflows/release.yml for new packages to update this PR.
Pull request overview
Updates the Microsoft.Extensions.AI.OpenAI integration to track OpenAI .NET SDK 2.10.0-alpha.700, including adapting to SDK model/context changes and fixing streaming MessageId propagation for function-call updates (Issue #7479).
Changes:
- Bump the `OpenAI` package to `2.10.0-alpha.700` and add a GitHub NuGet feed entry for pre-release OpenAI packages.
- Fix streaming function-call updates to carry the correct `MessageId` by tracking the function-call output item id.
- Update streaming tests to validate the new `MessageId` behavior and the resulting message grouping in `ToChatResponse()`.
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIResponseClientTests.cs | Updates streaming test expectations to assert `MessageId` on function-call updates and validates that `ToChatResponse()` now yields two messages (function/tool vs. text). |
| src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIResponsesChatClient.cs | Sets `lastMessageId` when a `FunctionCallResponseItem` is added (so subsequent function-call updates have a `MessageId`), and switches serialization context usages to `OpenAIResponsesContext`. |
| NuGet.config | Adds github-openai package source and maps OpenAI package IDs to it for pre-release consumption. |
| eng/packages/General.props | Updates the OpenAI package version to 2.10.0-alpha.700. |
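The `NuGet.config` change reviewed above (a `github-openai` source with explicit source mapping) could look roughly like this; a sketch only, since the exact key names in the PR are not shown here:

```xml
<configuration>
  <packageSources>
    <!-- Pre-release OpenAI packages published to the GitHub NuGet registry -->
    <add key="github-openai" value="https://nuget.pkg.github.com/openai/index.json" />
  </packageSources>
  <packageSourceMapping>
    <!-- Route only the OpenAI package IDs to the pre-release feed -->
    <packageSource key="github-openai">
      <package pattern="OpenAI" />
      <package pattern="OpenAI.Responses" />
    </packageSource>
  </packageSourceMapping>
</configuration>
```

Source mapping keeps every other package ID resolving from the existing feeds, so only the two OpenAI packages can come from the pre-release source.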
@copilot I see a new package version 2.10.0-alpha.702 is available. Can you update to this and test?
Agent-Logs-Url: https://github.com/dotnet/extensions/sessions/a78bbb1c-4d5e-4a70-b048-4956fcf1b3f9 Co-authored-by: ericstj <8918108+ericstj@users.noreply.github.com>
I see: it didn't successfully auth to this before. Instead, it found the release packages workflow in openai, downloaded the artifacts locally, and added the local path to the NuGet.config for testing. https://github.com/dotnet/extensions/agents/pull/7510?session_id=3fcca199-65c7-4765-a983-221f6047d7a5
@copilot please do that as you did before, and run the tests. You can download those build artifacts from the latest run of https://github.com/openai/openai-dotnet/actions/workflows/release.yml, put them in a folder, add a nuget.config pointing at that folder, and update the version referenced. Build and test, please.
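The workflow described above can be sketched as a few commands; this is a sketch under assumptions (the GitHub CLI is authenticated, `<run-id>` and the folder/source names are placeholders, not values from this PR):

```
# Download the NuGet package artifacts from a release.yml run
gh run download <run-id> --repo openai/openai-dotnet --dir ./local-openai-packages

# Add the local folder as an additional package source in the repo's NuGet.config
dotnet nuget add source ./local-openai-packages --name local-openai --configfile NuGet.config

# Update the referenced version in eng/packages/General.props to the downloaded build, then:
dotnet build
dotnet test
```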
Downloaded. Results: 375 passed, 200 skipped (expected for non-integration tests), 0 failed; the build succeeded with 0 warnings/errors across all target frameworks (net462, net8.0, net9.0, net10.0, netstandard2.0).
@copilot - looks like the latest build is 703. Can you download and test again? https://github.com/openai/openai-dotnet/actions/runs/25530568212 |
Agent-Logs-Url: https://github.com/dotnet/extensions/sessions/de494f88-9828-450f-bee0-69af9cc2d89b Co-authored-by: ericstj <8918108+ericstj@users.noreply.github.com>
Downloaded (30ed709). Results: the build succeeded with 0 warnings/errors across all target frameworks; tests: 375 passed, 200 skipped, 0 failed.
Bumps `OpenAI` to `2.10.0-alpha.703` (a pre-release build from `main`, ahead of ~2.10.1/2.11.0) to validate against the Responses API restructuring, where types were factored into a new `OpenAI.Responses` assembly. Also fixes a bug where streaming `ChatResponseUpdate.MessageId` was always `null` for function call output items.

Package update
- `eng/packages/General.props`: `2.10.0` → `2.10.0-alpha.703`
- `NuGet.config`: adds a `github-openai` source (https://nuget.pkg.github.com/openai/index.json) with explicit source mappings for `OpenAI` and `OpenAI.Responses`. Can be removed once a stable release ships to NuGet.org. Requires a GitHub token with `read:packages` scope in CI.

Assembly split (`OpenAI` → `OpenAI.Responses`)

The alpha factors Responses types into a separate `OpenAI.Responses` assembly with its own `OpenAIResponsesContext` for AOT-compatible `ModelReaderWriter` operations. Using the old `OpenAIContext.Default` for `ResponseTool` read/write now throws `InvalidOperationException: No ModelReaderWriterTypeBuilder found for ResponseTool`.
- `OpenAIContext.Default` → `OpenAIResponsesContext.Default` at all three `ModelReaderWriter` call sites in `OpenAIResponsesChatClient.cs`
- No change to the `using OpenAI;` directive
- `OpenAI.Responses` arrives transitively via `OpenAI`'s package dependency; no `csproj` change needed
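The call-site change can be sketched as follows; the surrounding variable names are illustrative, not the exact diff from `OpenAIResponsesChatClient.cs`:

```csharp
using System;
using System.ClientModel.Primitives;
using OpenAI.Responses;

// Before: throws InvalidOperationException under 2.10.0-alpha.703, because
// ResponseTool's ModelReaderWriterTypeBuilder now lives in OpenAI.Responses.
// BinaryData data = ModelReaderWriter.Write(tool, ModelReaderWriterOptions.Json, OpenAIContext.Default);

// After: use the Responses-specific context for AOT-safe read/write.
BinaryData data = ModelReaderWriter.Write(tool, ModelReaderWriterOptions.Json, OpenAIResponsesContext.Default);
ResponseTool roundTripped = ModelReaderWriter.Read<ResponseTool>(data, ModelReaderWriterOptions.Json, OpenAIResponsesContext.Default);
```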
Streaming MessageID fix (issue #7479)

Function call output items in `response.output_item.added` were not setting `lastMessageId`, so all `ChatResponseUpdate`s emitted during function call streaming had `MessageId = null`. Fixed by capturing the item ID for `FunctionCallResponseItem`.

This means `ToChatResponse()` now produces two messages when a response contains both function calls and a text message: one keyed to the function call item ID, one keyed to the text message ID. The test is updated to assert this behavior and to assert `functionCallUpdate.MessageId == "fc_001"`.
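The capture described above might look roughly like this inside the streaming loop; a sketch only, since the exact surrounding code and variable names (other than `FunctionCallResponseItem`) are assumptions:

```csharp
case StreamingResponseOutputItemAddedUpdate outputItemAdded:
    // Function call items carry their own item ID (e.g. "fc_001");
    // record it so subsequent updates for this item get a non-null MessageId.
    if (outputItemAdded.Item is FunctionCallResponseItem functionCall)
    {
        lastMessageId = functionCall.Id;
    }
    break;
```

With `lastMessageId` set here, the function-call updates group under their own message ID in `ToChatResponse()`, separate from the text message.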