Merged
5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -2,6 +2,11 @@

All notable changes to `uipath_llm_client` (core package) will be documented in this file.

## [1.10.0] - 2026-04-23

### Added
- `uipath.llm_client.utils.sampling` module exposing `DISABLED_SAMPLING_PARAMS`, `disabled_params_from_model_details`, `is_disabled_value`, and `strip_disabled_kwargs`. The helpers use the langchain-openai-style `disabled_params` format (`{name: None | [values]}`), so they compose with the existing `langchain_openai._filter_disabled_params` path. `disabled_params_from_model_details` derives the disabled-param map from a discovery-endpoint `modelDetails` dict; today, `shouldSkipTemperature=True` disables the full sampling set (`temperature`, `top_p`, `top_k`, frequency/presence penalty, `seed`, `logit_bias`, `logprobs`, `top_logprobs`).
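  As an illustration of the contract these helpers follow (a sketch only; the shipped implementations may differ in details such as sentinel handling and log wording):

  ```python
  import logging
  from typing import Any

  # Illustrative sketch of the new helpers' contract; the real code lives in
  # uipath.llm_client.utils.sampling and may differ in implementation details.
  _NOT_DISABLED = object()  # sentinel: param is absent from the disabled map

  DISABLED_SAMPLING_PARAMS: dict[str, Any] = {
      "temperature": None, "top_p": None, "top_k": None,
      "frequency_penalty": None, "presence_penalty": None,
      "seed": None, "logit_bias": None, "logprobs": None, "top_logprobs": None,
  }

  def disabled_params_from_model_details(model_details: dict[str, Any] | None) -> dict[str, Any]:
      """Map a discovery-endpoint modelDetails dict to a disabled_params map."""
      if model_details and model_details.get("shouldSkipTemperature"):
          return dict(DISABLED_SAMPLING_PARAMS)
      return {}

  def strip_disabled_kwargs(
      kwargs: dict[str, Any],
      *,
      disabled_params: dict[str, Any] | None,
      model_name: str | None = None,
      logger: logging.Logger | None = None,
  ) -> dict[str, Any]:
      """Drop disabled kwargs: a None rule means always disabled,
      a list rule means those specific values are disallowed."""
      if not disabled_params:
          return kwargs
      kept: dict[str, Any] = {}
      for name, value in kwargs.items():
          rule = disabled_params.get(name, _NOT_DISABLED)
          if rule is None or (rule is not _NOT_DISABLED and value in rule):
              if logger is not None:
                  logger.warning("Stripping %r (disabled for model %s)", name, model_name)
              continue
          kept[name] = value
      return kept
  ```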

## [1.9.9] - 2026-04-23

### Changed
16 changes: 16 additions & 0 deletions packages/uipath_langchain_client/CHANGELOG.md
@@ -2,6 +2,22 @@

All notable changes to `uipath_langchain_client` will be documented in this file.

## [1.10.0] - 2026-04-23

### Added
- `model_details` and `disabled_params` fields on `UiPathBaseLLMClient`, plus a single `@model_validator(mode="after")` hook, `setup_model_info`, that (1) forwards the factory-supplied `model_details` or fetches it from `client_settings.get_model_info`, and (2) sets `disabled_params` to the merge of what the caller passed and what `disabled_params_from_model_details` derives. User-provided keys win on conflicts, so callers can override any derived entry by name.
- `disabled_params` uses the langchain-openai shape (`{name: None | [values]}`), so subclasses inheriting from `ChatOpenAI` / `AzureChatOpenAI` also benefit from the native `_filter_disabled_params` path inside `bind_tools`.
- Runtime stripping in the four `_generate` / `_agenerate` / `_stream` / `_astream` wrappers on `UiPathBaseChatModel`, delegating to `uipath.llm_client.utils.sampling.strip_disabled_kwargs` with the instance's `disabled_params`. When a logger is configured, a warning is logged via `self.logger` for each stripped key. Fixes `anthropic.claude-opus-4-7` rejecting any sampling parameter passed via `.invoke()` / `.ainvoke()` / the streaming variants.
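  The override semantics of that merge reduce to a plain dict merge with user keys last (the values below are illustrative, not real discovery output):

  ```python
  # Derived entries (from modelDetails) come first, caller-supplied keys last,
  # so the caller wins on conflicts. Values here are illustrative only.
  derived = {"temperature": None, "top_p": None}        # from disabled_params_from_model_details
  user_provided = {"temperature": [0.0], "seed": None}  # passed by the caller
  merged = {**derived, **user_provided}
  assert merged == {"temperature": [0.0], "top_p": None, "seed": None}
  ```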

### Removed
- The unused `disabled_params` field declaration on `UiPathChat` (now inherited from `UiPathBaseLLMClient`).

### Changed
- Bumped `uipath-llm-client` floor to `>=1.10.0` to match the release that adds `uipath.llm_client.utils.sampling`.

### Known follow-up
- Init-time values set on the instance (`UiPathChat(model="anthropic.claude-opus-4-7", temperature=0.5)`) still flow into the outgoing request body via `_default_params` / the vendor SDK. The runtime invoke-time strip handles `.invoke(..., temperature=...)`; a follow-up will plug the init-time leak using the already-populated `disabled_params`.
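  One possible shape for that follow-up (purely a sketch; `filter_init_params` is a hypothetical helper, not part of this release) is to pass the init-time parameter dict through the same `disabled_params` map before it reaches the request body:

  ```python
  from typing import Any

  def filter_init_params(params: dict[str, Any], disabled_params: dict[str, Any] | None) -> dict[str, Any]:
      """Hypothetical follow-up helper: drop init-time params (e.g. from
      _default_params) that disabled_params marks as unsendable."""
      if not disabled_params:
          return params
      return {
          name: value
          for name, value in params.items()
          if not (
              name in disabled_params
              and (disabled_params[name] is None or value in disabled_params[name])
          )
      }

  # temperature set at init time is dropped; non-sampling params survive
  assert filter_init_params(
      {"model": "anthropic.claude-opus-4-7", "temperature": 0.5},
      {"temperature": None},
  ) == {"model": "anthropic.claude-opus-4-7"}
  ```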

## [1.9.9] - 2026-04-23

### Changed
2 changes: 1 addition & 1 deletion packages/uipath_langchain_client/pyproject.toml
@@ -6,7 +6,7 @@ readme = "README.md"
requires-python = ">=3.11"
dependencies = [
    "langchain>=1.2.15,<2.0.0",
    "uipath-llm-client>=1.10.0,<2.0.0",
]

[project.optional-dependencies]
@@ -1,3 +1,3 @@
__title__ = "UiPath LangChain Client"
__description__ = "A Python client for interacting with UiPath's LLM services via LangChain."
__version__ = "1.9.9"
__version__ = "1.10.0"
@@ -27,7 +27,7 @@
from abc import ABC
from collections.abc import AsyncGenerator, Generator, Mapping, Sequence
from functools import cached_property
from typing import Any, ClassVar, Literal
from typing import Any, ClassVar, Literal, Self

from httpx import URL, Response
from langchain_core.callbacks import (
@@ -38,7 +38,7 @@
from langchain_core.language_models.chat_models import BaseChatModel
from langchain_core.messages import BaseMessage
from langchain_core.outputs import ChatGeneration, ChatGenerationChunk, ChatResult
from pydantic import AliasChoices, BaseModel, ConfigDict, Field
from pydantic import AliasChoices, BaseModel, ConfigDict, Field, model_validator

from uipath.llm_client.httpx_client import (
    UiPathHttpxAsyncClient,
@@ -49,6 +49,10 @@
    get_captured_response_headers,
    set_captured_response_headers,
)
from uipath.llm_client.utils.sampling import (
    disabled_params_from_model_details,
    strip_disabled_kwargs,
)
from uipath_langchain_client.settings import (
    UiPathAPIConfig,
    UiPathBaseSettings,
@@ -108,6 +112,19 @@ class UiPathBaseLLMClient(BaseModel, ABC):
        description="Settings for the UiPath client (defaults based on UIPATH_LLM_SERVICE env var)",
    )

    model_details: dict[str, Any] | None = Field(
        default=None,
        description="Per-model capability flags sourced from the discovery endpoint "
        "(e.g. {'shouldSkipTemperature': True}). Passed through by the factory; "
        "resolved from client_settings.get_model_info otherwise.",
    )
    disabled_params: dict[str, Any] | None = Field(
        default=None,
        description="langchain-openai-style map of parameters that must not be sent to "
        "this model. Keys are param names; values are None (always disabled) or a list "
        "of disallowed values. Derived from ``model_details`` when not provided.",
    )

    default_headers: Mapping[str, str] | None = Field(
        default=None,
        description="Caller-supplied request headers. Merged on top of `class_default_headers`; "
@@ -140,6 +157,39 @@
        description="Logger for request/response logging",
    )

    @model_validator(mode="after")
    def setup_model_info(self) -> Self:
        """Resolve ``model_details`` from discovery and merge ``disabled_params``.

        Runs after pydantic has validated the fields, so ``self.client_settings``
        (with its ``default_factory``) and ``self.model_name`` are already live.

        ``model_details`` is resolved once: caller-forwarded value wins, then a
        lookup against ``client_settings.get_model_info`` (backed by the
        class-cached discovery response), else an empty mapping on failure.

        ``disabled_params`` is the merge of what the caller passed and what we
        can derive from ``model_details`` (via
        ``disabled_params_from_model_details``). User-provided keys win on
        conflicts, so callers can override a derived entry by name.
        """
        if self.model_details is None:
            try:
                info = self.client_settings.get_model_info(
                    self.model_name,
                    byo_connection_id=self.byo_connection_id,
                )
                self.model_details = info.get("modelDetails") or {}
            except Exception:
                self.model_details = {}

        derived = disabled_params_from_model_details(self.model_details) or {}
        user_provided = self.disabled_params or {}
        merged = {**derived, **user_provided}
        self.disabled_params = merged or None

        return self


    @cached_property
    def uipath_sync_client(self) -> UiPathHttpxClient:
        """Here we instantiate a synchronous HTTP client with the proper authentication pipeline, retry logic, logging etc."""
@@ -364,6 +414,12 @@ def _generate(
        run_manager: CallbackManagerForLLMRun | None = None,
        **kwargs: Any,
    ) -> ChatResult:
        kwargs = strip_disabled_kwargs(
            kwargs,
            disabled_params=self.disabled_params,
            model_name=self.model_name,
            logger=self.logger,
        )
        set_captured_response_headers({})
        try:
            result = self._uipath_generate(messages, stop=stop, run_manager=run_manager, **kwargs)
@@ -389,6 +445,12 @@ async def _agenerate(
        run_manager: AsyncCallbackManagerForLLMRun | None = None,
        **kwargs: Any,
    ) -> ChatResult:
        kwargs = strip_disabled_kwargs(
            kwargs,
            disabled_params=self.disabled_params,
            model_name=self.model_name,
            logger=self.logger,
        )
        set_captured_response_headers({})
        try:
            result = await self._uipath_agenerate(
@@ -416,6 +478,12 @@ def _stream(
        run_manager: CallbackManagerForLLMRun | None = None,
        **kwargs: Any,
    ) -> Generator[ChatGenerationChunk, None, None]:
        kwargs = strip_disabled_kwargs(
            kwargs,
            disabled_params=self.disabled_params,
            model_name=self.model_name,
            logger=self.logger,
        )
        set_captured_response_headers({})
        try:
            first = True
@@ -446,6 +514,12 @@ async def _astream(
        run_manager: AsyncCallbackManagerForLLMRun | None = None,
        **kwargs: Any,
    ) -> AsyncGenerator[ChatGenerationChunk, None]:
        kwargs = strip_disabled_kwargs(
            kwargs,
            disabled_params=self.disabled_params,
            model_name=self.model_name,
            logger=self.logger,
        )
        set_captured_response_headers({})
        try:
            first = True
@@ -10,8 +10,6 @@
>>> vectors = embeddings.embed_documents(["Hello world"])
"""

from __future__ import annotations

from pydantic import Field, model_validator
from typing_extensions import Self

@@ -64,9 +64,9 @@
from langchain_core.utils.pydantic import is_basemodel_subclass
from pydantic import AliasChoices, BaseModel, Field

from uipath.llm_client.utils.model_family import is_anthropic_model_name
from uipath_langchain_client.base_client import UiPathBaseChatModel
from uipath_langchain_client.settings import ApiType, RoutingMode, UiPathAPIConfig
from uipath_langchain_client.utils import is_anthropic_model_name

_DictOrPydanticClass = Union[dict[str, Any], type[BaseModel], type]
_DictOrPydantic = Union[dict[str, Any], BaseModel]
@@ -179,7 +179,6 @@ class UiPathChat(UiPathBaseChatModel):
    seed: int | None = None

    model_kwargs: dict[str, Any] = Field(default_factory=dict)
    disabled_params: dict[str, Any] | None = None

    # OpenAI
    logit_bias: dict[str, int] | None = None
@@ -22,6 +22,7 @@

from typing import Any

from uipath.llm_client.utils.model_family import is_anthropic_model_name
from uipath_langchain_client.base_client import (
    UiPathBaseChatModel,
    UiPathBaseEmbeddings,
@@ -36,7 +37,6 @@
    VendorType,
    get_default_client_settings,
)
from uipath_langchain_client.utils import is_anthropic_model_name


def get_chat_model(
@@ -85,12 +85,14 @@ def get_chat_model(
vendor_type=vendor_type,
)
model_family = model_info.get("modelFamily", None)
model_details = model_info.get("modelDetails") or {}

if custom_class is not None:
return custom_class(
model=model_name,
settings=client_settings,
byo_connection_id=byo_connection_id,
model_details=model_details,
**model_kwargs,
)

@@ -103,6 +105,7 @@
model=model_name,
settings=client_settings,
byo_connection_id=byo_connection_id,
model_details=model_details,
**model_kwargs,
)

@@ -138,6 +141,7 @@ def get_chat_model(
settings=client_settings,
api_flavor=api_flavor,
byo_connection_id=byo_connection_id,
model_details=model_details,
**model_kwargs,
)
else:
@@ -150,6 +154,7 @@
settings=client_settings,
api_flavor=api_flavor,
byo_connection_id=byo_connection_id,
model_details=model_details,
**model_kwargs,
)
case VendorType.VERTEXAI:
@@ -163,6 +168,7 @@
settings=client_settings,
vendor_type=discovered_vendor_type,
byo_connection_id=byo_connection_id,
model_details=model_details,
**model_kwargs,
)

@@ -174,6 +180,7 @@
model=model_name,
settings=client_settings,
byo_connection_id=byo_connection_id,
model_details=model_details,
**model_kwargs,
)
case VendorType.AWSBEDROCK:
@@ -188,6 +195,7 @@
model=model_name,
settings=client_settings,
byo_connection_id=byo_connection_id,
model_details=model_details,
**model_kwargs,
)

@@ -200,6 +208,7 @@
model=model_name,
settings=client_settings,
byo_connection_id=byo_connection_id,
model_details=model_details,
**model_kwargs,
)

@@ -211,6 +220,7 @@
model=model_name,
settings=client_settings,
byo_connection_id=byo_connection_id,
model_details=model_details,
**model_kwargs,
)

@@ -13,10 +13,6 @@
UiPathTooManyRequestsError,
UiPathUnprocessableEntityError,
)
from uipath.llm_client.utils.model_family import (
    ANTHROPIC_MODEL_NAME_KEYWORDS,
    is_anthropic_model_name,
)
from uipath.llm_client.utils.retry import RetryConfig

__all__ = [
@@ -34,6 +30,4 @@
    "UiPathServiceUnavailableError",
    "UiPathGatewayTimeoutError",
    "UiPathTooManyRequestsError",
    "ANTHROPIC_MODEL_NAME_KEYWORDS",
    "is_anthropic_model_name",
]
2 changes: 1 addition & 1 deletion src/uipath/llm_client/__version__.py
@@ -1,3 +1,3 @@
__title__ = "UiPath LLM Client"
__description__ = "A Python client for interacting with UiPath's LLM services."
__version__ = "1.9.9"
__version__ = "1.10.0"
2 changes: 0 additions & 2 deletions src/uipath/llm_client/clients/normalized/completions.py
@@ -1,7 +1,5 @@
"""Completions endpoint for the UiPath Normalized API."""

from __future__ import annotations

import json
from collections.abc import AsyncGenerator, Callable, Generator, Sequence
from typing import Any, Union, get_args, get_origin, get_type_hints
2 changes: 0 additions & 2 deletions src/uipath/llm_client/clients/normalized/embeddings.py
@@ -3,8 +3,6 @@
Provides synchronous and asynchronous methods for generating text embeddings.
"""

from __future__ import annotations

from typing import Any

from uipath.llm_client.clients.normalized.types import (