# Models

The `ModelClient` protocol, value types, and shipped adapters.
A model client takes a list of messages and tools, runs one turn against a provider, and returns a `ModelResponse`. The tool loop in `bridle._internal.tool_loop` calls it repeatedly until the model produces a valid `__bridle_return__`.

In v0.1.0 the only real adapter is Anthropic. Tests use `MockModelClient`.
## ModelClient
```python
@runtime_checkable
class ModelClient(Protocol):
    def complete(
        self,
        *,
        model: str,
        messages: Sequence[dict[str, Any]],
        tools: Sequence[Tool],
        system: str | None = None,
        **params: Any,
    ) -> ModelResponse: ...
```

Available from `bridle.models`.
## ModelResponse
```python
@dataclass(frozen=True)
class ModelResponse:
    text: str | None = None
    tool_calls: tuple[ToolCall, ...] = ()
    stop_reason: StopReason = "end_turn"
    input_tokens: int = 0
    output_tokens: int = 0
```

| Field | Type | Description |
|---|---|---|
| `text` | `str \| None` | Concatenated text content of the turn. |
| `tool_calls` | `tuple[ToolCall, ...]` | Any tool invocations the model emitted. |
| `stop_reason` | `StopReason` | `"end_turn"` \| `"tool_use"` \| `"max_tokens"` \| `"other"`. |
| `input_tokens` | `int` | Provider-reported input tokens. |
| `output_tokens` | `int` | Provider-reported output tokens. |
## ToolCall
```python
@dataclass(frozen=True)
class ToolCall:
    id: str
    name: str
    input: dict[str, Any] = ...
```

`id` is the provider's id for the call; the tool loop echoes it back in the matching `tool_result`. `name` is the tool's name (or `__bridle_return__` for the synthetic return tool).
## AnthropicModelClient
```python
from bridle.models.anthropic import AnthropicModelClient, install
```

Drives the Anthropic messages API on Bridle's behalf.
```python
AnthropicModelClient(
    *,
    client: anthropic.Anthropic | None = None,
    max_tokens: int = 4096,
)
```

| Parameter | Type | Description |
|---|---|---|
| `client` | `anthropic.Anthropic \| None` | An existing SDK instance. `None` creates one with the SDK's defaults. |
| `max_tokens` | `int` | Default max tokens per turn. Override per-turn via `**params`. |
### install
```python
bridle.models.anthropic.install(
    *,
    client: anthropic.Anthropic | None = None,
    max_tokens: int = 4096,
) -> AnthropicModelClient
```

Builds an `AnthropicModelClient` and registers it as the active client via `set_model_client`. The convenience for the common case:

```python
from bridle.models.anthropic import install

install()
```

Requires the `anthropic` package at runtime; if it is absent, raises `ModelError`.
## MockModelClient
```python
from bridle.models.mock import MockModelClient, text_response, tool_response
```

A scripted `ModelClient` for tests. Construct it with either a list of pre-baked `ModelResponse` objects (popped in order) or a callable that produces a response from each request.

```python
MockModelClient(responses: Sequence[ModelResponse] | ResponseFactory)

ResponseFactory = Callable[
    [Sequence[dict[str, Any]], Sequence[Tool]],
    ModelResponse,
]
```

`MockModelClient.calls` records the arguments of every `complete` call — assert on it in tests.
## Helpers
```python
text_response(text: str) -> ModelResponse         # one text-only turn
tool_response(*calls: ToolCall) -> ModelResponse  # one tool-use turn
```