
AI messages


This section describes the main classes for AI messages:

Visual explanation

This visualization should help you understand the class hierarchy of the messages:

Message class hierarchy

Message categories

Note

To make this section more readable, we don't show the full path of the classes; they can all be imported from the conatus.models.inputs_outputs.messages module.

AIMessage module-attribute

Common interface for AI messages.

These are all the messages that can be sent to the AI. A list of these messages is always valid as the messages or input field of an AI provider request.

Note that the AI itself only returns AssistantAIMessage instances; the other message types are inputs only.

UserAIMessage dataclass

UserAIMessage(
    content: str | Iterable[UserAIMessageContentPart],
    role: Literal["user"] = "user",
)

User message.

ATTRIBUTE DESCRIPTION
content

The content of the user message. Can be a string or an iterable of UserAIMessageContentPart instances.

TYPE: str | Iterable[UserAIMessageContentPart]

role

The role of the user message. Always "user".

TYPE: Literal['user']

all_text property

all_text: str

Get the text of the user message.

to_markdown

to_markdown() -> str

Get the text of the user message.

Source code in conatus/models/inputs_outputs/messages.py
def to_markdown(self) -> str:
    """Get the text of the user message."""
    return "## User message\n\n" + (
        "\n\n".join(part.to_markdown() for part in self.content)
        if not isinstance(self.content, str)
        else self.content
    )

__hash__

__hash__() -> int

Hash a user message.

Source code in conatus/models/inputs_outputs/messages.py
@override
def __hash__(self) -> int:
    """Hash a user message."""
    h = 0
    for part in self.content:
        h ^= hash(part)
    return h
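The hash above combines the part hashes with XOR. One consequence worth knowing: XOR is commutative, so the result does not depend on the order of the parts. A standalone sketch of the same pattern:

```python
from functools import reduce

# XOR-combine the hashes of the parts, starting from 0, as in the
# __hash__ shown above.
def xor_hash(parts) -> int:
    return reduce(lambda h, p: h ^ hash(p), parts, 0)

a = xor_hash(["hello", "world"])
b = xor_hash(["world", "hello"])
# XOR is commutative, so a == b despite the different part order.
```

Also note that when content is a plain string, iterating over it yields individual characters, so each character is hashed separately.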

AssistantAIMessage dataclass

AssistantAIMessage(
    content: list[AssistantAIMessageContentPart],
    refusal: str | None = None,
    role: Literal["assistant"] = "assistant",
)

Assistant message.

This is the message returned by the AI, although it can also be passed as an input to represent conversation history.

ATTRIBUTE DESCRIPTION
content

The content of the assistant message. See AssistantAIMessageContentPart for more information.

TYPE: list[AssistantAIMessageContentPart]

refusal

The refusal of the assistant message.

TYPE: str | None

role

The role of the assistant message. Always "assistant".

TYPE: Literal['assistant']

tool_call_content_parts property

tool_call_content_parts: list[
    AssistantAIMessageContentToolCallPart
]

Get all the tool calls content parts from the assistant message.

tool_call_content_parts_local_execution property

tool_call_content_parts_local_execution: list[
    AssistantAIMessageContentToolCallPart
]

Get all the tool calls requiring local execution.

all_text property

all_text: str

Get all the text from the assistant message, without reasoning.

all_text_including_reasoning property

all_text_including_reasoning: str

Get all the text from the assistant message, including reasoning.

to_markdown

to_markdown() -> str

Get the text of the assistant message.

Source code in conatus/models/inputs_outputs/messages.py
def to_markdown(self) -> str:
    """Get the text of the assistant message."""
    return "## Assistant message\n\n" + "\n\n".join(
        part.to_markdown() for part in self.content
    )

from_text classmethod

from_text(text: str) -> Self

Create an assistant message from a text.

Example

from conatus.models.inputs_outputs.messages import AssistantAIMessage

msg = AssistantAIMessage.from_text("Sure, will do!")
assert msg.all_text == "Sure, will do!"
PARAMETER DESCRIPTION
text

The text to create the assistant message from.

TYPE: str

RETURNS DESCRIPTION
Self

The assistant message.

Source code in conatus/models/inputs_outputs/messages.py
@classmethod
def from_text(cls, text: str) -> Self:
    """Create an assistant message from a text.

    # Example

    ```python
    from conatus.models.inputs_outputs.messages import AssistantAIMessage

    msg = AssistantAIMessage.from_text("Sure, will do!")
    assert msg.all_text == "Sure, will do!"
    ```

    Args:
        text: The text to create the assistant message from.

    Returns:
        The assistant message.
    """
    return cls(
        content=[AssistantAIMessageContentTextPart(text=text)],
        refusal=None,
    )

__hash__

__hash__() -> int

Hash an assistant message.

We hash the content and the refusal.

RETURNS DESCRIPTION
int

The hash of the assistant message.

Source code in conatus/models/inputs_outputs/messages.py
@override
def __hash__(self) -> int:
    """Hash an assistant message.

    We hash the content and the refusal.

    Returns:
        The hash of the assistant message.
    """
    h = 0
    for part in self.content:
        h ^= hash(part)
    h ^= hash(self.refusal)
    return h

ToolResponseAIMessage dataclass

ToolResponseAIMessage(
    content: dict[str, JSONType],
    tool_name: str,
    tool_call_id: str | None,
    success: bool,
    role: Literal["tool"] = "tool",
    for_computer_use: bool = False,
    modified_variables: list[str] | None = None,
)

Tool response message.

This message should not be returned by the AI. It is only used to report the result of a tool the AI called. In general, you want to place it right after the AssistantAIMessage that called the tool.

ATTRIBUTE DESCRIPTION
content

The content of the tool response message.

TYPE: dict[str, JSONType]

tool_name

The name of the tool that was called.

TYPE: str

tool_call_id

The tool call ID of the tool response message.

TYPE: str | None

role

The role of the tool response message. Always "tool".

TYPE: Literal['tool']

for_computer_use

Whether the tool response is for computer use. Some providers require a different format in computer-use mode.

TYPE: bool

modified_variables

The variables that were modified by the tool, if any.

TYPE: list[str] | None

success

Whether the tool call was successful. Anthropic requires this flag to be set explicitly: true if the tool call succeeded, false otherwise.

TYPE: bool

all_text property

all_text: None

Get the text of the tool response message.

content_as_string property

content_as_string: str

Get the content of the tool response message as a string.

to_markdown

to_markdown() -> str

Get the text of the tool response message.

RETURNS DESCRIPTION
str

The text of the tool response message.

Source code in conatus/models/inputs_outputs/messages.py
def to_markdown(self) -> str:
    """Get the text of the tool response message.

    Returns:
        The text of the tool response message.
    """
    return (
        "## "
        + (
            "Computer use response "
            if self.for_computer_use
            else "Tool response "
        )
        + ("(Success)" if self.success else "(Failure)")
        + "\n\n"
        + "Tool call ID: "
        + (self.tool_call_id or "None")
        + "\n\n"
        + "Modified variables: "
        + (
            ", ".join(self.modified_variables)
            if self.modified_variables
            else "None"
        )
        + "\n\n"
        + "```json\n"
        + self.content_as_string
        + "\n```"
    )

SystemAIMessage dataclass

SystemAIMessage(
    content: str, role: Literal["system"] = "system"
)

System message / prompt.

Note that some AI providers call this a "developer message". AI provider classes will need to handle the distinction between system messages and developer messages.

ATTRIBUTE DESCRIPTION
content

The content of the system message.

TYPE: str

role

The role of the system message. Always "system".

TYPE: Literal['system']

all_text property

all_text: str

Get the text of the system message.

to_markdown

to_markdown() -> str

Get the text of the system message.

Source code in conatus/models/inputs_outputs/messages.py
def to_markdown(self) -> str:
    """Get the text of the system message."""
    return f"## System message\n\n{self.content}"

ConversationAIMessage module-attribute

ConversationAIMessage = (
    UserAIMessage
    | AssistantAIMessage
    | ToolResponseAIMessage
)

Common interface for conversation messages.

These are all the messages that can be part of a conversation. In other words, everything except the SystemAIMessage.

Completion usage

CompletionUsage dataclass

CompletionUsage(
    model_name: str | None = None,
    prompt_tokens: int = 0,
    completion_tokens: int = 0,
    total_tokens: int = 0,
    cached_used_tokens: int | None = None,
    cached_created_tokens: int | None = None,
    extra_fields: dict[str, int | None] | None = None,
    usage_was_never_given: bool = True,
    always_override_previous_usage: bool = False,
)

Bases: Addable

Completion usage statistics.

Note that you can add CompletionUsage instances together, which is useful for accumulating usage during streaming.
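The accumulation pattern can be sketched with a minimal stand-in class (illustrative only; the real CompletionUsage tracks more fields and edge cases):

```python
from dataclasses import dataclass

# Minimal sketch of the Addable accumulation pattern used by
# CompletionUsage: adding returns a new instance, None is a no-op.
@dataclass
class Usage:
    prompt_tokens: int = 0
    completion_tokens: int = 0

    def __add__(self, other: "Usage | None") -> "Usage":
        if other is None:
            return self
        return Usage(
            self.prompt_tokens + other.prompt_tokens,
            self.completion_tokens + other.completion_tokens,
        )

    __radd__ = __add__

# Accumulate usage chunks as they arrive during streaming:
chunks = [Usage(10, 0), Usage(0, 3), Usage(0, 5)]
total = sum(chunks, Usage())
```

Because addition returns a fresh instance and tolerates None, partial usage reports from a streaming API can be folded together with a plain sum().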

model_name class-attribute instance-attribute

model_name: str | None = None

The name of the model used.

prompt_tokens class-attribute instance-attribute

prompt_tokens: int = 0

The number of tokens in the prompt.

completion_tokens class-attribute instance-attribute

completion_tokens: int = 0

The number of tokens in the completion.

total_tokens class-attribute instance-attribute

total_tokens: int = 0

The total number of tokens.

cached_used_tokens class-attribute instance-attribute

cached_used_tokens: int | None = None

The number of tokens that were read from the cache.

cached_created_tokens class-attribute instance-attribute

cached_created_tokens: int | None = None

The number of tokens that resulted in a cache write.

extra_fields class-attribute instance-attribute

extra_fields: dict[str, int | None] | None = None

Extra fields to store additional usage statistics.

These fields need to be addable as well, and we assume that the default value is 0.

usage_was_never_given class-attribute instance-attribute

usage_was_never_given: bool = True

Flag to indicate that the usage was never given.

This is useful to indicate errors if the usage was not sent back by the AI provider.

always_override_previous_usage class-attribute instance-attribute

always_override_previous_usage: bool = False

Flag to indicate that the previous usage should be overridden.

In this case, addition means that the previous usage is discarded and the new usage is used instead.

cost property

cost: float

Get the price of the completion usage.

__add__

__add__(other: Self | None) -> Self

Add two CompletionUsage instances.

Note that neither self nor other are modified.

PARAMETER DESCRIPTION
other

The other instance of CompletionUsage, or None.

TYPE: Self | None

RETURNS DESCRIPTION
CompletionUsage

A new instance of CompletionUsage that is the sum of the two.

Source code in conatus/models/inputs_outputs/usage.py
@override
def __add__(self, other: Self | None) -> Self:
    """Add a `CompletionUsage` and a `CompletionUsage`.

    Note that neither `self` nor `other` are modified.

    Args:
        other: The instance of `CompletionUsage`.

    Returns:
        (CompletionUsage): The instance of `CompletionUsage`
            that is the sum of the two.
    """
    if other is None:
        return self

    if self.always_override_previous_usage:
        return other

    prompt_tokens = self.prompt_tokens + other.prompt_tokens
    completion_tokens = self.completion_tokens + other.completion_tokens
    total_tokens = max(
        self.total_tokens + other.total_tokens,
        prompt_tokens + completion_tokens,
    )
    usage_was_never_given = (
        self.usage_was_never_given or other.usage_was_never_given
    )

    if self.extra_fields is None and other.extra_fields is None:
        extra_fields = None
    else:
        extra_fields = (
            self.extra_fields.copy()
            if self.extra_fields is not None
            else {}
        )
        if other.extra_fields is not None:
            for field, rhs_value in other.extra_fields.items():
                lhs_value = extra_fields.get(field, None)
                match (lhs_value, rhs_value):
                    case (None, None):
                        pass
                    case (None, int()):
                        extra_fields[field] = rhs_value
                    case (int(), None):
                        extra_fields[field] = lhs_value
                    case (int(), int()):  # pragma: no branch
                        extra_fields[field] = lhs_value + rhs_value

    return type(self)(
        model_name=self.model_name,
        prompt_tokens=prompt_tokens,
        completion_tokens=completion_tokens,
        total_tokens=total_tokens,
        extra_fields=extra_fields,
        usage_was_never_given=usage_was_never_given,
    )

__hash__

__hash__() -> int

Hash a completion usage.

RETURNS DESCRIPTION
int

The hash of the completion usage.

Source code in conatus/models/inputs_outputs/usage.py
@override
def __hash__(self) -> int:
    """Hash a completion usage.

    Returns:
        The hash of the completion usage.
    """
    h = 0
    h ^= hash(self.model_name)
    h ^= hash(self.prompt_tokens)
    h ^= hash(self.completion_tokens)
    h ^= hash(self.total_tokens)
    if self.extra_fields is not None:  # pragma: no branch
        for field, value in self.extra_fields.items():
            h ^= hash(field)
            h ^= hash(value)
    h ^= hash(self.usage_was_never_given)
    return h

__radd__

__radd__(other: Self | None) -> Self

Right-hand addition: delegates to __add__.

PARAMETER DESCRIPTION
other

The other instance, or None.

TYPE: Self | None

RETURNS DESCRIPTION
Self

The sum of the two instances.

Source code in conatus/models/inputs_outputs/common.py
def __radd__(self, other: Self | None) -> Self:
    """Right-hand addition: delegates to `__add__`.

    Args:
        other: The other instance, or `None`.

    Returns:
        The sum of the two instances.
    """
    return self.__add__(other)