openhands.memory.conversation_memory
ConversationMemory Objects
class ConversationMemory()
Processes event history into a coherent conversation for the agent.
process_events
def process_events(condensed_history: list[Event],
                   initial_messages: list[Message],
                   max_message_chars: int | None = None,
                   vision_is_active: bool = False,
                   enable_som_visual_browsing: bool = False) -> list[Message]
Process the condensed event history into a list of messages for the LLM.
Ensures that tool call actions are processed correctly in function calling mode.
Arguments:
condensed_history
- The condensed list of events to process.
initial_messages
- The initial messages to include in the result.
max_message_chars
- The maximum number of characters in the content of an event included in the prompt to the LLM. Larger observations are truncated.
vision_is_active
- Whether vision is active in the LLM. If True, image URLs will be included.
enable_som_visual_browsing
- Whether to enable visual browsing for the SoM (Set of Marks) model.
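For illustration, a minimal sketch of a process_events call. Only the signature above comes from this section; the memory, condensed_history, and system_messages names are assumptions (see process_initial_messages below for one way to build the initial messages).

# Assumption: `memory` is an already-constructed ConversationMemory instance,
# `condensed_history` is the condensed list[Event] produced upstream, and
# `system_messages` comes from process_initial_messages (documented below).
messages = memory.process_events(
    condensed_history=condensed_history,
    initial_messages=system_messages,
    max_message_chars=30_000,         # illustrative cap; larger observations are truncated
    vision_is_active=False,           # set True to include image URLs for vision-capable LLMs
    enable_som_visual_browsing=False,
)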
process_initial_messages
def process_initial_messages(with_caching: bool = False) -> list[Message]
Create the initial messages for the conversation.
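A short sketch, again assuming an existing ConversationMemory instance named memory; only the with_caching parameter comes from the signature above.

# Build the initial messages (e.g. the system prompt), optionally marking
# them for prompt caching.
system_messages = memory.process_initial_messages(with_caching=True)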
apply_prompt_caching
def apply_prompt_caching(messages: list[Message]) -> None
Applies caching breakpoints to the messages.
For the new Anthropic API, only the last user or tool message needs to be marked as cacheable.
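A hedged sketch of applying caching breakpoints before the messages are sent to the LLM; where in the pipeline this call happens is an assumption.

# Mutates `messages` in place, marking the last user or tool message as cacheable.
memory.apply_prompt_caching(messages)
# `messages` can then be serialized into the LLM request.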