Optional resolvers: AgentResolvers
Optional checkpointer: BaseCheckpointSaver<number>

Stream agent events as typed AgentStreamEvent objects. Yields text deltas, tool call lifecycle events, and tool results.
If a tool with metadata.client === true triggers interrupt(), the underlying
graph throws GraphInterrupt; this generator catches it and ends cleanly so the
caller's transport (e.g. AG-UI SSE) can finish the run with the tool call left
pending. Resume the suspended graph via streamWithEventsResume on the same
thread id.
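A minimal sketch of consuming such a stream. The event shape and the early return on interrupt are assumptions modeled on the description above, not the library's exact AgentStreamEvent definition; mockStream stands in for the real generator.

```typescript
// Assumed event union; the real AgentStreamEvent may carry more fields.
type AgentStreamEvent =
  | { type: "text-delta"; delta: string }
  | { type: "tool-call-start"; name: string; id: string }
  | { type: "tool-result"; id: string; result: unknown };

// Stand-in for agent.streamWithEvents(...): it ends early, as the real
// generator does when a client-side tool triggers interrupt() — the
// GraphInterrupt is caught internally and the generator simply returns,
// leaving tool call "tc-1" pending for the client to fulfil.
async function* mockStream(): AsyncGenerator<AgentStreamEvent> {
  yield { type: "text-delta", delta: "Hello" };
  yield { type: "tool-call-start", name: "confirm", id: "tc-1" };
}

// The caller just iterates; no try/catch is needed because the
// interrupt never surfaces as an error.
async function consume(): Promise<string[]> {
  const seen: string[] = [];
  for await (const ev of mockStream()) {
    seen.push(ev.type);
  }
  return seen;
}
```

Because the generator terminates normally, an SSE transport can flush and close the response while the tool call remains unresolved on the thread.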
Resume a graph that was suspended via interrupt() with the supplied value.
The runnable config must carry the same thread_id used when the graph was
suspended (the checkpointer keys state by thread). The resume value is whatever
the suspending tool needs back — for frontend-fulfilled tools this is the value
the client sends in forwardedProps.command.resume.
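A sketch of assembling the resume call's inputs, assuming the forwardedProps shape named above; buildResume is a hypothetical helper, not part of the library.

```typescript
// Shape of the client payload described above (assumption).
type ForwardedProps = { command?: { resume?: unknown } };

// Pull the resume value out of forwardedProps and build the runnable
// config. thread_id must match the suspended run, because the
// checkpointer keys saved state by thread.
function buildResume(threadId: string, props: ForwardedProps) {
  return {
    resumeValue: props.command?.resume,
    config: { configurable: { thread_id: threadId } },
  };
}
```

The resulting resumeValue and config would then be handed to streamWithEventsResume; resuming with a different thread_id would start from empty state instead of the suspended checkpoint.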
Invoke the LLM with a message and runnable config. For streaming use the #stream method; streaming is preferred when the model API supports it. Note that when tools are involved, this method will still make multiple LLM calls internally via the LangChain dependency.
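A toy illustration of why a single invoke can mean several model calls when tools are involved: the first call returns a tool call, the tool result is fed back, and a second call produces the final answer. All names here are illustrative, not the library's API.

```typescript
// Minimal model output: either a tool call request or final text.
type ModelOut = { toolCall?: string; text?: string };

// Loop the model until it stops requesting tools, counting LLM calls.
// This mirrors, in spirit, what the LangChain tool loop does inside invoke.
function invokeWithTools(
  model: (msgs: string[]) => ModelOut,
  tools: Record<string, (arg: string) => string>,
  message: string,
): { text: string; llmCalls: number } {
  const msgs = [message];
  let calls = 0;
  for (;;) {
    calls++;
    const out = model(msgs);
    if (!out.toolCall) return { text: out.text ?? "", llmCalls: calls };
    msgs.push(tools[out.toolCall](message)); // append tool result, call again
  }
}
```

Even this two-step case doubles latency and token cost versus a plain completion, which is one reason streaming progress to the caller is preferable.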