Gaunt Sloth Assistant - v1.5.0

    Methods

    • Returns Promise<void>

    • Parameters

      • command: GthCommand | undefined
      • configIn: GthConfig
      • Optionalcheckpointer: BaseCheckpointSaver<number>

      Returns Promise<void>

    • Invoke the LLM with messages and a runnable config. For streaming, use the #stream method; streaming is preferred when the model API supports it. Note that when tools are involved, this method will make multiple LLM calls within the LangChain dependency.

      Parameters

      • messages: Message[]
      • runConfig: RunnableConfig

      Returns Promise<string>
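The call shape can be sketched with a stand-in runner. Everything here is an illustrative assumption (MockRunner, the Message and RunnableConfig shapes, and the thread_id key are not the library's actual exports); it only shows how the documented invoke signature is consumed.

```typescript
// Assumed, simplified shapes standing in for the library's types.
type Message = { role: "user" | "assistant"; content: string };
type RunnableConfig = { configurable?: { thread_id?: string } };

// Minimal stand-in exposing the documented invoke(messages, runConfig) signature.
class MockRunner {
  async invoke(messages: Message[], runConfig: RunnableConfig): Promise<string> {
    // A real runner would call the LLM here (possibly several times when tools run).
    return `echo: ${messages[messages.length - 1].content}`;
  }
}

const runner = new MockRunner();
runner
  .invoke(
    [{ role: "user", content: "hello" }],
    // thread_id keys checkpointed state, as required by the resume flow.
    { configurable: { thread_id: "thread-1" } },
  )
  .then((reply) => console.log(reply)); // prints "echo: hello"
```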

    • Induce the LLM to stream AI messages in response to a user message and runnable config. When streaming is not appropriate, use invoke.

      Parameters

      • messages: Message[]
      • runConfig: RunnableConfig

      Returns Promise<IterableReadableStream<string>>
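Because the returned IterableReadableStream<string> is async-iterable, consuming it reduces to a for await loop. The producer below is a stand-in generator (an assumption for illustration), not a real LLM stream:

```typescript
// Stand-in for the stream of string deltas the method resolves to.
async function* fakeStream(): AsyncGenerator<string> {
  for (const chunk of ["Hel", "lo, ", "world"]) yield chunk;
}

// Accumulate deltas as they arrive; works for any async-iterable of strings.
async function collect(stream: AsyncIterable<string>): Promise<string> {
  let text = "";
  for await (const delta of stream) {
    text += delta;
  }
  return text;
}

collect(fakeStream()).then((t) => console.log(t)); // prints "Hello, world"
```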

    • Stream agent events as typed AgentStreamEvent objects. Yields text deltas, tool call lifecycle events, and tool results.

      If a tool with metadata.client === true triggers interrupt(), the underlying graph throws GraphInterrupt; this generator catches it and ends cleanly so the caller's transport (e.g. AG-UI SSE) can finish the run with the tool call left pending. Resume the suspended graph via streamWithEventsResume on the same thread id.

      Parameters

      • messages: Message[]
      • runConfig: RunnableConfig

      Returns AsyncGenerator<AgentStreamEvent>
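A sketch of consuming such an event stream, assuming hypothetical event shapes (the real AgentStreamEvent fields may differ; the discriminated-union style below is only one plausible encoding of text deltas, tool call lifecycle events, and tool results):

```typescript
// Assumed event shapes for illustration; not the library's actual type.
type AgentStreamEvent =
  | { type: "text"; delta: string }
  | { type: "tool_start"; name: string }
  | { type: "tool_result"; name: string; result: string };

// Stand-in producer emitting a typical text/tool/text sequence.
async function* fakeEvents(): AsyncGenerator<AgentStreamEvent> {
  yield { type: "text", delta: "Checking weather" };
  yield { type: "tool_start", name: "get_weather" };
  yield { type: "tool_result", name: "get_weather", result: "sunny" };
  yield { type: "text", delta: " It is sunny." };
}

// Dispatch on the event type, as a transport layer would.
async function render(events: AsyncIterable<AgentStreamEvent>): Promise<string[]> {
  const log: string[] = [];
  for await (const ev of events) {
    switch (ev.type) {
      case "text":
        log.push(`delta:${ev.delta}`);
        break;
      case "tool_start":
        log.push(`start:${ev.name}`);
        break;
      case "tool_result":
        log.push(`result:${ev.result}`);
        break;
    }
  }
  return log;
}

render(fakeEvents()).then((log) => console.log(log.join(" | ")));
```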

    • Resume a graph that was suspended via interrupt() with the supplied value.

      The runnable config must carry the same thread_id used when the graph was suspended (the checkpointer keys state by thread). The resume value is whatever the suspending tool needs back — for frontend-fulfilled tools this is the value the client sends in forwardedProps.command.resume.

      Parameters

      • resumeValue: unknown
      • runConfig: RunnableConfig

      Returns AsyncGenerator<AgentStreamEvent>
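The suspend-and-resume handshake described above can be sketched with a mock graph. All names and shapes here are illustrative assumptions: the real methods live on the library's runner, key state by thread_id via the checkpointer, and raise GraphInterrupt internally rather than setting a flag.

```typescript
// Assumed event shape for illustration only.
type AgentStreamEvent =
  | { type: "text"; delta: string }
  | { type: "tool_call"; name: string };

class MockSuspendableGraph {
  private pendingTool: string | null = null;

  // Emits events until a client-side tool suspends the run, then ends cleanly
  // (mirroring how the real generator swallows GraphInterrupt).
  async *streamWithEvents(): AsyncGenerator<AgentStreamEvent> {
    yield { type: "text", delta: "Need confirmation." };
    yield { type: "tool_call", name: "confirm" };
    this.pendingTool = "confirm"; // suspended; awaiting a resume value
  }

  // Re-enters the suspended run with the client-supplied value.
  async *streamWithEventsResume(resumeValue: unknown): AsyncGenerator<AgentStreamEvent> {
    if (this.pendingTool === null) throw new Error("nothing to resume");
    this.pendingTool = null;
    yield { type: "text", delta: `Resumed with: ${String(resumeValue)}` };
  }
}

async function demo(): Promise<string[]> {
  const graph = new MockSuspendableGraph();
  const out: string[] = [];
  // First pass ends with the tool call pending.
  for await (const ev of graph.streamWithEvents()) {
    if (ev.type === "text") out.push(ev.delta); // forward to the transport
  }
  // Later, the client answers; resume on the same thread with its value.
  for await (const ev of graph.streamWithEventsResume("yes")) {
    if (ev.type === "text") out.push(ev.delta);
  }
  return out;
}

demo().then((out) => console.log(out.join(" | ")));
```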