Conversation chat memory that keeps a token-bounded buffer of the dialogue. It extends the BaseChatMemory class and implements the ConversationTokenBufferMemoryInput interface.

Example

import { ConversationTokenBufferMemory } from "langchain/memory";
import { ChatOpenAI } from "@langchain/openai";

const memory = new ConversationTokenBufferMemory({
  llm: new ChatOpenAI({}),
  maxTokenLimit: 10,
});

// Save conversation context
await memory.saveContext({ input: "hi" }, { output: "whats up" });
await memory.saveContext({ input: "not much you" }, { output: "not much" });

// Load memory variables
const result = await memory.loadMemoryVariables({});
console.log(result);

Hierarchy

Implements

Constructors

Properties

aiPrefix: string = "AI"
chatHistory: BaseChatMessageHistory
humanPrefix: string = "Human"
llm: BaseLanguageModelInterface<any, BaseLanguageModelCallOptions>
maxTokenLimit: number = 2000
memoryKey: string = "history"
returnMessages: boolean = false
inputKey?: string
outputKey?: string
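The defaults listed above can be pictured as a plain options object. The interface below is an illustrative sketch that mirrors the documented properties and their default values; it is not the library's internal type.

```typescript
// Hypothetical sketch of the documented options and defaults (not library source).
interface TokenBufferMemoryOptions {
  aiPrefix?: string;        // label for AI turns in the string buffer
  humanPrefix?: string;     // label for human turns
  maxTokenLimit?: number;   // prune history beyond this many tokens
  memoryKey?: string;       // key under which history is returned
  returnMessages?: boolean; // return message objects instead of a string
  inputKey?: string;        // optional: which input field to record
  outputKey?: string;       // optional: which output field to record
}

const defaults: Required<
  Omit<TokenBufferMemoryOptions, "inputKey" | "outputKey">
> = {
  aiPrefix: "AI",
  humanPrefix: "Human",
  maxTokenLimit: 2000,
  memoryKey: "history",
  returnMessages: false,
};
```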

Accessors

Methods

  • Loads the memory variables. It takes an InputValues object as a parameter and returns a Promise that resolves with a MemoryVariables object.

    Parameters

    • _values: InputValues

      InputValues object.

    Returns Promise<MemoryVariables>

    A Promise that resolves with a MemoryVariables object.
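When returnMessages is false, the loaded history is a single string built from the buffered messages using humanPrefix and aiPrefix. The helper below is a hypothetical sketch illustrating that shape, not the library's actual formatter.

```typescript
// Hypothetical sketch: how the string buffer could be built from messages.
type SimpleMessage = { role: "human" | "ai"; text: string };

function formatBuffer(
  messages: SimpleMessage[],
  humanPrefix = "Human",
  aiPrefix = "AI"
): string {
  return messages
    .map((m) => `${m.role === "human" ? humanPrefix : aiPrefix}: ${m.text}`)
    .join("\n");
}

const history = formatBuffer([
  { role: "human", text: "hi" },
  { role: "ai", text: "whats up" },
]);
// history === "Human: hi\nAI: whats up"
```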

  • Saves the context from this conversation to the buffer. If the token count of the buffer exceeds maxTokenLimit, the oldest messages are pruned until it fits.

    Parameters

    • inputValues: InputValues
    • outputValues: OutputValues

    Returns Promise<void>
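    The pruning step can be sketched as follows. This is illustrative only: it uses a stand-in whitespace token counter, whereas the real class asks the configured LLM for token counts.

    ```typescript
    // Illustrative sketch of token-based pruning; not the library's implementation.
    type Msg = { text: string };

    // Stand-in token counter (assumption: real counts come from the LLM's tokenizer).
    const countTokens = (msgs: Msg[]): number =>
      msgs.reduce((n, m) => n + m.text.split(/\s+/).length, 0);

    function prune(buffer: Msg[], maxTokenLimit: number): Msg[] {
      const pruned = [...buffer];
      // Drop the oldest messages until the buffer fits the token budget.
      while (pruned.length > 0 && countTokens(pruned) > maxTokenLimit) {
        pruned.shift();
      }
      return pruned;
    }

    const kept = prune(
      [{ text: "hi there friend" }, { text: "not much you" }, { text: "not much" }],
      5
    );
    // Oldest messages are dropped first; only the most recent ones remain.
    ```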

Generated using TypeDoc