memory (Optional)

apply(inputs, config?)
  Deprecated: use .batch() instead. Will be removed in 0.2.0.
  Call the chain on all inputs in the list.
  Optional config: (RunnableConfig | CallbackManager | (BaseCallbackHandler | BaseCallbackHandlerMethodsClass)[])[]
Optional
config: RunnableConfig | CallbackManager | (BaseCallbackHandler | BaseCallbackHandlerMethodsClass)[]Optional
tags: string[]Use .invoke() instead. Will be removed in 0.2.0.
Run the core logic of this chain and add to output if desired.
Wraps _call and handles memory.
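The relationship between the deprecated .call() entry point and the .invoke() replacement can be sketched with a simplified stand-in class. MockChain and its fields are invented for illustration; this is not the real LangChain Chain implementation.

```typescript
// Hypothetical sketch: both entry points funnel into the protected _call,
// which holds the chain's core logic. New code should prefer .invoke().
type ChainValues = Record<string, unknown>;

class MockChain {
  // Deprecated-style entry point: wraps _call (and, in the real library,
  // would also handle memory).
  async call(values: ChainValues): Promise<ChainValues> {
    return this._call(values);
  }

  // Runnable-style entry point that replaces .call.
  async invoke(values: ChainValues): Promise<ChainValues> {
    return this._call(values);
  }

  protected async _call(values: ChainValues): Promise<ChainValues> {
    return { ...values, routed: true };
  }
}

async function main() {
  const chain = new MockChain();
  // Same result either way; only the .invoke() form survives 0.2.0.
  console.log(await chain.invoke({ input: "hello" }));
}
main();
```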
serialize()
  Return a JSON-like object representing this chain.
Static deserialize(data)
  Load a chain from a JSON-like object describing it.
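The serialize()/deserialize() pair forms a round-trip contract: dump the chain to a JSON-like object, rebuild it later from that object. The sketch below illustrates the pattern with an invented serialized shape; the real LangChain format differs.

```typescript
// Illustrative round-trip: serialize() produces a plain object,
// the static deserialize() reconstructs an equivalent chain from it.
interface SerializedMockChain {
  _type: string;
  routerKey: string;
}

class MockRouterChain {
  constructor(public routerKey: string) {}

  // Return a JSON-like object representing this chain.
  serialize(): SerializedMockChain {
    return { _type: "mock_router", routerKey: this.routerKey };
  }

  // Load a chain from a JSON-like object describing it.
  static deserialize(data: SerializedMockChain): MockRouterChain {
    if (data._type !== "mock_router") {
      throw new Error(`Unknown chain type: ${data._type}`);
    }
    return new MockRouterChain(data.routerKey);
  }
}

const original = new MockRouterChain("destination");
const restored = MockRouterChain.deserialize(original.serialize());
console.log(restored.routerKey); // "destination"
```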
Static fromLLM(llm, prompt, options?)
  A static method that creates an instance of LLMRouterChain from a BaseLanguageModel and a BasePromptTemplate. It takes an optional options object and returns an instance of LLMRouterChain with the specified LLMChain.

  llm: A BaseLanguageModel instance.
  prompt: A BasePromptTemplate instance.
  Optional options: Omit<LLMRouterChainInput, "llmChain">, an LLMRouterChainInput object excluding "llmChain".

  Returns an instance of LLMRouterChain.
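The factory pattern behind fromLLM can be sketched as follows: build the inner LLMChain from the model and prompt, then pass it, together with any remaining options, to the constructor. Every class here (FakeLLM, FakePrompt, FakeLLMChain, FakeLLMRouterChain) is a simplified stand-in invented for illustration, not the real LangChain implementation.

```typescript
// Stand-ins for BaseLanguageModel and BasePromptTemplate.
class FakeLLM {}
class FakePrompt {}

class FakeLLMChain {
  constructor(public llm: FakeLLM, public prompt: FakePrompt) {}
}

interface RouterChainInput {
  llmChain: FakeLLMChain;
  verbose?: boolean;
}

class FakeLLMRouterChain {
  llmChain: FakeLLMChain;
  verbose: boolean;

  constructor(input: RouterChainInput) {
    this.llmChain = input.llmChain;
    this.verbose = input.verbose ?? false;
  }

  // Mirrors the documented signature: options excludes "llmChain",
  // because the factory constructs that field itself.
  static fromLLM(
    llm: FakeLLM,
    prompt: FakePrompt,
    options?: Omit<RouterChainInput, "llmChain">
  ): FakeLLMRouterChain {
    const llmChain = new FakeLLMChain(llm, prompt);
    return new FakeLLMRouterChain({ ...options, llmChain });
  }
}

const chain = FakeLLMRouterChain.fromLLM(new FakeLLM(), new FakePrompt(), {
  verbose: true,
});
console.log(chain.verbose); // true
```

Excluding "llmChain" from the options type prevents callers from passing a chain that would immediately be overwritten by the one the factory builds.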
Generated using TypeDoc
A class that represents an LLM router chain in the LangChain framework. It extends the RouterChain class and implements the LLMRouterChainInput interface. It provides additional functionality specific to LLMs and routing based on LLM predictions.
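The routing idea the class implements can be sketched conceptually: ask a model which destination should handle an input, then dispatch to the matching chain. In this sketch the "prediction" is a trivial stub and the destination names are invented; a real LLMRouterChain prompts an actual LLM to choose.

```typescript
// Conceptual sketch of routing based on a model's prediction.
type Handler = (input: string) => string;

// Hypothetical destination chains, keyed by name.
const destinations: Record<string, Handler> = {
  math: (q) => `math chain answers: ${q}`,
  general: (q) => `general chain answers: ${q}`,
};

// Stub prediction: a real router would prompt an LLM for the destination.
function predictDestination(input: string): string {
  return /\d/.test(input) ? "math" : "general";
}

function route(input: string): string {
  const dest = predictDestination(input);
  const handler = destinations[dest] ?? destinations.general;
  return handler(input);
}

console.log(route("What is 2 + 2?")); // handled by the "math" destination
console.log(route("Tell me a story")); // handled by the "general" destination
```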