Wrapper around the TogetherAI API for large language models fine-tuned for chat.

The TogetherAI API is compatible with the OpenAI API, with some limitations. See the TogetherAI API reference for full details.

To use this model, you should have the TOGETHER_AI_API_KEY environment variable set.

Example

```typescript
import { ChatTogetherAI } from "@langchain/community/chat_models/togetherai";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatTogetherAI({
  temperature: 0.9,
  apiKey: process.env.TOGETHER_AI_API_KEY,
});

const response = await model.invoke([new HumanMessage("Hello there!")]);
console.log(response);
```

Hierarchy

Constructors

  • new ChatTogetherAI(fields?): ChatTogetherAI

    Parameters

    • Optional fields: Partial<Omit<OpenAIChatInput, "functions" | "frequencyPenalty" | "presencePenalty" | "openAIApiKey" | "logitBias">> & BaseLanguageModelParams & {
          apiKey?: string;
          togetherAIApiKey?: string;
      }

    Returns ChatTogetherAI
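
    For illustration, a configuration sketch (the model id below is an example; `temperature` and `maxTokens` are among the OpenAIChatInput fields the constructor accepts):

    ```typescript
    import { ChatTogetherAI } from "@langchain/community/chat_models/togetherai";

    // Configuration sketch: the model id is illustrative; any
    // Together-hosted chat model id can be substituted.
    const model = new ChatTogetherAI({
      modelName: "mistralai/Mixtral-8x7B-Instruct-v0.1",
      temperature: 0.7,
      maxTokens: 256,
      apiKey: process.env.TOGETHER_AI_API_KEY, // or pass togetherAIApiKey
    });
    ```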

Methods

  • Calls the TogetherAI API with retry logic in case of failures.

    Parameters

    • request: ChatCompletionCreateParamsStreaming

      The request to send to the TogetherAI API.

    • Optional options: any

      Optional configuration for the API call.

    Returns Promise<AsyncIterable<ChatCompletionChunk>>

    The response from the TogetherAI API.

  • Calls the TogetherAI API with retry logic in case of failures.

    Parameters

    • request: ChatCompletionCreateParamsNonStreaming

      The request to send to the TogetherAI API.

    • Optional options: any

      Optional configuration for the API call.

    Returns Promise<ChatCompletion>

    The response from the TogetherAI API.
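
    Both overloads retry transient failures before surfacing an error. As a self-contained sketch of the kind of logic involved (this is NOT the library's actual implementation, and `withRetry` is a hypothetical helper), retry with exponential backoff can look like:

    ```typescript
    // Retry an async operation with exponential backoff between attempts.
    // NOT the library's actual implementation; a generic illustration only.
    async function withRetry<T>(
      fn: () => Promise<T>,
      maxAttempts = 3,
      baseDelayMs = 100
    ): Promise<T> {
      let lastError: unknown;
      for (let attempt = 0; attempt < maxAttempts; attempt++) {
        try {
          return await fn();
        } catch (err) {
          lastError = err;
          if (attempt < maxAttempts - 1) {
            // Wait 100 ms, 200 ms, 400 ms, ... before the next attempt.
            await new Promise((resolve) =>
              setTimeout(resolve, baseDelayMs * 2 ** attempt)
            );
          }
        }
      }
      // All attempts failed: surface the last error to the caller.
      throw lastError;
    }
    ```

    A request that fails transiently twice and then succeeds would resolve on the third attempt rather than rejecting.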

Generated using TypeDoc