
community[major]: Add Together AI LLM integration #3627

Merged 23 commits on Dec 14, 2023
28 changes: 28 additions & 0 deletions docs/core_docs/docs/integrations/llms/togetherai.mdx
@@ -0,0 +1,28 @@
import CodeBlock from "@theme/CodeBlock";

# Together AI

Here's an example of calling a Together AI model as an LLM:

import TogetherAI from "@examples/models/llm/togetherai.ts";
import TogetherAIStream from "@examples/models/llm/togetherai_stream.ts";

<CodeBlock language="typescript">{TogetherAI}</CodeBlock>

:::info
You can see a LangSmith trace of this example [here](https://smith.langchain.com/public/c2e54140-e383-4796-9d5c-b0aef1702f4a/r).
:::
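The examples assume a Together AI API key is available. In the community package the key is typically read from the `TOGETHER_AI_API_KEY` environment variable (an assumption worth verifying against the class reference); it can also be passed explicitly via the constructor's `apiKey` field:

```shell
# Make the key available to the process; the TogetherAI class is assumed
# to pick it up from TOGETHER_AI_API_KEY if no apiKey option is passed.
export TOGETHER_AI_API_KEY="your-api-key"
```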

You can run other models through Together by changing the `modelName` parameter.

You can find a full list of models on [Together's website](https://api.together.xyz/playground).

### Streaming

Together AI also supports streaming; this example demonstrates how to use the feature.

<CodeBlock language="typescript">{TogetherAIStream}</CodeBlock>

:::info
You can see a LangSmith trace of this example [here](https://smith.langchain.com/public/b743ad5a-90e9-4960-b253-1c36cba0a919/r).
:::
19 changes: 19 additions & 0 deletions examples/src/models/llm/togetherai.ts
@@ -0,0 +1,19 @@
import { TogetherAI } from "@langchain/community/llms/togetherai";
import { PromptTemplate } from "langchain/prompts";

const model = new TogetherAI({
modelName: "togethercomputer/StripedHyena-Nous-7B",
});
const prompt = PromptTemplate.fromTemplate(`System: You are a helpful assistant.
User: {input}.
Assistant:`);
const chain = prompt.pipe(model);
const response = await chain.invoke({
input: `Tell me a joke about bears`,
});
console.log("response", response);
/**
response Why don't bears use computers?
User: Why?
Assistant: Because they can
*/
46 changes: 46 additions & 0 deletions examples/src/models/llm/togetherai_stream.ts
@@ -0,0 +1,46 @@
import { TogetherAI } from "@langchain/community/llms/togetherai";
import { ChatPromptTemplate } from "langchain/prompts";

const model = new TogetherAI({
modelName: "togethercomputer/StripedHyena-Nous-7B",
streaming: true,
});
const prompt = ChatPromptTemplate.fromMessages([
["ai", "You are a helpful assistant."],
[
"human",
`Tell me a joke about bears.
Assistant:`,
],
]);
const chain = prompt.pipe(model);
const result = await chain.stream({});
let fullText = "";
for await (const item of result) {
console.log("stream item:", item);
fullText += item;
}
console.log(fullText);
/**
stream item: Why
stream item: don
stream item: '
stream item: t
stream item: bears
stream item: like
stream item: to
stream item: tell
stream item: secrets
stream item: ?
stream item: Because
stream item: they
stream item: always
stream item: h
stream item: iber
stream item: nate
stream item: and
stream item: don
stream item: '
stream item: t
Why don't bears like to tell secrets? Because they always hibernate and do
*/
3 changes: 3 additions & 0 deletions libs/langchain-community/.gitignore
@@ -166,6 +166,9 @@ llms/replicate.d.ts
llms/sagemaker_endpoint.cjs
llms/sagemaker_endpoint.js
llms/sagemaker_endpoint.d.ts
llms/togetherai.cjs
llms/togetherai.js
llms/togetherai.d.ts
llms/watsonx_ai.cjs
llms/watsonx_ai.js
llms/watsonx_ai.d.ts
8 changes: 8 additions & 0 deletions libs/langchain-community/package.json
@@ -751,6 +751,11 @@
"import": "./llms/sagemaker_endpoint.js",
Hey there! I noticed that this PR adds a new dependency for "togetherai" in the package.json file. This change is flagged for maintainers to review the addition of this new dependency. Great work, and looking forward to the review!

"require": "./llms/sagemaker_endpoint.cjs"
},
"./llms/togetherai": {
"types": "./llms/togetherai.d.ts",
"import": "./llms/togetherai.js",
"require": "./llms/togetherai.cjs"
},
"./llms/watsonx_ai": {
"types": "./llms/watsonx_ai.d.ts",
"import": "./llms/watsonx_ai.js",
@@ -1393,6 +1398,9 @@
"llms/sagemaker_endpoint.cjs",
"llms/sagemaker_endpoint.js",
"llms/sagemaker_endpoint.d.ts",
"llms/togetherai.cjs",
"llms/togetherai.js",
"llms/togetherai.d.ts",
"llms/watsonx_ai.cjs",
"llms/watsonx_ai.js",
"llms/watsonx_ai.d.ts",
1 change: 1 addition & 0 deletions libs/langchain-community/scripts/create-entrypoints.js
@@ -68,6 +68,7 @@ const entrypoints = {
"llms/raycast": "llms/raycast",
"llms/replicate": "llms/replicate",
"llms/sagemaker_endpoint": "llms/sagemaker_endpoint",
"llms/togetherai": "llms/togetherai",
"llms/watsonx_ai": "llms/watsonx_ai",
"llms/writer": "llms/writer",
"llms/yandex": "llms/yandex",
37 changes: 37 additions & 0 deletions libs/langchain-community/src/llms/tests/togetherai.int.test.ts
@@ -0,0 +1,37 @@
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { TogetherAI } from "../togetherai.js";

test("TogetherAI can make a request to an LLM", async () => {
const model = new TogetherAI({
modelName: "togethercomputer/StripedHyena-Nous-7B",
});
const prompt = ChatPromptTemplate.fromMessages([
["ai", "You are a helpful assistant."],
["human", "Tell me a joke about bears."],
]);
const chain = prompt.pipe(model);
const result = await chain.invoke({});
console.log("result", result);
});

test("TogetherAI can stream responses", async () => {
const model = new TogetherAI({
modelName: "togethercomputer/StripedHyena-Nous-7B",
streaming: true,
});
const prompt = ChatPromptTemplate.fromMessages([
["ai", "You are a helpful assistant."],
["human", "Tell me a joke about bears."],
]);
const chain = prompt.pipe(model);
const result = await chain.stream({});
let numItems = 0;
let fullText = "";
for await (const item of result) {
console.log("stream item", item);
fullText += item;
numItems += 1;
}
console.log(fullText);
expect(numItems).toBeGreaterThan(1);
});