docs[patch]: Use an example file for docs code examples (#3567)
* docs[patch]: Use an example file for docs code examples

* cr

* cr

* cr
bracesproul authored Dec 6, 2023
1 parent f67a293 commit c0bb153
Showing 2 changed files with 52 additions and 22 deletions.
29 changes: 7 additions & 22 deletions docs/core_docs/docs/expression_language/how_to/with_history.mdx
@@ -1,5 +1,11 @@
---
sidebar_class_name: hidden
hide_table_of_contents: true
---

import CodeBlock from "@theme/CodeBlock";
import Example from "@examples/guides/expression_language/runnable_history.ts";
import ExampleConstructorConfig from "@examples/guides/expression_language/runnable_history_constructor_config.ts";

# Add message history (memory)

@@ -29,25 +35,4 @@ To do this, the only change you need to make is remove the second arg (or just t

This is a simple example building on top of what we have above:

```typescript
const config: RunnableConfig = { configurable: { sessionId: "1" } };

const withHistory = new RunnableWithMessageHistory({
runnable,
getMessageHistory: (_sessionId: string) => messageHistory,
inputMessagesKey: "input",
historyMessagesKey: "history",
// Passing config through here instead of through the invoke method
config,
});

let output = await withHistory.invoke({ input: "Hello there, I'm Archibald!" });
console.log("output 1:", output);
/**
output 1: AIMessage {
lc_namespace: [ 'langchain_core', 'messages' ],
content: 'Hello, Archibald! How can I assist you today?',
additional_kwargs: { function_call: undefined, tool_calls: undefined }
}
*/
```
<CodeBlock language="typescript">{ExampleConstructorConfig}</CodeBlock>
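
For context, the snippet removed above contrasts with the variant where the session config is passed as the second argument to `.invoke()` instead of to the constructor. A rough sketch of that variant, reusing the same setup as the new example file below (not part of this commit):

```typescript
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ChatMessageHistory } from "langchain/stores/message/in_memory";
import { ChatPromptTemplate, MessagesPlaceholder } from "langchain/prompts";
import { RunnableWithMessageHistory } from "langchain/runnables";

// Same prompt/model runnable and in-memory history as the example file below.
const model = new ChatOpenAI({});
const prompt = ChatPromptTemplate.fromMessages([
  ["ai", "You are a helpful assistant"],
  new MessagesPlaceholder("history"),
  ["human", "{input}"],
]);
const runnable = prompt.pipe(model);
const messageHistory = new ChatMessageHistory();

// No `config` passed to the constructor here...
const withHistory = new RunnableWithMessageHistory({
  runnable,
  getMessageHistory: (_sessionId: string) => messageHistory,
  inputMessagesKey: "input",
  historyMessagesKey: "history",
});

// ...so the session config goes in as the second argument to `.invoke()` instead.
const output = await withHistory.invoke(
  { input: "Hello there, I'm Archibald!" },
  { configurable: { sessionId: "1" } }
);
console.log("output:", output);
```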
45 changes: 45 additions & 0 deletions runnable_history_constructor_config.ts (the new example file imported above)
@@ -0,0 +1,45 @@
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ChatMessageHistory } from "langchain/stores/message/in_memory";
import { ChatPromptTemplate, MessagesPlaceholder } from "langchain/prompts";
import {
RunnableConfig,
RunnableWithMessageHistory,
} from "langchain/runnables";

// Construct your runnable with a prompt and chat model.
const model = new ChatOpenAI({});
const prompt = ChatPromptTemplate.fromMessages([
["ai", "You are a helpful assistant"],
new MessagesPlaceholder("history"),
["human", "{input}"],
]);
const runnable = prompt.pipe(model);
const messageHistory = new ChatMessageHistory();

// Define a RunnableConfig object, with a `configurable` key.
const config: RunnableConfig = { configurable: { sessionId: "1" } };
const withHistory = new RunnableWithMessageHistory({
runnable,
getMessageHistory: (_sessionId: string) => messageHistory,
inputMessagesKey: "input",
historyMessagesKey: "history",
// Passing config through here instead of through the invoke method
config,
});

const output = await withHistory.invoke({
input: "Hello there, I'm Archibald!",
});
console.log("output:", output);
/**
output: AIMessage {
lc_namespace: [ 'langchain_core', 'messages' ],
content: 'Hello, Archibald! How can I assist you today?',
additional_kwargs: { function_call: undefined, tool_calls: undefined }
}
*/

/**
* You can see the LangSmith traces here:
* output @link https://smith.langchain.com/public/ee264a77-b767-4b5a-8573-efcbebaa5c80/r
*/
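
Not part of the committed file, but as a hedged sketch of a follow-up turn: because the constructor already carries the `sessionId` config and `getMessageHistory` returns the same `ChatMessageHistory`, a second `.invoke()` can reference the first turn without passing any config.

```typescript
// Continuing from the example above: the second call reuses the stored
// history, so the model can answer based on the earlier turn.
const followUp = await withHistory.invoke({
  input: "What did I just say my name was?",
});
console.log("follow-up:", followUp.content);
// Expected to be along the lines of: "You said your name was Archibald."
// (The exact wording depends on the model.)
```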

2 comments on commit c0bb153


@vercel vercel bot commented on c0bb153 Dec 6, 2023



@vercel vercel bot commented on c0bb153 Dec 6, 2023


Successfully deployed to the following URLs:

langchainjs-docs – ./docs/core_docs/

langchainjs-docs-langchain.vercel.app
langchainjs-docs-ruddy.vercel.app
js.langchain.com
langchainjs-docs-git-main-langchain.vercel.app
