📦 NEW: Online Store Customer Service Example With Agents #28
base: main
Conversation
Reviewer has to change the environment variables before deploying.
```js
const body = await req.json()
const { messages, threadId } = body

const response = await fetch('http://localhost:8787', {
```
@AIENGINE Please use our SDK instead of calling the API directly. I also see that you're calling a localhost URL here; not sure why, though.
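One way to address the hardcoded localhost URL is to read the backend address from configuration instead. This is only a sketch: `BACKEND_URL` is an assumed variable name, not something defined in this PR.

```typescript
// Sketch only: resolve the worker URL from an environment variable so the
// reviewer doesn't have to edit source before deploying. BACKEND_URL is an
// assumed name for illustration.
function resolveBackendUrl(env: Record<string, string | undefined>): string {
  // Fall back to the local dev worker when no variable is set.
  return env.BACKEND_URL ?? 'http://localhost:8787'
}

// e.g. const response = await fetch(resolveBackendUrl(process.env), { ... })
console.log(resolveBackendUrl(process.env))
```

On Cloudflare Workers the equivalent would be a binding passed to the handler rather than `process.env`, but the lookup-with-fallback pattern is the same.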
Sure, will do. This chatbot has a split implementation: a frontend (Next.js) and a backend (Cloudflare Worker).
```diff
@@ -0,0 +1,43 @@
+import { OpenAIStream, StreamingTextResponse } from 'ai'
```
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
@AIENGINE Our SDK has its own streaming functions. No need to use this package.
```js
})

// Create a stream from the response
const stream = OpenAIStream(response)
```
@AIENGINE I see that in the example pipe of this demo you have disabled the streaming mode of the pipe, so I'm not sure why you're using a stream here after that. Also, with the current setting you will get a JSON string as the response; you will need to parse it, extract the message, and then return that message to the frontend.
Please let me know if you have any questions.
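The parse-and-extract step described above can be sketched as follows. The `completion` field name is an assumption for illustration; check the actual non-streaming response shape of the pipe.

```typescript
// Sketch only: with streaming disabled on the pipe, the worker responds with
// a JSON string rather than a token stream. The "completion" field is an
// assumed response shape, not confirmed by this PR.
function extractMessage(rawBody: string): string {
  const data = JSON.parse(rawBody)
  // Pull the assistant message out of the parsed JSON and return it so the
  // route handler can send plain text back to the frontend.
  return data.completion
}

// Illustrative non-streaming response body:
const raw = JSON.stringify({ completion: 'Your order has shipped.' })
console.log(extractMessage(raw)) // → Your order has shipped.
```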
📦 NEW: Online store customer service example with agents, based on https://langbase.com/examples/online-store-customer-service
This chatbot consists of a frontend (online-cs-chatbot) and a backend (Cloudflare Worker, online-cs-chatbot-backend).
The following files were updated:
The following files were added: