How to stream response using Next.js Serverless functions instead? #33
@GorvGoyl You could technically do that: https://vercel.com/blog/streaming-for-serverless-node-js-and-edge-runtimes-with-vercel-functions As mentioned in that blog post, it is only available in a few environments today. For a Next.js app, for example, streaming serverless functions are only available in Next.js 13.2+, and only in Route Handlers (that are not prerendered). But in most cases you should consider using the edge runtime instead, since serverless functions are (it's fair to say "very") expensive in terms of both cost and performance. Also, when streaming a response, edge functions have no hard limit on how long they can keep streaming data after the initial HTTP response, while serverless functions are still subject to their execution timeout limits.
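For reference, here is a minimal sketch of a streaming Route Handler pinned to the Node.js (serverless) runtime, as the blog post describes for Next.js 13.2+. The file path and the chunk contents are illustrative assumptions, not taken from this thread:

```typescript
// app/api/stream/route.ts (hypothetical path)
// A minimal streaming Route Handler on the Node.js runtime.
// Assumes Node.js 18+, where Response and ReadableStream are globals.
export const runtime = "nodejs"; // opt out of the edge runtime

export async function GET(): Promise<Response> {
  const encoder = new TextEncoder();
  // Enqueue chunks one at a time; a real handler would enqueue data
  // as it arrives from an upstream source instead of a fixed array.
  const stream = new ReadableStream<Uint8Array>({
    start(controller) {
      for (const chunk of ["hello", " ", "world"]) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    },
  });
  return new Response(stream, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```

The key point is returning a `Response` whose body is a `ReadableStream`; on a supported runtime, Next.js will flush chunks to the client as they are enqueued rather than buffering the whole body.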
Thanks. I use Firebase to authenticate the user (on each API request) first, but the Firebase package isn't officially supported in non-Node runtimes, hence the reason for choosing serverless functions.
There's a library, https://github.com/awinogrodzki/next-firebase-auth-edge, that lets you implement authentication on the edge today. And you can further control Middleware invocations using the …
I can't use edge functions because some of my dependencies (Firebase) require Node.js. So is there a way to stream the response from OpenAI and pass it to the frontend using serverless functions?
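One way to sketch this, assuming the Node.js runtime from the earlier discussion: separate out a small helper that turns any async iterable of text chunks (such as the deltas yielded by a streaming OpenAI completion) into a web `ReadableStream` that a Route Handler can return directly. The helper name and the commented handler snippet are assumptions for illustration:

```typescript
// Hypothetical helper: convert an async iterable of strings into a
// ReadableStream of encoded bytes. Assumes Node.js 18+ globals.
export function iterableToStream(
  chunks: AsyncIterable<string>
): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream<Uint8Array>({
    async start(controller) {
      // Pull chunks as they arrive and forward them to the client.
      for await (const chunk of chunks) {
        controller.enqueue(encoder.encode(chunk));
      }
      controller.close();
    },
  });
}

// Sketch of use in a Node-runtime Route Handler (names are assumptions):
//   const completion = await openai.chat.completions.create({ ..., stream: true });
//   return new Response(iterableToStream(textDeltasOf(completion)));
```

Because the helper is independent of OpenAI, the Firebase authentication check can run first in the same Node.js handler before the stream is returned.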