Timeouts on Vercel deployments #506
Is it a typo that you never register the composer instance called middleware? Either way, this should not affect the reported problem. If you don't register it, the middleware tree will just complete faster. As a first step, I would try to simplify the code so you can narrow down where the problem is. Do you use serverless functions or edge functions? The correct adapter is different for the two runtimes. Did you try the minimal example for Vercel from our example bots repository at https://github.com/grammyjs/examples/tree/main/setups, or did you follow the guide at https://grammy.dev/hosting/vercel? What changes if you use a minimal example with a single bot instance, rather than creating the bot on the fly? (I don't think it's related to that, but this just means that you can throw it out in order to pinpoint the issue.) |
Hey @KnorpelSenf! I'm using Serverless Functions for API routes. I also tried to simplify the implementation. Regarding the middleware, it's not a typo; I omitted some code (I don't know why, actually). Here is the full route.
|
I also tried to log every step with Sentry and can see that the latest log is displayed in my console. So, my assumption is that something goes wrong in the final handleUpdate call:

import { NextRequest, NextResponse } from "next/server";
import {
Bot,
Composer,
webhookCallback,
session,
enhanceStorage,
} from "grammy";
import { conversations, createConversation } from "@grammyjs/conversations";
import { freeStorage } from "@grammyjs/storage-free";
import * as Sentry from "@sentry/nextjs"
import { handlers } from "@/bot/handlers";
import { start } from "@/bot/commands/start";
import { request } from "@/bot/commands/request";
import { request as requestConversation } from "@/bot/conversations/request";
import { details } from "@/bot/callbacks/details";
import { supabase } from "@/utils/supabase";
import type { Context } from "@/bot/types";
export const POST = async (req: NextRequest, ...args: any[]) => {
const { data, error } = await supabase
.from("companies")
.select("bot_token")
.eq("slug", req.headers.get("host")!.split(".")[0])
.single();
if (error) {
return NextResponse.json({ ...error }, { status: 500 });
}
const token = data.bot_token;
Sentry.captureMessage('middleware = new Composer')
const middleware = new Composer<Context>();
Sentry.captureMessage('middleware = new Composer done')
Sentry.captureMessage('middleware Composer use')
middleware.command("start", start);
middleware.command("request", request);
Sentry.captureMessage('middleware Composer use done')
Sentry.captureMessage(`new Bot: ${token}`)
const bot = new Bot<Context>(token);
Sentry.captureMessage('new Bot done')
Sentry.captureMessage('bot.use')
bot.use(
session({
initial: () => ({
name: "",
slug: "",
phone: "",
}),
storage: enhanceStorage({
storage: freeStorage(token),
millisecondsToLive: 10 * 60 * 1000,
}),
})
);
Sentry.captureMessage('bot.use done')
Sentry.captureMessage('bot.use conversations')
bot.use(conversations());
bot.use(createConversation(requestConversation, "request"));
Sentry.captureMessage('bot.use conversations done')
Sentry.captureMessage('bot.use handlers')
bot.use(handlers);
bot.use(middleware);
Sentry.captureMessage('bot.use handlers done')
Sentry.captureMessage('bot.callbackQuery')
bot.callbackQuery("request", (ctx) => ctx.conversation.enter("request"));
bot.callbackQuery("details", details);
Sentry.captureMessage('bot.callbackQuery done')
Sentry.captureMessage('bot.message')
bot.on("message", async (ctx) => {
await ctx.reply("Я поки що не знаю що з цим робити");
});
Sentry.captureMessage('bot.message done')
bot.catch((err) => {
Sentry.captureException(err);
return NextResponse.json({ ...err }, { status: 500 });
});
Sentry.captureMessage('handleUpdate')
const handleUpdate = webhookCallback(bot, "std/http", "throw", 15_000);
Sentry.captureMessage('handleUpdate done')
return handleUpdate(req, ...args);
}; |
Ah well, we do not have support for that. The way these framework adapters work is that you first need to take a look at how your particular server expects its middleware to be. For Next.js, a middleware looks like this:

import { NextResponse } from 'next/server'
import type { NextRequest } from 'next/server'
export function middleware(request: NextRequest) {
return NextResponse.next();
}

grammY defines its framework adapters in this file: https://github.com/grammyjs/grammY/blob/1c238c0f08df047dc8dc11dd069519cc4c68b7ee/src/convenience/frameworks.ts

Look at how every adapter maps function signatures like the above to a generic handler. (We might be able to add support by simply returning a suitable response object.) Until then, grammY provides a callback adapter that works with any framework; see grammY/src/convenience/webhook.ts, lines 14 to 24, at 1c238c0.
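To illustrate the idea (a hypothetical sketch, not grammY's actual source; the field names and the adaptStdHttp helper are assumptions), an adapter essentially maps a framework-specific request to a generic bundle:

```typescript
// Hypothetical sketch (not grammY source): an adapter turns a
// framework-specific request into a generic bundle that the
// webhook handler can consume without knowing the framework.
interface GenericRequest {
  update: Promise<unknown>;            // the parsed Telegram update
  header?: string;                     // optional secret-token header
  respond: (json: string) => Response; // how to answer the request
}

function adaptStdHttp(req: Request): GenericRequest {
  return {
    update: req.json(),
    header: req.headers.get("X-Telegram-Bot-Api-Secret-Token") ?? undefined,
    respond: (json) =>
      new Response(json, { headers: { "Content-Type": "application/json" } }),
  };
}
```

Each adapter in frameworks.ts performs this kind of mapping for its own framework's request and response types.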
This means that something similar to the following code will work (no promises, coded on github.com):

import { NextResponse } from 'next/server'
import type { NextRequest } from 'next/server'
import { webhookCallback } from 'grammy'
import { bot } from './bot' // wherever your Bot instance is defined

const handleUpdate = webhookCallback(bot, "callback");

export async function middleware(request: NextRequest) {
  const update = await request.json();
  const header = request.headers.get("X-Telegram-Bot-Api-Secret-Token") ?? undefined;
  return await handleUpdate(update, (json: string) => new NextResponse(json), header);
}

I'm curious if this works, please keep us posted. |
@KnorpelSenf looking into the code I see that
|
You're right, this needs to be fixed. It was caused by an incomplete refactoring some time ago.
Hmmm then perhaps you need to go even one more step back and provide a |
@all-contributors add @thecoorum for the bug |
I've put up a pull request to add @thecoorum! 🎉 |
I see that Also, as I mentioned before the default configuration for |
I guess you need to create a NextResponse object?
Right. I have no idea about this one. I have never used nextjs myself. What is the difference between the local and the production environment? (grammY itself certainly doesn't behave differently, it is not aware of its surroundings.) |
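One way to pin down such environment differences (an illustrative diagnostic, not something from the thread) is to log what each runtime actually provides and diff the output between dev and prod:

```typescript
// Diagnostic sketch: collect runtime details so the dev and prod
// environments can be compared side by side.
const runtimeInfo = {
  node: typeof process !== "undefined" ? process.version : "n/a",
  hasFetch: typeof fetch === "function",
  hasResponse: typeof Response === "function",
};
console.log(JSON.stringify(runtimeInfo));
```

If the global Request/Response/Promise implementations differ between the two environments, this kind of probe would surface it.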
Looking into the structure of I suspect that maybe |
I tried to extract the logic from the adapter into the route directly:

import { type NextRequest, NextResponse } from "next/server";
import { Bot } from "grammy";
const bot = new Bot(process.env.REQUESTS_BOT_TOKEN!);
bot.on("message", async (ctx) => {
await ctx.reply("Ping");
});
function timeoutIfNecessary(
task: Promise<void>,
onTimeout: "throw" | "return" | (() => unknown),
timeout: number
): Promise<void> {
if (timeout === Infinity) return task;
return new Promise((resolve, reject) => {
const handle = setTimeout(() => {
if (onTimeout === "throw") {
reject(new Error(`Request timed out after ${timeout} ms`));
} else {
if (typeof onTimeout === "function") onTimeout();
resolve();
}
}, timeout);
task
.then(resolve)
.catch(reject)
.finally(() => clearTimeout(handle));
});
}
export const POST = async (req: NextRequest) => {
let initialized = false;
if (!initialized) {
await bot.init();
initialized = true;
}
let usedWebhookReply = false;
const webhookReplyEnvelope = {
send: async (json: any) => {
usedWebhookReply = true;
await new Promise((resolve) => resolve(NextResponse.json(json)));
},
};
await timeoutIfNecessary(
bot.handleUpdate(await req.json(), webhookReplyEnvelope),
"throw",
10_000
);
if (!usedWebhookReply) {
return NextResponse.json(null, { status: 200 });
}
}; |
This sort of gives me the feeling that neither of us is making obvious mistakes in the code. It sort of boils down to differences between dev and prod, such as having different implementations of global objects like Request/Response/Promise. The above code is a fairly short example that reproduces the issue (https://sscce.org). It could be a good idea to contact the people from nextjs to find out why the code behaves differently.
This is expected. It is only present if you configure it when setting your webhook. |
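For context, here is a sketch of how that header could be checked once it is configured (the SECRET constant and the helper are illustrative, not grammY API; Telegram only sends the header if a secret_token was supplied to setWebhook):

```typescript
// Illustrative helper (not grammY API): Telegram echoes the
// secret_token passed to setWebhook back in this header, so a
// route can reject requests that lack the expected value.
const SECRET = "change-me"; // assumption: the token you chose

function isFromTelegram(req: Request): boolean {
  return req.headers.get("X-Telegram-Bot-Api-Secret-Token") === SECRET;
}
```

Without a configured secret_token, the header is simply absent, which matches the behaviour observed above.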
I opened a discussion in the NextJS repo, let's see if any useful suggestions will appear there |
Nice, subscribed. You may wanna include the above code in the discussion so that people don't need to understand grammY before they're able to look into the issue. What happens if you throw out the |
Do you mean just calling |
Yep! That would be the next step in narrowing down the problem. By continuing to remove seemingly unrelated code, we either end up removing the code that causes the problem, or we end up with a tiny bit of code that causes the problem. Either way, we will have isolated it, which allows the bug to be fixed (either by us or by them). |
Hey @KnorpelSenf! Sorry for the late reply, I wasn't able to test out your suggestion over the weekend. I tried to implement it now, but it's still failing with a timeout. Including the source code of the endpoint and a screenshot of the log for triggering the endpoint:
import { type NextRequest, NextResponse } from "next/server";
import { Bot } from "grammy";
import { supabase } from "@/utils/supabase";
const bot = new Bot(process.env.REQUESTS_BOT_TOKEN!);
bot.on("message::bot_command", async (ctx) => {
// Match command pattern /process_<id>
const match = ctx.message!.text!.match(/^\/process_(\d+)$/);
if (!match) return;
const id = match[1];
const { data, error } = await supabase
.from("requests")
.select()
.eq("id", id)
.single();
if (error) {
await ctx.reply("Виникла помилка при завантаженні заявки.");
await ctx.reply(error.message);
return;
}
await ctx.reply(
`
<pre><code>
company_name: ${data.company_name}
company_slug: ${data.company_slug}
phone_number: ${data.phone_number}
user_id: ${data.user_id}
user_username: ${data.user_username}
</code></pre>
`
);
});
bot.on("message", async (ctx) => {
await ctx.reply("Ping");
});
export const POST = async (req: NextRequest) => {
let initialized = false;
if (!initialized) {
await bot.init();
initialized = true;
}
let usedWebhookReply = false;
const webhookReplyEnvelope = {
send: async (json: any) => {
usedWebhookReply = true;
await new Promise((resolve) => resolve(NextResponse.json(json)));
},
};
await bot.handleUpdate(await req.json(), webhookReplyEnvelope);
if (!usedWebhookReply) {
return NextResponse.json(null, { status: 200 });
}
}; |
Awesome! Just to be sure, the webhook reply envelope is never used, right? You didn't enable the feature. So you should be able to empty it out. This should leave you with <20 lines of code that have virtually no logic and still reproduce the issue. Can you confirm? (Perhaps you now see where I'm going with this.) |
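A sketch of what the stripped-down route might look like (an assumption, not code from the thread; web-standard Request/Response stand in for the Next.js types, and the grammY call is indicated in a comment so the snippet stays self-contained):

```typescript
// Hypothetical minimal reproduction: no sessions, no conversations,
// no webhook-reply envelope, and an empty message handler.
// In the actual route file this would be `export const POST = ...`.
const POST = async (req: Request): Promise<Response> => {
  const update = await req.json(); // the Telegram update payload
  void update; // the real route would call: await bot.handleUpdate(update)
  return new Response(null, { status: 200 });
};
```

If this skeleton still times out on Vercel, the problem clearly lies outside the bot logic.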
I'm so glad there is at least an issue... I've been dancing with it for too long... I use the latest grammY |
Nobody really knows. I'm not using nextjs so I haven't investigated it. |
Very interesting stuff. @thecoorum can you confirm that this fixes it? |
At some point I decided to migrate my bot to Deno, so it will take me some time to replicate the existing bot back on Next.js. I will post an update as soon as I do some testing |
Hmm, despite upgrading Vercel's Node version, it still times out. Here is the current code:

import { NextRequest } from "next/server";
import {
Bot,
Composer,
webhookCallback,
session,
enhanceStorage,
} from "grammy";
import { conversations, createConversation } from "@grammyjs/conversations";
import { freeStorage } from "@grammyjs/storage-free";
import { handlers } from "@/bot/handlers";
import { start } from "@/bot/commands/start/admin";
import { request } from "@/bot/commands/request";
import { description } from "@/bot/commands/description";
import { process as processCommand } from "@/bot/commands/process";
import { request as requestConversation } from "@/bot/conversations/request";
import { process as processConversation } from "@/bot/conversations/process";
import { details } from "@/bot/callbacks/details";
import type { Context } from "@/bot/types";
const token = process.env.ADMIN_BOT_TOKEN!;
const middleware = new Composer<Context>();
middleware.command("start", start);
middleware.on("message::bot_command", processCommand);
const bot = new Bot<Context>(token);
bot.use(
session({
initial: () => ({}),
storage: enhanceStorage<{}>({
storage: freeStorage(token),
millisecondsToLive: 10 * 60 * 1000,
}),
})
);
bot.use(conversations());
bot.use(createConversation(requestConversation, "request"));
bot.use(createConversation(processConversation, "process"));
bot.use(handlers);
bot.use(middleware);
bot.callbackQuery("request", (ctx) => ctx.conversation.enter("request"));
bot.callbackQuery("description", details);
bot.on("message", async (ctx) => {
// ...
});
// bot.catch((error) => {
// Sentry.captureException(error);
// });
const handleUpdate = webhookCallback(bot, "std/http");
export const POST = async (req: NextRequest, ...rest: any[]) => {
return handleUpdate(req, ...rest);
}; |
@thecoorum I thought it was the Node version, but then it failed again, so now I know it's not.. but the good news is that it doesn't matter, because I made it work anyway :) This is my webhook code in src/app/api/bot/route.ts:
This is the script I run after each Next.js build:
My local Node version is v20
What is the key difference between this and the code in the original issue description? |
I can't see a real difference between the initial code and the working one, only the export code style, but that shouldn't matter |
Just to be clear, the two of you are using the same code with the same hosting provider and you observe different behaviour? That means that it isn't related to your code, but rather to something else entirely. I honestly don't see how grammY can have something to do with this, so I don't think it will be fixed in the library (unless new evidence shows up). Feel free to close this issue, or keep it open and discuss further, whatever you prefer. :) |
I will close this, as I do not see what we can do here. Feel free to reopen if you find out more things, and especially so if you can narrow down that there is a problem with grammY. |
Oh, ok.. Now it is my turn 🚬 |
I think I have found a solution to the issue. The key lies in how Next.js handles server dependencies during build time. Just add the serverComponentsExternalPackages option to your Next.js config. My code:

Route:

// src/app/api/bot/route.ts
import { NextRequest } from 'next/server';
import { Bot, webhookCallback } from 'grammy';
export const POST = async (req: NextRequest, ...args: any[]) => {
const token = process.env.TELEGRAM_TOKEN;
if (!token) throw new Error('TELEGRAM_TOKEN is unset');
const bot = new Bot(token);
bot.command('start', ctx => ctx.reply('Ласкаво просимо! Бот запущений.'));
bot.on('message', ctx => ctx.reply('Отримав ще одне повідомлення!'));
const handleUpdate = webhookCallback(bot, 'std/http', 'throw', 10000);
return handleUpdate(req, ...args);
};

Next.js config:

// next.config.mjs
/** @type {import('next').NextConfig} */
const nextConfig = {
experimental: {
serverComponentsExternalPackages: ['grammy']
}
};
export default nextConfig;

I hope this will be useful to someone who also decides to create a Telegram bot with grammY and Next.js. |
Interesting stuff, thanks for sharing. By the way, the following code:

// src/app/api/bot/route.ts
import { NextRequest } from 'next/server';
import { Bot, webhookCallback } from 'grammy';
export const POST = async (req: NextRequest, ...args: any[]) => {
const token = process.env.TELEGRAM_TOKEN;
if (!token) throw new Error('TELEGRAM_TOKEN is unset');
const bot = new Bot(token);
bot.command('start', ctx => ctx.reply('Ласкаво просимо! Бот запущений.'));
bot.on('message', ctx => ctx.reply('Отримав ще одне повідомлення!'));
const handleUpdate = webhookCallback(bot, 'std/http', 'throw', 10000);
return handleUpdate(req, ...args);
};

is a little inefficient because it recreates the bot for every update. This also means that it will have to re-initialize for every update, i.e. call bot.init() again each time. Here is the optimised version:

// src/app/api/bot/route.ts
import { Bot, webhookCallback } from 'grammy';
const token = process.env.TELEGRAM_TOKEN;
if (!token) throw new Error('TELEGRAM_TOKEN is unset');
const bot = new Bot(token);
bot.command('start', ctx => ctx.reply('Ласкаво просимо! Бот запущений.'));
bot.on('message', ctx => ctx.reply('Отримав ще одне повідомлення!'));
export const POST = webhookCallback(bot, 'std/http'); |
@KnorpelSenf, thank you |
You helped me a lot! Thank you very much!! |
Perhaps we can add this info to the vercel setup in the example bots repository? @triken22 would you like to take care of that? |
I'm using the following configuration for handling webhooks. While it works in development (API route build time is under 1.5 s), Vercel constantly reports a function timeout without a response. Some of the commands send an image from a static host with ctx.replyWithPhoto. The std/http method is the only one working for me; neither http/https nor next-js works, because of different issues.

Any suggestions or recommendations?
Thanks in advance!