grammY helpers for Vercel
Collection of useful methods to run your bot on Vercel
How to ...
Install
npm i vercel-grammy
Import
import {/* methods */} from "vercel-grammy"
Use
import {Bot} from "grammy"
import {getURL} from "vercel-grammy"
const url = getURL({path: "api/index"})
const bot = new Bot(/* token */)
await bot.api.setWebhook(url)
Examples
Get current hostname
// Anywhere in your code
getHost() // *.vercel.app (from `process.env.VERCEL_URL`)
// At your function handler
export default ({headers}) => {
getHost({headers}) // domain.com (from `x-forwarded-host` header)
}
Get URL for current hostname
// Anywhere in your code
getURL({path: "api/index"}) // https://*.vercel.app/api/index
// At your function handler
export default ({headers}) => {
getURL({headers, path: "api/index"}) // https://domain.com/api/index
}
Set webhook for current hostname
// Anywhere in your code
bot.api.setWebhook(getURL({path: "api/index"}))
// As function handler
export default setWebhookCallback(bot, {path: "api/index"})
Use streaming response in webhook handler
Note that this will only work on Vercel Edge Functions
// As function handler
export default webhookStream(bot) // Instead of webhookCallback(bot)
export const config = {
runtime: "edge"
}
Guides
Setting the webhook URL automatically
When you deploy a project to Vercel, it is assigned one of these environments:
- production — the default for the main or master branch
- preview — for all other branches in your repository
- development — when using the vercel dev command
In the early stages of bot development, it is enough to set the webhook on the main (production) domain, such as project.vercel.app.
However, if you want to test new changes without stopping the bot, you can simply use a separate (test) bot (for example, @awesome_beta_bot) and point its webhook to the branch URL, such as project-git-branch-username.vercel.app.
But what if you have several branches with different changes and want to test them without creating a separate bot for each or manually managing webhooks?
Q: You didn't make a separate plugin for this, did you?
A: 😏
Q: You did, didn't you?
Thanks to the Vercel build step, we can run some code before a new version of the bot is published, and nothing stops us from using that.
Just add this code to a new JavaScript file:
import {Bot} from "grammy"
import {getURL} from "vercel-grammy"

const {VERCEL_ENV} = process.env

// List of allowed environments
const allowedEnvs = [
    "production",
    "preview"
]

// Exit in case of unsuitable environments
if (!allowedEnvs.includes(VERCEL_ENV)) process.exit()

const bot = new Bot(/* token */)

// Webhook URL generation
const url = getURL({path: "api/index"})

// Setting the webhook
await bot.api.setWebhook(url)
And specify the path to it in the vercel.json file:
{
"buildCommand": "node path/to/new/file.js"
}
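If your project already has a build step, the webhook script can presumably be chained after it, since buildCommand is a regular shell command (the framework build command below is only an assumption):
{
  "buildCommand": "next build && node path/to/new/file.js"
}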
By the way, you can manage tokens for each environment (or even branch) in the project settings.
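If the bot token is read from an environment variable, the same code can then drive a different bot per environment. A minimal sketch, assuming the variable is named BOT_TOKEN:
import {Bot} from "grammy"

// Vercel injects the value configured for the current environment (or branch),
// so previews and production can point to different bots without code changes
const bot = new Bot(process.env.BOT_TOKEN)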
Avoiding invocation timeouts
By default, Vercel limits the invocation time for your code:
- 10 seconds for Serverless Functions
  - 60 seconds on the Pro plan
  - 900 seconds on the Enterprise plan
- 25 seconds for Edge Functions
  - 1 000 seconds with a streaming response
So, without streaming (and paying), you can get at most 25 seconds with the default grammY webhookCallback adapter on Edge Functions.
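For reference, a plain (non-streaming) Edge handler would look roughly like this; a sketch assuming grammY's webhookCallback with the std/http adapter and its timeoutMilliseconds option:
import {Bot, webhookCallback} from "grammy"

const bot = new Bot(/* token */)

// All update processing has to finish before the Edge invocation limit is hit
export default webhookCallback(bot, "std/http", {
    timeoutMilliseconds: 25_000
})

export const config = {
    runtime: "edge"
}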
On the other hand, Telegram also limits the time to respond to an incoming request to 60 seconds, after which the request is considered unsuccessful and will be retried, which you probably don't want.
To get around these limitations, you can proxy the request before it reaches your function, following this scheme:
- Telegram sends an update request
- The proxy service passes the original request to your function
- An answer given within 60 seconds is returned to Telegram
- Otherwise, the proxy responds with a 200 status to prevent a retry
- Your function may continue to work for the next 940 seconds
Q: What proxy server is suitable for this?
A: I don't know, but I made one 🙂
Proxy
Source: ProlongRequest
Endpoint: https://prolong-request.fly.dev
Reference:
/domain.com
/http://domain.com
/https://domain.com
/https://domain.com/path/to/file.txt
/https://domain.com/route?with=parameters
It also supports any HTTP method and forwards the raw headers and body.
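For example, a hedged sketch of sending an arbitrary request through the proxy (domain.com is just a placeholder):
// The proxy forwards the method, headers and body to the URL encoded in its path
const response = await fetch(
    "https://prolong-request.fly.dev/https://domain.com/route?with=parameters",
    {method: "POST", body: JSON.stringify({hello: "world"})}
)

console.log(response.status)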
How to use this for a bot
Just prepend the proxy endpoint to the webhook URL:
https://prolong-request.fly.dev/https://*.vercel.app/api/index
Or do it automatically:
const proxy = "https://prolong-request.fly.dev"
const url = getURL({path: "api/index"})
bot.api.setWebhook(`${proxy}/${url}`)
And use a streaming response in the webhook handler:
export default webhookStream(bot, {
    timeoutMilliseconds: 999 // you can also control the timeout here
})
export const config = {
runtime: "edge"
}
Limitations
- Processing of updates will overlap
- States and sessions will be inconsistent
- Requests may break and will not be retried
Benefits
- You can do anything during this time
- You can wait for anything within this time
- You can solve anything using this time
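For example, with the proxy and a streaming handler in place, a bot can afford work that a plain webhook could not (the long-running operation below is just a placeholder):
import {Bot} from "grammy"

const bot = new Bot(/* token */)

bot.on("message", async (ctx) => {
    // Placeholder for work that takes far longer than a regular webhook allows
    await new Promise((resolve) => setTimeout(resolve, 5 * 60 * 1000))

    await ctx.reply("Done, five minutes later")
})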
API
getHost([options])
- options (object, optional) — Options for hostname
  - headers (Headers, optional) — Headers from incoming request
  - header (string, optional) — Header name which contains the hostname
  - fallback (string, optional) — Fallback hostname (process.env.VERCEL_URL by default)
- returns string — Target hostname
This method generates a hostname from the options passed to it
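For example, a sketch that uses the header and fallback options explicitly (the values are illustrative):
import {getHost} from "vercel-grammy"

export default ({headers}) => {
    // Reads the hostname from the given header, falling back to VERCEL_URL
    const host = getHost({
        headers,
        header: "x-forwarded-host",
        fallback: process.env.VERCEL_URL
    })

    return new Response(host)
}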
getURL([options])
- options (object, optional) — Options for URL
  - path (string, optional) — Path to a function that receives updates
  - host (string, optional) — Hostname without protocol (replaces getHost options)
  - ...options (object, optional) — Options for getHost
- returns string — Target URL
This method generates a URL from the options passed to it
setWebhookCallback(bot[, options])
- bot (Bot, required) — grammY bot instance
- options (object, optional) — Options for webhooks
  - url (string, optional) — URL for webhooks (replaces getURL options)
  - onError ("throw" | "return", optional) — Strategy for handling errors
  - allowedEnvs (array, optional) — List of environments where this method is allowed
  - ...options (object, optional) — Options for bot.api.setWebhook
  - ...options (object, optional) — Options for getURL
- returns () => Promise<Response> — Target callback method

Callback factory for the grammY bot.api.setWebhook method
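For example, a sketch of an endpoint that registers the webhook only from the listed environments and uses the "return" error strategy:
import {Bot} from "grammy"
import {setWebhookCallback} from "vercel-grammy"

const bot = new Bot(/* token */)

// Requesting this endpoint sets the webhook to the current deployment URL
export default setWebhookCallback(bot, {
    path: "api/index",
    allowedEnvs: ["production", "preview"],
    onError: "return"
})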
webhookStream(bot[, options])
- bot (Bot, required) — grammY bot instance
- options (object, optional) — Options for stream
  - chunk (string, optional) — Content for chunks
  - intervalMilliseconds (number, optional) — Interval for writing chunks to stream
  - ...options (object, optional) — Options for webhookCallback
- returns () => Response — Target callback method
Callback factory for streaming webhook response
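For example, a sketch combining the listed options so that a filler chunk is written to the stream every second while updates are processed (the values are illustrative):
import {Bot} from "grammy"
import {webhookStream} from "vercel-grammy"

const bot = new Bot(/* token */)

export default webhookStream(bot, {
    chunk: ".", // content written to the stream
    intervalMilliseconds: 1000 // how often it is written
})

export const config = {
    runtime: "edge"
}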
jsonResponse(value[, options])
- value (any, required) — Serializable value
- options (object, optional) — Options for JSON response
  - replacer ((string | number)[] | null | undefined, optional)
  - space (string | number | undefined, optional)
  - ...options (ResponseInit, optional)
- returns Response — Target JSON Response
This method generates Response objects for JSON
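For example, a minimal sketch of returning a JSON response from a function handler:
import {jsonResponse} from "vercel-grammy"

// Serializes the value and returns it with the given ResponseInit fields
export default () => jsonResponse({ok: true}, {
    space: 2,
    status: 200
})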
Templates using this package
- For Vercel Edge Functions
- For Vercel Edge Functions with streaming response
- For Vercel Serverless Functions
Made with 💜 by Vladislav Ponomarev