env-robotstxt
v1.0.4
Generate robots.txt based on env content for static websites.
Features
- Set an env variable to generate the robots.txt file
- Generates Disallow: / by default when no env variable is set (see the example below)
- Can generate the file into a different folder
- Easy to configure robots.txt per git branch (different branch, different robots content)
- Works with all static website projects: HTML, Vue, React, Angular, PHP, etc.
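For example, when no env variable is set, the default output disallows all crawlers. A sketch of the expected robots.txt (the User-agent line is an assumption; only the Disallow: / rule is documented as the default):

User-agent: *
Disallow: /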
Install
npm install env-robotstxt
# or
yarn add env-robotstxt
Config
Add a robot command (or your preferred command name) to the scripts object in package.json:
"scripts": {
"robot": "npx env-robotstxt" // default
},
"scripts": {
"robot": "npx env-robotstxt dist" // generate into `dist` folder
},
Then set the ENV_ROBOTS_CONTENT env variable:
ENV_ROBOTS_CONTENT="User-agent: *\nDisallow: \nSitemap: https://domain.name/sitemap.xml"
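Because the content comes from an env variable, per-branch values are easy to wire up in a build or deploy script. A minimal sketch, assuming a POSIX shell, a branch named main, and that the variable is read from the process environment (if your setup reads a .env file instead, write the same ENV_ROBOTS_CONTENT line there):

# Hypothetical deploy step: allow crawlers only on the main branch
BRANCH="$(git rev-parse --abbrev-ref HEAD)"
if [ "$BRANCH" = "main" ]; then
  export ENV_ROBOTS_CONTENT="User-agent: *\nDisallow: \nSitemap: https://domain.name/sitemap.xml"
else
  export ENV_ROBOTS_CONTENT="User-agent: *\nDisallow: /"
fi
npm run robot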
Run
npm run robot
or
yarn robot
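With the ENV_ROBOTS_CONTENT value shown above, the generated robots.txt should look like this (a sketch, assuming the \n escape sequences are expanded into line breaks):

User-agent: *
Disallow:
Sitemap: https://domain.name/sitemap.xml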