wintersmith-robots v0.2.6
A Wintersmith plugin that generates a robots.txt file, giving you sitewide and per-page control over search-engine indexing.
Install
npm install wintersmith-robots
Add wintersmith-robots and wintersmith-contents to your config.json:
{
  "plugins": [
    "wintersmith-contents",
    "wintersmith-robots"
  ]
}
Use
Set sitewide options in Wintersmith's config.json. If noindex is set globally, your entire site will be blocked from crawlers.
{
  "locals": {
    "sitemap": "sitemap.xml",
    "noindex": false
  }
}
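For orientation, here is what a generated robots.txt typically looks like for the configuration above, using standard Robots Exclusion Protocol directives. This is a sketch of conventional robots.txt output, not the plugin's verbatim result, which may differ by version:

```
# noindex is false: crawling is allowed and the sitemap is advertised
User-agent: *
Disallow:
Sitemap: /sitemap.xml
```

With noindex set to true globally, the Disallow line would instead read `Disallow: /`, blocking the entire site.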
Set per-page options at the top of your Markdown files. For instance, you can prevent an article from being indexed like so:
---
noindex: true
---
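Assuming the plugin translates page-level noindex flags into Disallow rules in the generated robots.txt (the path below is a hypothetical example, not output confirmed from the plugin), a site with one excluded article might produce something like:

```
# /posts/secret-article/ is a hypothetical URL for a page with noindex: true
User-agent: *
Disallow: /posts/secret-article/
Sitemap: /sitemap.xml
```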