A Wintersmith plugin that generates a `robots.txt` file, giving you sitewide and per-page control over search-engine indexing.
```
npm install wintersmith-robots
```
Add `wintersmith-robots` and `wintersmith-contents` to your `config.json`:
```json
{
  "plugins": [
    "wintersmith-contents",
    "wintersmith-robots"
  ]
}
```
Set sitewide options in Wintersmith's `config.json`. If `noindex` is set globally, your entire site will be blocked from crawlers.
```json
{
  "locals": {
    "sitemap": "sitemap.xml",
    "noindex": false
  }
}
```
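With the settings above, the generated file would look something like the sketch below, assuming the plugin writes a standard allow-all rule plus a `Sitemap` directive (the host and exact formatting here are illustrative, not verbatim plugin output):

```
# Illustrative output only; the plugin's exact formatting may differ.
User-agent: *
Disallow:

Sitemap: http://example.com/sitemap.xml
```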
Set per-page options at the top of your Markdown files. For instance, you can prevent an article from being indexed like so:
```yaml
---
noindex: true
---
```
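Assuming the plugin implements per-page `noindex` by emitting a `Disallow` rule for each flagged page (the path below is hypothetical), that article would then appear in the generated `robots.txt` like so:

```
# Hypothetical entry for a page with noindex: true in its metadata.
User-agent: *
Disallow: /articles/my-article/
```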