@nuxtjs/robots
v5.0.1
Tame the robots crawling and indexing your Nuxt site with ease.
Nuxt Robots is a module for controlling how robots crawl and index your Nuxt site, with minimal config and best-practice defaults.
The core features of the module are:
- Telling crawlers which paths they can and cannot access using a robots.txt file.
- Telling search engine crawlers what they can show in search results from your site using a `<meta name="robots" content="index">` meta tag or an `X-Robots-Tag` HTTP header (see the sketch after this list).
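Here is a minimal sketch of what that looks like in a Nuxt config. The `robots.disallow` and `robots.sitemap` option names are assumptions based on typical usage of the module; check the documentation for the exact keys your version supports.

```ts
// nuxt.config.ts — a minimal sketch, not the definitive config surface.
// The `disallow` / `sitemap` options are assumptions; consult the module docs.
import { defineNuxtConfig } from 'nuxt/config'

export default defineNuxtConfig({
  modules: ['@nuxtjs/robots'],
  robots: {
    // Paths crawlers should not request — emitted as Disallow rules in robots.txt
    disallow: ['/admin', '/internal'],
    // Optionally advertise your sitemap from robots.txt
    sitemap: ['/sitemap.xml'],
  },
})
```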
New to robots or SEO? Check out the Controlling Web Crawlers guide to learn more about why you might need these features.
Features
- 🤖 Merge in your existing robots.txt or programmatically create a new one
- 🗿 Automatic `X-Robots-Tag` header and `<meta name="robots" ...>` meta tag
- 🔄 Integrates with route rules and runtime hooks (see the sketch after this list)
- 🔒 Disables indexing of non-production environments
- Solves common issues with best-practice default config
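As a rough illustration of the route-rules integration, the sketch below marks a path as non-indexable. The `robots: false` shorthand is an assumption here, so treat it as illustrative rather than the exact API.

```ts
// nuxt.config.ts — hypothetical route-rule usage; `robots: false` is assumed.
import { defineNuxtConfig } from 'nuxt/config'

export default defineNuxtConfig({
  modules: ['@nuxtjs/robots'],
  routeRules: {
    // Keep crawlers out of the admin area: the module can turn this into a
    // robots.txt Disallow rule plus a noindex meta tag / X-Robots-Tag header.
    '/admin/**': { robots: false },
  },
})
```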
Installation
💡 Using Nuxt 2? Please use the v3.x tag.
Install the `@nuxtjs/robots` dependency to your project:
npx nuxi@latest module add robots
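The command installs the package and registers the module in your Nuxt config, leaving you with roughly the following (shown here as a sketch):

```ts
// nuxt.config.ts — roughly what the command above produces
import { defineNuxtConfig } from 'nuxt/config'

export default defineNuxtConfig({
  modules: ['@nuxtjs/robots'],
})
```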
💡 Need a complete SEO solution for Nuxt? Check out Nuxt SEO.
Documentation
📖 Read the full documentation for more information.
License
Licensed under the MIT license.