fusion-plugin-robots
v1.0.0
A Fusion.js plugin that instructs web robots (typically search engine crawlers) how to crawl pages on your website.
Handles creating and serving a robots.txt file for your Fusion app.
Table of contents

- Installation
- Setup
Installation
```sh
yarn add fusion-plugin-robots
```
Setup
Create a const in `src/robots.js` to hold your robots.txt content.
```js
// src/robots.js
export const Robots = `User-agent: googlebot
Disallow: /directory1/
Disallow: /directory2/
Allow: /directory2/subdirectory1/

# Block the entire site from anothercrawler.
User-agent: anothercrawler
Disallow: /`
```
Register `RobotsToken` with your robots content, and register `RobotsPlugin` for the `__NODE__` environment.
```js
// src/main.js
import App from 'fusion-react'
import RobotsPlugin, {RobotsToken} from 'fusion-plugin-robots'
import {Robots} from './robots'
import root from './root' // your app's root React element

export default () => {
  const app = new App(root)
  if (__NODE__) {
    app.register(RobotsToken, Robots)
    app.register(RobotsPlugin)
  }
  return app
}
```
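Conceptually, a plugin like this is just server middleware that intercepts requests for `/robots.txt`. The sketch below is illustrative only, assuming the Koa-style middleware signature that Fusion.js plugins use; it is not the plugin's actual implementation:

```javascript
// Illustrative sketch of a robots middleware (Koa-style signature, as
// used by Fusion.js server plugins): when the request path matches
// /robots.txt, respond with the registered content as plain text.
function robotsMiddleware(robotsContent) {
  return (ctx, next) => {
    if (ctx.path === '/robots.txt') {
      ctx.type = 'text/plain';   // crawlers expect a plain-text body
      ctx.body = robotsContent;  // the string registered on the token
    }
    return next();
  };
}

// Simulated request passing through the middleware:
const mw = robotsMiddleware('User-agent: *\nDisallow: /private/');
const ctx = {path: '/robots.txt'};
mw(ctx, () => Promise.resolve());
```

Because the check runs only on the server, the plugin is registered inside the `if (__NODE__)` block, keeping it out of the browser bundle.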