Parse Robots
Parse robots.txt files
Parses robots.txt files and provides related utilities.
Designed so that the sitemap plugin can write the sitemap URL into the robots.txt file without string manipulation.
Install
npm install robots-util
yarn add robots-util
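A quick-start sketch, assuming the package exports the RobotsParser class by name (the export shape is an assumption, check the package source):

// Export name is assumed; adjust if the package exposes a different shape.
const { RobotsParser } = require('robots-util')

const parser = new RobotsParser()
const list = parser.parse('User-Agent: *\nDisallow: /private/')
console.log(list)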
API
RobotsLine
Represents a parsed line in a robots.txt file.
RobotsLine
new RobotsLine(line, index[, key][, value])
Create a RobotsLine.
line String the raw line input.
index Number the zero-based line number.
key String the declaration key.
value String the declaration value.
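For illustration, constructing a line directly (the RobotsLine export name is an assumption):

const { RobotsLine } = require('robots-util')

// Wrap the second source line; the index argument is zero-based.
const line = new RobotsLine('Disallow: /private/', 1)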
.parse
RobotsLine.prototype.parse()
Parse the line into this instance.
Returns this line instance.
.serialize
RobotsLine.prototype.serialize()
Get a serialized line from the current state.
Returns a string line value.
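Assuming the same export, parse() and serialize() should round-trip a single declaration:

const { RobotsLine } = require('robots-util')

// parse() returns the instance, so construction and parsing can chain.
const line = new RobotsLine('Disallow: /private/', 1).parse()
console.log(line.key)         // 'Disallow'
console.log(line.value)       // '/private/'
console.log(line.serialize()) // 'Disallow: /private/'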
RobotsParser
Parse and serialize a robots.txt file.
Designed so that the serialized output has a 1:1 relationship with the source document but allows inspecting and modifying the key and value properties for each line.
.parse
RobotsParser.prototype.parse(content)
Parse the robots.txt file content.
User-Agent: *
Disallow: /private/ # does not block indexing, add meta noindex
Becomes:
[
  {
    key: 'User-Agent',
    value: '*',
    lineno: 1,
    line: 'User-Agent: *'
  },
  {
    key: 'Disallow',
    value: '/private/',
    lineno: 2,
    line: 'Disallow: /private/ # does not block indexing, add meta noindex',
    comment: '# does not block indexing, add meta noindex'
  }
]
Returns an array of line objects.
content String the robots.txt file content.
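For example, parsing the content shown above (exports assumed as before):

const { RobotsParser } = require('robots-util')

const content = 'User-Agent: *\nDisallow: /private/ # does not block indexing, add meta noindex'
const list = new RobotsParser().parse(content)
// list matches the array shown above: one object per source line.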
.serialize
RobotsParser.prototype.serialize(list)
Serialize the robots.txt declaration list.
Returns a string of robots.txt file content.
list Array the parsed robots.txt declaration list.
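A sketch of the modify-and-serialize round trip that motivates this design, assuming serialize() rebuilds each line from its key and value properties:

const { RobotsParser } = require('robots-util')

const parser = new RobotsParser()
const list = parser.parse('User-Agent: *\nDisallow: /private/')

// Modify a declaration value in place, then serialize the list.
list[1].value = '/hidden/'
console.log(parser.serialize(list))
// Expected output (assuming key/value are used when serializing):
// User-Agent: *
// Disallow: /hidden/

The sitemap use case from the introduction should work the same way: append a Sitemap declaration to the list before serializing, rather than manipulating the file content as a string.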
License
MIT