Gathering detailed insights and metrics for robots
Related packages:

- @nuxtjs/robots - Tame the robots crawling and indexing your Nuxt site with ease.
- robots-parser - A specification-compliant robots.txt parser with wildcard (*) matching support.
- nuxt-simple-robots - Tame the robots crawling and indexing your Nuxt site with ease.
- generate-robotstxt - Awesome robots.txt generator.
npm install robots
Languages: JavaScript (98.89%), Makefile (1.11%)
- License: MIT
- Stars: 67
- Commits: 97
- Forks: 21
- Watchers: 5
- Branches: 1
- Contributors: 12
- Updated: Dec 30, 2024
- Latest Version: 0.10.1
- Package Id: robots@0.10.1
- Size: 11.05 kB
- NPM Version: 4.1.2
- Node Version: 7.5.0
robots.js is a parser for robots.txt files for Node.js.
It's recommended to install via npm:
```
$ npm install -g robots
```
Here's an example of using robots.js:
```js
var robots = require('robots')
  , parser = new robots.RobotsParser();

parser.setUrl('http://nodeguide.ru/robots.txt', function(parser, success) {
  if(success) {
    parser.canFetch('*', '/doc/dailyjs-nodepad/', function (access) {
      if (access) {
        // parse url
      }
    });
  }
});
```
The default crawler user-agent is:
Mozilla/5.0 (X11; Linux i686; rv:5.0) Gecko/20100101 Firefox/5.0
Here's an example of using another user-agent and a more detailed callback:
```js
var robots = require('robots')
  , parser = new robots.RobotsParser(
      'http://nodeguide.ru/robots.txt',
      'Mozilla/5.0 (compatible; RobotTxtBot/1.0)',
      after_parse
    );

function after_parse(parser, success) {
  if(success) {
    parser.canFetch('*', '/doc/dailyjs-nodepad/', function (access, url, reason) {
      if (access) {
        console.log(' url: ' + url + ', access: ' + access);
        // parse url ...
      }
    });
  }
}
```
Here's an example of getting the list of sitemaps:
```js
var robots = require('robots')
  , parser = new robots.RobotsParser();

parser.setUrl('http://nodeguide.ru/robots.txt', function(parser, success) {
  if(success) {
    parser.getSitemaps(function(sitemaps) {
      // sitemaps is an array
    });
  }
});
```
Here's an example of getCrawlDelay usage:
```js
var robots = require('robots')
  , parser = new robots.RobotsParser();

// for example:
//
// $ curl -s http://nodeguide.ru/robots.txt
//
// User-agent: Google-bot
// Disallow: /
// Crawl-delay: 2
//
// User-agent: *
// Disallow: /
// Crawl-delay: 2

parser.setUrl('http://nodeguide.ru/robots.txt', function(parser, success) {
  if(success) {
    var GoogleBotDelay = parser.getCrawlDelay("Google-bot");
    // ...
  }
});
```
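The returned delay can then be used to throttle a crawler. A minimal sketch, assuming getCrawlDelay returns the Crawl-delay value in seconds, and using a hypothetical fetchPage helper for the actual request:

```js
var robots = require('robots')
  , parser = new robots.RobotsParser();

parser.setUrl('http://nodeguide.ru/robots.txt', function(parser, success) {
  if (!success) return;
  // Assumption: getCrawlDelay returns the Crawl-delay value in seconds.
  var delay = (parser.getCrawlDelay('Google-bot') || 0) * 1000;
  var urls = ['/doc/a/', '/doc/b/', '/doc/c/'];

  (function next() {
    var url = urls.shift();
    if (!url) return;
    fetchPage(url);          // hypothetical helper that performs the HTTP request
    setTimeout(next, delay); // wait Crawl-delay seconds between requests
  })();
});
```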
An example of passing options to the HTTP request:
```js
var options = {
  headers: {
    Authorization: "Basic " + new Buffer("username:password").toString("base64")
  }
};

var robots = require('robots')
  , parser = new robots.RobotsParser(null, options);

parser.setUrl('http://nodeguide.ru/robots.txt', function(parser, success) {
  // ...
});
```
RobotsParser is the main class. It provides a set of methods to read, parse, and answer questions about a single robots.txt file.
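The examples above always fetch the file over HTTP, but since the class simply parses and answers questions about a single robots.txt file, it can also be fed rules directly. A minimal sketch, assuming the parser exposes parse(lines) and canFetchSync(userAgent, url) methods (both are assumptions here, not confirmed by the excerpts above):

```js
var robots = require('robots')
  , parser = new robots.RobotsParser();

// Assumption: parse() accepts robots.txt content as an array of lines,
// and canFetchSync() answers synchronously once the rules are parsed.
parser.parse([
  'User-agent: *',
  'Disallow: /private/'
]);

console.log(parser.canFetchSync('*', '/private/page.html')); // expected: false
console.log(parser.canFetchSync('*', '/public/page.html'));  // expected: true
```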
The callback passed to canFetch has the following signature:

```js
function callback(access, url, reason) { ... }
```

where:

- access - whether the url can be fetched (true/false)
- url - the target url
- reason - the reason behind the decision. Object:
  - type - the kind of rule that produced the decision
  - entry - an instance of lib/Entry.js. Only for types: 'entry', 'defaultEntry'
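Putting this together, a minimal sketch of inspecting the reason in the callback (only the three callback arguments documented above are assumed; the reason is logged whole rather than reading individual fields):

```js
var robots = require('robots')
  , parser = new robots.RobotsParser();

parser.setUrl('http://nodeguide.ru/robots.txt', function(parser, success) {
  if (!success) return;
  parser.canFetch('MyBot/1.0', '/doc/', function (access, url, reason) {
    console.log('url:', url, 'access:', access);
    // `reason` is the object described above; its exact fields depend
    // on which rule applied.
    console.log('reason:', reason);
  });
});
```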
License: see the LICENSE file.

No vulnerabilities found.
OpenSSF Scorecard findings:

- no binaries found in the repo
- license file detected
- 0 existing vulnerabilities detected
- Found 11/22 approved changesets -- score normalized to 5
- 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- no effort to earn an OpenSSF best practices badge detected
- security policy file not detected
- project is not fuzzed
- branch protection not enabled on development/release branches
- SAST tool is not run on all commits -- score normalized to 0

Last scanned on 2025-07-07.
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.