Gathering detailed insights and metrics for robots-txt-component
Fully native robots.txt parsing component without any dependencies.
npm install robots-txt-component
JavaScript (100%)
MIT License · 16 Commits · 1 Watcher · 1 Branch · 1 Contributor
Updated on Oct 02, 2022
Latest Version: 0.0.5-dev
Package Id: robots-txt-component@0.0.5-dev
Unpacked Size: 17.13 kB
Size: 4.80 kB
File Count: 9
NPM Version: 6.14.15
Node Version: 14.18.3
Lightweight robots.txt parsing component for Node.js, without any external dependencies.
Via NPM:

```shell
npm install robots-txt-component --save
```
Before using the parser, you need to initialize it like below:
```javascript
const RobotsParser = require('robots-txt-component');
...

let robots = new RobotsParser('https://www.example.com', true) // allowOnNeutral = true
await robots.init() // Will attempt to retrieve the robots.txt file natively and parse it.
```
```javascript
let robots = new RobotsParser('https://www.example.com', true) // allowOnNeutral = true
await robots.init() // Will attempt to retrieve the robots.txt file natively and parse it.

let userAgent = '*'
if (robots.canCrawl(url, userAgent)) { // Will check the url against all the rules found in the robots.txt file
  // Your logic
}

// Get the crawl delay for a user agent
let crawlDelay = robots.getCrawlDelay('Bingbot')

// Get raw robots.txt content
let content = robots.getRawContent()
```
`constructor(url, allowOnNeutral = true, rawRobotsTxt = null)`

- `url`: the domain whose robots.txt file you want to use.
- `allowOnNeutral`: if the same number of allow and disallow rules match a url, should it be allowed or disallowed?
- `rawRobotsTxt`: if you have already retrieved the raw robots.txt content, provide it here.
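The `allowOnNeutral` flag only matters when the matching allow and disallow rules tie for a url. The sketch below illustrates that tie-break in self-contained JavaScript; `canCrawlSketch` and the rule objects are made up for this example and are not the package's actual implementation:

```javascript
// Illustrative tie-break sketch (NOT the package's real code): count the
// allow and disallow rules whose path prefix matches, and fall back to
// allowOnNeutral when the counts are equal.
function canCrawlSketch(path, rules, allowOnNeutral) {
  const matches = rules.filter((r) => path.startsWith(r.path));
  const allows = matches.filter((r) => r.type === 'allow').length;
  const disallows = matches.filter((r) => r.type === 'disallow').length;
  if (allows === disallows) return allowOnNeutral; // neutral: tie (including zero matches)
  return allows > disallows;
}

const rules = [
  { type: 'allow', path: '/public' },
  { type: 'disallow', path: '/public/archive' },
];

console.log(canCrawlSketch('/public/archive/2020', rules, true));  // one allow, one disallow -> neutral -> true
console.log(canCrawlSketch('/private', rules, false));             // no matches -> neutral -> false
```

Note that with `allowOnNeutral = false` the same tied url would be blocked, which is the stricter choice for polite crawlers.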
`async init()` → `void`

Must be called and awaited before performing any other action. This method attempts to retrieve the robots.txt file from the provided url.
`getRawContent()` → `string | null`

Returns the raw robots.txt file content.
`canCrawl(url, userAgent)` → `boolean`

Checks the rules for the url path and the user agent in the robots.txt file and returns the access policy.
`getCrawlDelay(userAgent)` → `integer`

Checks whether a `crawl-delay` is defined for the user agent and returns it if defined; otherwise returns `0`.
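To make the `0` default concrete, here is a self-contained sketch of how a `crawl-delay` value can be read out of raw robots.txt text. `crawlDelayFromRaw` is a hypothetical helper written for this illustration, not part of the package, and it deliberately matches the user agent by exact name only:

```javascript
// Hypothetical helper (not part of robots-txt-component): scan raw robots.txt
// text for a Crawl-delay directive inside the group for the given user agent;
// return 0 when none is defined, mirroring getCrawlDelay's contract.
function crawlDelayFromRaw(rawRobotsTxt, userAgent) {
  let currentAgents = []; // user agents named by the group being scanned
  for (const line of rawRobotsTxt.split('\n')) {
    const [key, ...rest] = line.split(':');
    if (!rest.length) continue; // blank line or line without a directive
    const field = key.trim().toLowerCase();
    const value = rest.join(':').trim();
    if (field === 'user-agent') {
      currentAgents.push(value.toLowerCase());
    } else if (field === 'crawl-delay') {
      if (currentAgents.includes(userAgent.toLowerCase())) {
        return parseInt(value, 10) || 0;
      }
      currentAgents = [];
    } else {
      currentAgents = []; // any other directive ends the user-agent run
    }
  }
  return 0;
}

const raw = 'User-agent: Bingbot\nCrawl-delay: 10\n\nUser-agent: *\nDisallow: /private';
console.log(crawlDelayFromRaw(raw, 'Bingbot'));   // 10
console.log(crawlDelayFromRaw(raw, 'Googlebot')); // 0 (no delay defined for this agent)
```

A production parser would also honor the `*` wildcard group; this sketch keeps only the exact-match case to show where the `0` fallback comes from.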
MIT License
Copyright (c) 2020 Ibragim Abubakarov
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
No vulnerabilities found.
OpenSSF Scorecard findings:

- no binaries found in the repo
- 0 existing vulnerabilities detected
- license file detected
- Found 0/15 approved changesets -- score normalized to 0
- 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- no effort to earn an OpenSSF best practices badge detected
- security policy file not detected
- project is not fuzzed
- branch protection not enabled on development/release branches
- SAST tool is not run on all commits -- score normalized to 0

Last scanned on 2025-07-07
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.