robotstxt-util
RFC 9309 spec compliant robots.txt builder and parser. 🦾 No dependencies, fully typed.
npm install robotstxt-util
Languages: TypeScript (92.21%), JavaScript (7.79%)
Downloads: Total 0 · Last Day 0 · Last Week 0 · Last Month 0 · Last Year 0
MIT License · 3 Stars · 12 Commits · 1 Fork · 1 Watcher · 2 Branches · 1 Contributor
Updated on Sep 14, 2024
Latest Version: 4.1.0
Package Id: robotstxt-util@4.1.0
Unpacked Size: 39.15 kB
Size: 6.47 kB
File Count: 9
NPM Version: 10.7.0
Node Version: 20.14.0
Published on: Sep 14, 2024
Before using this library, I recommend reading this guide by Google: https://developers.google.com/search/docs/crawling-indexing/robots/intro
Note to self (and contributors): https://www.rfc-editor.org/rfc/rfc9309.html
npm i robotstxt-util
Exports a parser, parseRobotsTxt, and a RobotsTxt class for creating and managing robots.txt data.
import { RobotsTxt } from 'robotstxt-util'

const robotstxt = new RobotsTxt()

const allBots = robotstxt.newGroup('*')
allBots.disallow('/')

const googleBot = robotstxt.newGroup('googlebot')
googleBot.allow('/abc')
googleBot.disallow('/def').disallow('/jkl')

// specify multiple bots
const otherBots = robotstxt.newGroup(['abot', 'bbot', 'cbot'])
otherBots.allow('/qwe')

// specify custom rules
googleBot.addCustomRule('crawl-delay', 10)

// add sitemaps
robotstxt.add('sitemap', 'https://yoursite/sitemap.en.xml')
robotstxt.add('sitemap', 'https://yoursite/sitemap.tr.xml')

// and export
const json = robotstxt.json()
const txt = robotstxt.txt()
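To put the builder to work end to end, here is a minimal, self-contained sketch that writes the serialized output to disk with Node's fs/promises. The file path is illustrative, and the expected file contents shown in the comments are an assumption based on RFC 9309 formatting, not verified library output:

import { writeFile } from 'node:fs/promises'
import { RobotsTxt } from 'robotstxt-util'

// Build a small rule set and persist it (the path is illustrative).
const robots = new RobotsTxt()
robots.newGroup('*').disallow('/private')
robots.add('sitemap', 'https://example.com/sitemap.xml')

await writeFile('public/robots.txt', robots.txt())

// Assumed contents of public/robots.txt (RFC 9309-style formatting):
//
// User-Agent: *
// Disallow: /private
//
// Sitemap: https://example.com/sitemap.xml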
parseRobotsTxt parses raw robots.txt data and returns an instance of RobotsTxt:
import { parseRobotsTxt } from 'robotstxt-util'

const data = `
# hello robots

User-Agent: *
Disallow: *.gif$
Disallow: /example/
Allow: /publications/

User-Agent: foobot
Disallow:/
crawl-delay: 10
Allow:/example/page.html
Allow:/example/allowed.gif

# comments will be stripped out

User-Agent: barbot
User-Agent: bazbot
Disallow: /example/page.html

Sitemap: https://yoursite/sitemap.en.xml
Sitemap: https://yoursite/sitemap.tr.xml
`
const robotstxt = parseRobotsTxt(data)

// update something in some group
robotstxt.findGroup('barbot').allow('/aaa').allow('/bbb')

// store as json or do whatever you want
const json = robotstxt.json()
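A common round trip is to load an existing robots.txt, adjust one group, and write the result back. A minimal sketch, assuming a Node environment; the file path is illustrative:

import { readFile, writeFile } from 'node:fs/promises'
import { parseRobotsTxt } from 'robotstxt-util'

// Read and parse the current file (the path is illustrative).
const source = await readFile('public/robots.txt', 'utf8')
const robotstxt = parseRobotsTxt(source)

// Tighten the rules for one crawler, then re-serialize.
robotstxt.findGroup('foobot').disallow('/tmp')
await writeFile('public/robots.txt', robotstxt.txt())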
If you're interested in contributing, please read CONTRIBUTING.md first.
Thanks for watching 🐬
No vulnerabilities found.
OpenSSF Scorecard findings:
- No binaries found in the repo
- License file detected
- Found 0/12 approved changesets -- score normalized to 0
- 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- No SAST tool detected
- No effort to earn an OpenSSF best practices badge detected
- Security policy file not detected
- Project is not fuzzed
- Branch protection not enabled on development/release branches
- 19 existing vulnerabilities detected
Last Scanned on 2025-07-07
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.