express-robots-txt
npm install express-robots-txt
Repository stats: 15 stars, 97 commits, 5 forks, 3 watching, 4 branches, 7 contributors. Last updated on 14 Sept 2024. Language: JavaScript (100%).
Downloads:

Period       Downloads   Change vs previous period
Last day     3,583       -4.7%
Last week    21,607      +7.7%
Last month   89,497      -0.4%
Last year    1,006,416   +51.3%
Express middleware for generating a robots.txt or responding with an existing file.
// es6 usage requires a version of node which supports esm_conditional_exports, see
// https://nodejs.org/api/esm.html#esm_conditional_exports
import express from 'express'
import robots from 'express-robots-txt'

// commonjs (use one import style or the other, not both in the same file)
const express = require('express')
const robots = require('express-robots-txt')

const app = express()
const port = 3000

app.use(robots({
  UserAgent: '*',
  Disallow: '/',
  CrawlDelay: '5',
  Sitemap: 'https://nowhere.com/sitemap.xml',
}))

app.listen(port)
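As a sketch of a common pattern (assuming a NODE_ENV environment variable, which is unrelated to this package), you could serve a restrictive file everywhere except production:

const express = require('express')
const robots = require('express-robots-txt')

const app = express()
const isProd = process.env.NODE_ENV === 'production'

app.use(robots(isProd
  ? { UserAgent: '*', Disallow: '/admin' }  // hypothetical private path
  : { UserAgent: '*', Disallow: '/' }))     // keep crawlers off non-production

app.listen(3000)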
To serve an existing robots.txt file, pass its path instead of a config object:

app.use(robots(__dirname + '/robots.txt'));
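As a quick sanity check (a sketch assuming the app above is listening on port 3000), you can fetch the served file with Node's built-in http module:

const http = require('http')

http.get('http://localhost:3000/robots.txt', (res) => {
  let body = ''
  res.on('data', (chunk) => { body += chunk })
  res.on('end', () => console.log(body)) // prints the served robots.txt content
})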
Alternatively, generate one from a config object:

app.use(robots({
  UserAgent: '*',
  Disallow: '/'
}))
Which produces:

User-agent: *
Disallow: /
You can optionally pass a CrawlDelay, just like Disallow:
app.use(robots({
  UserAgent: '*',
  Disallow: '/',
  CrawlDelay: '5'
}))
Which produces:

User-agent: *
Disallow: /
Crawl-delay: 5
You can optionally pass a Sitemap, just like CrawlDelay:
app.use(robots({
  UserAgent: '*',
  Disallow: '/',
  CrawlDelay: '5',
  Sitemap: 'https://nowhere.com/sitemap.xml'
}))
Which produces:

User-agent: *
Disallow: /
Crawl-delay: 5
Sitemap: https://nowhere.com/sitemap.xml
You can also pass in an array of Sitemaps:
app.use(robots({
  UserAgent: '*',
  Disallow: '/',
  CrawlDelay: '5',
  Sitemap: [
    'https://nowhere.com/sitemap.xml',
    'https://nowhere.com/sitemap2.xml'
  ]
}))
Which produces:

User-agent: *
Disallow: /
Crawl-delay: 5
Sitemap: https://nowhere.com/sitemap.xml
Sitemap: https://nowhere.com/sitemap2.xml
You can also pass an array of configs to target multiple user agents:

app.use(robots([
  {
    UserAgent: 'Googlebot',
    Disallow: '/no-google'
  },
  {
    UserAgent: 'Bingbot',
    Disallow: '/no-bing'
  }
]));
Which produces:

User-agent: Googlebot
Disallow: /no-google
User-agent: Bingbot
Disallow: /no-bing
UserAgent and Disallow each accept arrays as well:

app.use(robots([
  {
    UserAgent: [ 'Googlebot', 'Slurp' ],
    Disallow: [ '/no-google', '/no-yahoo' ]
  },
  {
    UserAgent: '*',
    Disallow: [ '/no-bots', '/still-no-bots' ]
  }
]));
Which produces:

User-agent: Googlebot
User-agent: Slurp
Disallow: /no-google
Disallow: /no-yahoo
User-agent: *
Disallow: /no-bots
Disallow: /still-no-bots
You can optionally pass a Host, just like Disallow. Note that some crawlers may not support this directive:
app.use(robots({
  UserAgent: '*',
  Disallow: '/',
  CrawlDelay: '5',
  Host: 'foobar.com'
}))
Which produces:

User-agent: *
Disallow: /
Crawl-delay: 5
Host: foobar.com
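A closing sketch that combines the options shown above (the domain and paths here are placeholders, and it assumes Disallow accepts an array in the single-object form as it does in the array form):

const express = require('express')
const robots = require('express-robots-txt')

const app = express()

app.use(robots({
  UserAgent: '*',
  Disallow: ['/admin', '/drafts'],       // placeholder private paths
  CrawlDelay: '5',
  Sitemap: [
    'https://example.com/sitemap.xml',   // placeholder URLs
    'https://example.com/sitemap-news.xml',
  ],
  Host: 'example.com',                   // placeholder host
}))

app.listen(3000)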
Originally forked from weo-edu/express-robots.
No vulnerabilities found.
OpenSSF Scorecard findings (last scanned on 2024-11-18):

- no binaries found in the repo
- no dangerous workflow patterns detected
- Found 0/8 approved changesets -- score normalized to 0
- 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- detected GitHub workflow tokens with excessive permissions
- no effort to earn an OpenSSF best practices badge detected
- dependency not pinned by hash detected -- score normalized to 0
- license file not detected
- security policy file not detected
- project is not fuzzed
- SAST tool is not run on all commits -- score normalized to 0
- 17 existing vulnerabilities detected
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.