gatsby-plugin-robots-txt
Gatsby plugin that automatically creates robots.txt for your site
npm install gatsby-plugin-robots-txt
106 stars · 405 commits · 27 forks · 4 watching · 11 branches · 18 contributors · updated on 19 Mar 2024 · JavaScript (100%)
Downloads

Period | Downloads | Change vs. previous period |
---|---|---|
Last day | 14,609 | -9.3% |
Last week | 83,225 | 3.2% |
Last month | 340,789 | 5.2% |
Last year | 4,013,345 | -13.5% |
Create robots.txt for your Gatsby site on build.
yarn add gatsby-plugin-robots-txt
or
npm install --save gatsby-plugin-robots-txt
gatsby-config.js
```js
module.exports = {
  siteMetadata: {
    siteUrl: 'https://www.example.com'
  },
  plugins: ['gatsby-plugin-robots-txt']
};
```
This plugin uses generate-robotstxt to generate the content of robots.txt, and it has the following options:
Name | Type | Default | Description |
---|---|---|---|
host | String | ${siteMetadata.siteUrl} | Host of your site |
sitemap | String / String[] | ${siteMetadata.siteUrl}/sitemap-index.xml | Path(s) to sitemap.xml |
policy | Policy[] | [] | List of Policy rules |
configFile | String | undefined | Path to external config file |
output | String | /robots.txt | Path where the robots.txt is created |
gatsby-config.js
```js
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        host: 'https://www.example.com',
        sitemap: 'https://www.example.com/sitemap.xml',
        policy: [{userAgent: '*', allow: '/'}]
      }
    }
  ]
};
```
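With these options, the emitted robots.txt should look roughly like this (a sketch of the expected output; exact ordering and formatting are determined by generate-robotstxt):

```
User-agent: *
Allow: /
Sitemap: https://www.example.com/sitemap.xml
Host: https://www.example.com
```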
env-option

gatsby-config.js
```js
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        host: 'https://www.example.com',
        sitemap: 'https://www.example.com/sitemap.xml',
        env: {
          development: {
            policy: [{userAgent: '*', disallow: ['/']}]
          },
          production: {
            policy: [{userAgent: '*', allow: '/'}]
          }
        }
      }
    }
  ]
};
```
The env key will be taken from process.env.GATSBY_ACTIVE_ENV first (see Gatsby Environment Variables for more information on this variable), falling back to process.env.NODE_ENV. When neither is available, it defaults to development.
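A minimal sketch of that resolution order, based only on the paragraph above (illustrative, not the plugin's actual source):

```js
// GATSBY_ACTIVE_ENV wins, then NODE_ENV, then the 'development' fallback.
const activeEnv =
  process.env.GATSBY_ACTIVE_ENV || process.env.NODE_ENV || 'development';
```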
You can resolve the env key by using the resolveEnv function:
gatsby-config.js
```js
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        host: 'https://www.example.com',
        sitemap: 'https://www.example.com/sitemap.xml',
        resolveEnv: () => process.env.GATSBY_ENV,
        env: {
          development: {
            policy: [{userAgent: '*', disallow: ['/']}]
          },
          production: {
            policy: [{userAgent: '*', allow: '/'}]
          }
        }
      }
    }
  ]
};
```
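Conceptually, resolveEnv replaces the default lookup and its return value selects which block of the env map is applied. The helper below is a hypothetical sketch of that behavior (pickEnvOptions and its merge logic are this example's own invention, not the plugin's source):

```js
// Pick the per-environment options block. resolveEnv (if provided) names the
// key in options.env; otherwise fall back to the default resolution order.
// The selected block is merged over the base options.
function pickEnvOptions(options) {
  const key = options.resolveEnv
    ? options.resolveEnv()
    : process.env.GATSBY_ACTIVE_ENV || process.env.NODE_ENV || 'development';
  const { env = {}, ...base } = options;
  return { ...base, ...(env[key] || {}) };
}
```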
configFile-option

You can use the configFile option to set specific external configuration:
gatsby-config.js
```js
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        configFile: 'robots-txt.config.js'
      }
    }
  ]
};
```
robots-txt.config.js
```js
module.exports = {
  host: 'https://www.example.com',
  sitemap: 'https://www.example.com/sitemap.xml',
  policy: [{userAgent: '*'}]
};
```
If you would like to disable crawlers for Netlify deploy previews, you can use the following snippet:
gatsby-config.js
```js
const {
  NODE_ENV,
  URL: NETLIFY_SITE_URL = 'https://www.example.com',
  DEPLOY_PRIME_URL: NETLIFY_DEPLOY_URL = NETLIFY_SITE_URL,
  CONTEXT: NETLIFY_ENV = NODE_ENV
} = process.env;
const isNetlifyProduction = NETLIFY_ENV === 'production';
const siteUrl = isNetlifyProduction ? NETLIFY_SITE_URL : NETLIFY_DEPLOY_URL;

module.exports = {
  siteMetadata: {
    siteUrl
  },
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        resolveEnv: () => NETLIFY_ENV,
        env: {
          production: {
            policy: [{userAgent: '*'}]
          },
          'branch-deploy': {
            policy: [{userAgent: '*', disallow: ['/']}],
            sitemap: null,
            host: null
          },
          'deploy-preview': {
            policy: [{userAgent: '*', disallow: ['/']}],
            sitemap: null,
            host: null
          }
        }
      }
    }
  ]
};
```
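On branch deploys and deploy previews, this configuration should emit a crawl-blocking file with no Sitemap or Host lines, roughly (again, a sketch of the expected output):

```
User-agent: *
Disallow: /
```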
query-option

By default the site URL will come from the Gatsby node site.siteMetadata.siteUrl. As in Gatsby's sitemap plugin, an optional GraphQL query can be used to provide a different value from another data source, as long as it returns the same shape:
gatsby-config.js
```js
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        query: `{
          site: MyCustomDataSource {
            siteMetadata {
              siteUrl
            }
          }
        }`
      }
    }
  ]
};
```
No vulnerabilities found.
Scorecard checks:

- no binaries found in the repo
- no dangerous workflow patterns detected
- license file detected
- Found 2/4 approved changesets -- score normalized to 5
- detected GitHub workflow tokens with excessive permissions
- 0 commit(s) and 0 issue activity found in the last 90 days -- score normalized to 0
- dependency not pinned by hash detected -- score normalized to 0
- no effort to earn an OpenSSF best practices badge detected
- security policy file not detected
- project is not fuzzed
- SAST tool is not run on all commits -- score normalized to 0
- 11 existing vulnerabilities detected
Last Scanned on 2024-11-18
The Open Source Security Foundation is a cross-industry collaboration to improve the security of open source software (OSS). The Scorecard provides security health metrics for open source projects.