How to create a correct robots.txt file in GatsbyJS

Use the gatsby-plugin-robots-txt plugin along with our guide to submit a valid robots.txt file.

Written by Oscar de la Hera Gomez
First published on 12/23/2023 at 05:31
Last Updated on 12/23/2023 at 07:52
[Image: Two flowers that represent GatsbyJS and Robots. Beneath them sits the text "Robots.txt."]


[Image: A screenshot of PageSpeed Insights showing an error caused by an invalid robots.txt file.]

The following article walks you through how to create a correct robots.txt file in GatsbyJS using the gatsby-plugin-robots-txt plugin, so that it produces no errors and improves your SEO.

Step One: Add the gatsby-plugin-robots-txt plugin

[Image: A screenshot of GatsbyJS's webpage for gatsby-plugin-robots-txt.]

In Terminal, set the current directory to that of your GatsbyJS project and add the dependency by running the following line:

yarn add gatsby-plugin-robots-txt

The gatsby-plugin-robots-txt plugin will generate a robots.txt file at the root of your website, where crawlers expect to find it.
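For reference, the generated file typically looks like the sketch below, assuming a single allow-all policy; the URLs are placeholders that the plugin fills in from your configuration.

```text
User-agent: *
Allow: /
Sitemap: https://www.example.com/sitemap-index.xml
Host: https://www.example.com
```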

Learn more about gatsby-plugin-robots-txt

Step Two: Configure the plugin

[Image: A screenshot of VSCode showing the configuration that we used for the robots.txt plugin.]

In gatsby-config.js, add a configuration for gatsby-plugin-robots-txt similar to the one below.

Please note that you need to update the sitemap URL to point to your site's sitemap.
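A minimal configuration sketch is shown below. The `resolve`, `sitemap`, `host`, and `policy` options are part of the plugin's documented API; the example.com URLs are placeholders you should replace with your own domain.

```javascript
// gatsby-config.js
module.exports = {
  plugins: [
    {
      resolve: 'gatsby-plugin-robots-txt',
      options: {
        // Replace both URLs with your own domain and sitemap path.
        host: 'https://www.example.com',
        sitemap: 'https://www.example.com/sitemap-index.xml',
        // Allow all crawlers to access the whole site.
        policy: [{ userAgent: '*', allow: '/' }],
      },
    },
  ],
};
```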

How to fix "Sitemap is not accessible" on Ahrefs in GatsbyJS

To learn alternative ways of writing valid robots.txt rules, consult the article below.

How to write and submit a correct robots.txt

For an example of using two user-agents, please consult the snippet below. The same pattern can be repeated to add more user-agents.
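A sketch of a two-user-agent `policy` array is shown below; the user agents and the `/admin` path are illustrative assumptions, and each additional entry in the array follows the same shape.

```javascript
// gatsby-config.js — options for gatsby-plugin-robots-txt
options: {
  sitemap: 'https://www.example.com/sitemap-index.xml',
  policy: [
    // One policy object per user agent.
    { userAgent: 'Googlebot', allow: '/', disallow: ['/admin'] },
    { userAgent: 'Bingbot', allow: '/' },
  ],
},
```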

Step Three: Test

[Image: A screenshot of PageSpeed Insights showing that we resolved the robots.txt issue.]

Build and deploy your website, then run it through PageSpeed Insights to confirm that the robots.txt error no longer appears.

Visit PageSpeed Insights

Looking to learn more about ReactJS, GatsbyJS or SEO?

Search our blog to find educational content on learning SEO as well as how to use ReactJS and GatsbyJS.

Search our Blog

Any Questions?

We are actively looking for feedback on how to improve this resource. Please send us a note to inquiries@delasign.com with any thoughts or feedback you may have.
