How to Add Bots to Your Robots.txt File (Ahrefs Example)


A robots.txt file is a plain-text file placed at the root of a website's server that tells web crawlers and other automated agents which pages or sections of the site they may or may not access. It implements the Robots Exclusion Protocol and is commonly used to manage crawling and indexing by search engines such as Google and Bing, as well as by SEO tools like Ahrefs.
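To see how crawlers interpret these rules, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The rules and the bot name (`ExampleBot`) are hypothetical, chosen only to illustrate allow/disallow matching:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The "*" entry applies to any crawler not named explicitly.
print(parser.can_fetch("ExampleBot", "https://example.com/blog/post"))  # True
print(parser.can_fetch("ExampleBot", "https://example.com/private/x"))  # False
```

Rules are matched against the URL path, so anything under /private/ is blocked for this bot while the rest of the site stays open.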

Adding a Web Robot to the Site

Allowing a web robot to crawl a site means granting it permission to access and index the site's content.

To allow a third-party tool to crawl the website, its crawler must be explicitly allowed in the site's robots.txt file.

To access the robots.txt file:

  • Navigate to the Developer Hub link in the sidebar menu.
  • Click the Robots File option.

To add the Ahrefs crawlers to the allow list, add the following lines to the beginning of the file:

User-agent: AhrefsSiteAudit
Allow: /

User-agent: AhrefsBot
Allow: /

Finally, click "Save Changes". Ahrefs now has permission to crawl the website without being blocked.
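You can sanity-check the result before saving with Python's standard-library `urllib.robotparser`. This sketch assumes, purely for illustration, that the rest of the file disallows unlisted bots with a catch-all `User-agent: *` entry; your actual file may differ:

```python
from urllib.robotparser import RobotFileParser

# The two Ahrefs entries from above, plus an assumed catch-all
# deny rule so the check is meaningful.
rules = """\
User-agent: AhrefsSiteAudit
Allow: /

User-agent: AhrefsBot
Allow: /

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("AhrefsBot", "https://example.com/any-page"))      # True
print(parser.can_fetch("SomeOtherBot", "https://example.com/any-page"))   # False
```

The named Ahrefs entries take precedence over the `*` entry, so both Ahrefs crawlers get through while unlisted bots are denied.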