Description
The robots.txt file is a simple text file placed in your site’s root directory. It uses a set of directives to tell search engine robots which pages on your website they can and cannot crawl.

When activated, this plugin creates a robots.txt file and adds the following rules:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
Allow: /

This tells bots (search engines) NOT to crawl the wp-admin, wp-includes, wp-content/themes, and wp-content/plugins directories.
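The plugin's own implementation is not shown here, and it may simply write a physical robots.txt file to the site root. As a rough sketch of the same idea, a WordPress plugin can also serve rules like these through core's robots_txt filter (the function name vi_robots_txt_rules below is only an illustrative placeholder):

// Minimal sketch, not the plugin's actual code.
// Core's virtual robots.txt already starts with a "User-agent: *" line,
// so this callback only appends extra rules to it.
add_filter( 'robots_txt', 'vi_robots_txt_rules', 10, 2 );

function vi_robots_txt_rules( $output, $public ) {
    // $public is the "blog_public" option; skip the rules when the
    // site owner has asked search engines not to index the site.
    if ( '1' === (string) $public ) {
        $output .= "Disallow: /wp-includes/\n";
        $output .= "Disallow: /wp-content/plugins/\n";
        $output .= "Disallow: /wp-content/themes/\n";
        $output .= "Allow: /\n";
    }
    return $output;
}

Note that the robots_txt filter only takes effect when no physical robots.txt file exists in the site root.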
From the Settings > robots.txt page you can view the file's content and edit it to suit your needs. For example, to block the Bing search engine:

User-agent: Bingbot
Disallow: /

To stop bots from crawling wp-login.php:
User-agent: *
Disallow: /wp-login.php

To stop bots from crawling your search results:
User-agent: *
Disallow: /?s=
Disallow: /search/