Can anyone explain what the robots.txt file of a blog website actually does?
It's simply a file that tells search-engine bots which parts of your site they may crawl and which they should skip.
For example, if I don't want Google to crawl a category like /users, I can simply add a rule for it in my robots.txt.
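A minimal sketch of that rule might look like this (assuming the category is served under /users/):

```
User-agent: Googlebot
Disallow: /users/
```

Note that Disallow only stops crawling; if the pages are already indexed or linked from elsewhere, a noindex robots meta tag is the more reliable way to keep them out of search results.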
The robots.txt file sets crawling instructions for respectful (well-behaved) bots such as Googlebot. It should be placed in the site's public root directory (e.g. public_html).
For example, in WordPress we can have …
```
Sitemap: https://www.example.com/sitemap_index.xml

User-agent: *
Disallow: /?s=*
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-admin/images/
Allow: /wp-admin/css/
Allow: /wp-admin/js/
```
Line-by-line explanation:
- Declares the sitemap location
- Applies the following rules to all user-agents
- Blocks internal search-result URLs (/?s=*)
- Blocks the administrative path
- Allows admin-ajax.php, which themes and plugins need for AJAX requests
- Allows the wp-admin images path (needed for embedded content)
- Allows wp-admin CSS
- Allows wp-admin JS
[These rules follow Google's mobile-friendly guidelines, which say not to block CSS and JS paths.]
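If you want a quick local sanity check of rules like these, Python's standard library ships a simple parser. Below is a minimal sketch, with two caveats: `urllib.robotparser` applies rules in source order (not Google's most-specific-match), so the Allow line is placed before the Disallow here, and it does not understand `*` wildcards inside paths, so the search-results rule is left out:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules modelled on the example above; Allow is listed
# first because this parser applies the first matching rule.
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

base = "https://www.example.com"
print(rp.can_fetch("Googlebot", base + "/wp-admin/"))                # blocked
print(rp.can_fetch("Googlebot", base + "/wp-admin/admin-ajax.php"))  # allowed
print(rp.can_fetch("Googlebot", base + "/hello-world/"))             # allowed
```

For production checks against Google's actual matching semantics, Search Console's robots.txt report is the authoritative tool.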
Apart from this, if you have any specific related questions, I'd be happy to answer.
Should I press Enter to put each directive on its own line?
Also, my admin dashboard link is different from /wp-admin.
Should I keep “Disallow: /wp-admin/” or change it?
- Yes, each rule should be written on its own line
- No, there's no need to add your changed admin path; you can keep the default /wp-admin/ rule