A correct robots.txt file is an essential basic element of Search Engine Optimisation (SEO) for your Magento e-commerce store. The robots.txt file sits in the site root and tells search engine bots/robots how to index the website: which pages to index and which not to index. Magento is a complex system with many folders and URL structures, so creating a correct robots.txt for a web store is not easy. You always have to make sure you do not lose any important pages and, at the same time, do not let junk pages into the search engine index. Improved robots.txt is the simplest and most reliable tool for building correct SEO for your Magento store and pushing only the right pages of your store to search engines!
Improved robots.txt features
- You can load Improved robots.txt and edit it easily from the Magento backend
- Search engines do not crawl the Magento admin area with Improved robots.txt
- The same is true for:
- Magento system files & folders, technical server scripts
- subdirectories & folders, sort & filter URLs
- duplicate copies of pages
- checkout, cart and customer account pages
- search optimised catalog links
- You also get templates for adding your sitemap URL and image-crawl options
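As an illustration, a robots.txt covering the areas listed above might look like the following sketch. The directory names reflect a typical Magento installation, but the exact paths, query parameters and sitemap URL are assumptions; adjust them to your store before use:

```
# Block the Magento admin area, system folders and technical scripts
User-agent: *
Disallow: /admin/
Disallow: /app/
Disallow: /lib/
Disallow: /var/
Disallow: /downloader/
Disallow: /pkginfo/

# Block checkout, cart and customer account pages
Disallow: /checkout/
Disallow: /customer/
Disallow: /wishlist/

# Block sort/filter URL parameters that create duplicate pages
Disallow: /*?dir=
Disallow: /*?limit=
Disallow: /*?mode=

# Point crawlers to the XML sitemap (example URL)
Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt rules are directives, not access control: well-behaved crawlers honour them, but sensitive pages should still be protected by other means.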
With Improved robots.txt you get the world's most advanced, optimised, Magento-specific robots.txt for your store! Improved robots.txt distils all our experience in Magento development and SEO, so you can easily get the best basis for your Magento SEO!