A correct Magento robots.txt is an essential first step in proper Search Engine Optimisation (SEO) for your Magento store. Robots.txt is a file located in your site root that tells search engine bots/robots how to index your site: which pages should be indexed and which should not. Magento is a very complicated system with a complex folder and URL structure, which is why it is not easy to create a correct robots.txt for Magento that keeps every important page while keeping junk pages out of the search engine index. Improved robots.txt is the right way to build correct SEO for your Magento store and push only the right pages of your store into the search engine index!
With Improved robots.txt you get the world's most advanced, optimised and Magento-specific robots.txt for your store! Improved robots.txt distils all our experience in Magento development and SEO, so you can easily get the best basis for your Magento SEO!
We’ve decided to provide our Magento 1 extensions for free, so you can get this module free of charge here:
In a nutshell, /robots.txt is a special file intended for web robots. It contains information and instructions for them. In practice, before a robot visits your welcome page it should first fetch the robots.txt file, read the instructions, and follow them. Some robots ignore the instructions; in most cases these are malware bots. The other important thing about /robots.txt is that it is publicly readable by everyone.
Every /robots.txt file is situated in the root folder of the website. The file tells search engines and bots what they shouldn’t attempt to index.
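For example, a minimal robots.txt might look like this (the domain and path are placeholders):

    User-agent: *
    Disallow: /private/
    Sitemap: https://www.example.com/sitemap.xml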
robots.txt in E-commerce
The /robots.txt file is very important for e-commerce websites. This simple tool guides search engines to the correct content to index, and as a result you get better site traffic. You can also limit the bandwidth consumed by search engine crawlers and keep log files, temporary text files, client downloads and other irrelevant content out of search engine results. The same is true for bad content and bad links.
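For instance, a store could keep crawlers away from such housekeeping areas with rules like these (the directory names are hypothetical):

    User-agent: *
    Disallow: /logs/
    Disallow: /tmp/
    Disallow: /downloads/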
Main robots.txt features
Basic Exclusion
Directory Exclusion
Specific Paths
Restriction to Specific User Agents
Multiple Blocks
Basic Wildcards
Blocking Specific URL Parameters
Whole-Filename Matching
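All of these are plain robots.txt syntax. Here is a sketch showing each of the features above in one file (the user agent, paths and parameter are placeholders; note that wildcards are supported by major crawlers such as Googlebot but are not part of the original robots.txt standard):

    # One block for all robots
    User-agent: *
    # Basic exclusion, matching a whole filename
    Disallow: /private.html
    # Excluding an entire directory
    Disallow: /tmp/
    # Excluding a specific path
    Disallow: /catalog/old-page
    # Basic wildcard, blocking URLs that carry a parameter
    Disallow: /*?sort=

    # A second block restricted to a specific user agent
    User-agent: BadBot
    Disallow: /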
You can write all these instructions yourself, but it is much easier to use the Improved robots.txt extension for your Magento web-store.
Improved robots.txt
A correct robots.txt is a very important basic feature for proper Search Engine Optimisation (SEO) of your Magento e-commerce store. The robots.txt file sits in the site root and gives search engine bots/robots guidelines on how to index the website: which pages to index and which to leave out. Magento is a very complicated system with many folders and URL structures, so it is not easy to create a correct robots.txt for a web-store. You always have to remember not to lose any important pages and, at the same time, not to add junk pages to the search engine index. Improved robots.txt is a simple and reliable tool for building correct SEO for your Magento store and pushing only the right pages of your store to search engines! It packages our experience in Magento development and SEO so that you can easily build on the best basis for your store's SEO, in line with best practices.
Improved robots.txt features
You can load the improved robots.txt and edit it easily from the Magento backend
Search engines do not crawl the Magento admin area at all
No crawling of Magento system files & folders or technical server scripts
No crawling of sub-directories & folders, or of sort & filter URLs
No crawling of duplicate copies of pages
No crawling of checkout, cart and customer account pages
No crawling of non-search-optimised catalog links
Templates for adding the sitemap URL and image crawl options
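To make this concrete, here is a sketch of the kind of rules a Magento 1 robots.txt along these lines typically contains. These are commonly recommended community rules, not necessarily the exact output of this extension, and the sitemap URL is a placeholder:

    User-agent: *
    # Magento admin area and system folders
    Disallow: /admin/
    Disallow: /app/
    Disallow: /downloader/
    Disallow: /includes/
    Disallow: /lib/
    Disallow: /pkginfo/
    Disallow: /var/
    # Technical server scripts
    Disallow: /cron.php
    Disallow: /install.php
    # Checkout, cart and customer account pages
    Disallow: /checkout/
    Disallow: /customer/
    # Duplicate and non-optimised catalog URLs (search, sort and filter parameters)
    Disallow: /catalogsearch/
    Disallow: /*?dir=
    Disallow: /*?limit=
    Disallow: /*?mode=
    # Sitemap location
    Sitemap: https://www.example.com/sitemap.xml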