builderall

Create Websites with CHEETAH

Thinking about SEO (8)

ROBOTS.TXT
 

Via the text file "robots.txt" (based on the Robots Exclusion Standard) you can control which areas of your website (domain) may be crawled by a web crawler and which may not. This file is always located in the root directory of the corresponding domain and is created automatically along with the website by Cheetah. Whenever a bot visits your website, it first looks at this special text file to see what it says.
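For illustration, here is a minimal sketch of what such a file can contain (the blocked folder is a hypothetical example, not a Cheetah default):

# Allow all crawlers, but keep them out of the hypothetical folder /thank-you/
User-agent: *
Disallow: /thank-you/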
 

The syntax of the robots text file is precisely defined in the Robots Exclusion Standard. If you don't want to deal with it yourself, it is recommended to use an appropriate generator. A robots.txt generator that is well suited for this purpose can be found at the following link:
 

http://pixelfolk.net/tools/robots


Here you only have to enter your web address and the path to the XML file containing the sitemap. The latter can be found in Cheetah in the website view under the menu item "Call sitemap". Clicking on it opens the sitemap in a new browser tab; you then only need to copy its web address from the address bar and paste it via the clipboard into the corresponding input field of the robots.txt generator.
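In the finished robots.txt, the sitemap reference ends up as a single line. Assuming the placeholder domain example.com, it would look like this:

# Placeholder address - use the sitemap URL copied from Cheetah instead
Sitemap: https://example.com/sitemap.xml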
 

Next, consider - preferably with the help of the sitemap - which pages of the website should be excluded from indexing by search engine bots. These can be, for example, pages used to collect e-mail addresses, thank-you pages, or other unimportant pages that you do not want to appear in a search engine index.

From these pages, note down the "folder name" (the last section of the URL after the last "/") and enter it in the list "Do not index folders and pages", where each folder name (as shown in the defaults) must be terminated with a "/".
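In the generated file, each entry in that list becomes one Disallow line. With hypothetical folder names such as "thank-you" and "optin", the result would look like this:

User-agent: *
# Hypothetical folder names, each terminated with a "/" as the generator requires
Disallow: /thank-you/
Disallow: /optin/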
 

Note: Please always include the "home" page in this list as well. In the sitemap that Builderall generates from your website, the root page is always missing; instead, the page ".../home" is entered as the homepage. Since both addresses naturally lead to the same content, the crawler detects "duplicate content" here, which has a negative effect on the evaluation of the website. If, on the other hand, you explicitly exclude the ".../home" page from crawling, this no longer happens.
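Assuming the duplicate page really is reachable under ".../home", the exclusion follows the same pattern as above (with a trailing "/" as per the generator's convention):

# Exclude the duplicate "home" page from crawling
Disallow: /home/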
 

Furthermore, you can block various bot programs (called "spiders" in technical jargon) directly. Here it is recommended to simply keep the generator's default settings.
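For reference, blocking an individual spider works by giving it its own User-agent block. Assuming a crawler that identifies itself as "BadBot" (a made-up name), the entry would be:

# Lock the hypothetical crawler "BadBot" out of the entire site
User-agent: BadBot
Disallow: /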
 

Once all settings are complete, click the "Create Robots.txt" button and the file's contents will be displayed on a new page in the browser. You can now select the text with the mouse and copy it to the clipboard.
 

Now switch back to the corresponding website view in Cheetah and use the menu on the left to go to the "SEO settings" section. There, paste the clipboard contents into the "Robots" input area and save the settings. Then republish the website. And that's all.
 

Note: If you are familiar with the corresponding command syntax, you can of course enter the instructions here manually. A compilation of the available commands can be found, for example, in the Wikipedia article on the "Robots Exclusion Standard".
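Putting the pieces from this article together, a hand-written robots.txt for a Cheetah site could look like the following sketch (domain, folder and bot names are all placeholders):

# All crawlers: skip the duplicate home page and the hypothetical thank-you page
User-agent: *
Disallow: /home/
Disallow: /thank-you/

# Lock out the hypothetical crawler "BadBot" completely
User-agent: BadBot
Disallow: /

# Placeholder sitemap address - use the URL from Cheetah's "Call sitemap"
Sitemap: https://example.com/sitemap.xml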
