Robots.txt Tips For Dealing With Bots

Posted by aweber on Jan 13, 2010 8:33 AM EST
BeginLinux.com/blog; By Mike Weber

A few tips I put together while re-creating the robots.txt file on my Linux web server. The robots.txt file provides crawling instructions to web robots using the Robots Exclusion Protocol. When a web robot visits your site, it checks this file to discover which directories or pages you want excluded from its crawl and, by extension, from search engine listings. Getting this file right matters for SEO: it steers what the search engines index, which in turn can affect your rankings. A short example follows.
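As a minimal sketch of the format (the paths and the crawler name below are hypothetical examples, not taken from the article), a robots.txt served from the web root might look like this:

    # Let all robots crawl the site, but keep them out of
    # two directories (example paths)
    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/

    # Turn away one specific crawler entirely (hypothetical name)
    User-agent: BadBot
    Disallow: /

The file must live at the top level of the site, e.g. http://example.com/robots.txt. Keep in mind the protocol is advisory: well-behaved crawlers honor these rules, but nothing forces a misbehaving bot to.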
