Writing the robots.txt File for Website Optimization

How the robots file is written

The first line of a robots file names which spider the rules apply to; an asterisk means all of them:

User-agent: *

From the file name suffix, .txt, you can tell that this is a plain text file, the kind you can edit in Notepad. Anyone who knows a little English will recognize that "robots" means robot, and the robot here is the search engine spider that crawls our site on the engine's behalf; from the name alone you can guess that this file is written specifically for spiders. Its job is to tell the spider which sections or pages do not need to be crawled, and it can even block a spider from the site outright. Note that the file must be placed in the root directory of the website, so that the spider can read its contents the moment it arrives.
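As a minimal sketch of what such a file looks like in the site root (the /admin/ directory is purely a hypothetical example), a robots.txt that tells every spider to skip one section could read:

```
User-agent: *
Disallow: /admin/
```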

The most common use of the robots file



In fact, the place we use the robots file most often is to block a site's dead links. A dead link hurts more than just the site's weight. Cleaning up dead links is not especially difficult, but it does take a lot of time, particularly when the site has many of them and they cannot all be removed quickly. This is where the robots file proves its worth: we can write those dead links into the file, in the proper format, so that spiders stop crawling them, and then clean them up later or clean them up gradually. If a site contains content or URL files that the webmaster does not want spiders to crawl, those can be blocked directly as well. Blocking a spider from the whole site is also possible, though that is rarely used.
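For example, assuming two hypothetical dead URLs on the site, they could be written into the file like this so that spiders stop crawling them while the actual cleanup happens at its own pace:

```
User-agent: *
Disallow: /old-article.html
Disallow: /removed-category/
```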


User-agent: *
Disallow:

These two lines together mean that everything is allowed: the asterisk matches every spider, and the empty Disallow blocks nothing.

Getting the syntax right matters a great deal. If a rule is written wrong, the page you meant to block is not blocked, while a page you wanted crawled ends up blocked, and if the mistake is not caught in time the loss can be large. First, we need to know the two directives, Allow and Disallow: one permits crawling, the other forbids it, and their roles are easy to tell apart from the names.
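A small sketch of the two directives working together (the paths are invented for illustration): block a whole directory from all spiders while still allowing one file inside it:

```
User-agent: *
Disallow: /private/
Allow: /private/public-page.html
```

Note that Allow was not part of the original robots exclusion standard, but the major spiders such as Googlebot honor it, giving the more specific rule priority.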

Most friends have probably heard of the robots.txt file, and some of you may have written one. Actually, I have not written a robots.txt file myself so far, not because I can't, but because I feel there is nothing on my blog that spiders need to be stopped from crawling. And the probability of dead links appearing on a personal independent blog is very small, so I don't think I need to do much dead-link handling. Even so, writing a robots.txt file is one of the skills every individual webmaster must master, and its uses are very broad. I have described it in detail here partly as a review for myself.