Googlebot is Google's web crawler. It is software that collects details from web pages and indexes them into Google's search engine. Sometimes called a spider, it discovers new and updated pages to be added to the Google index.
- It is a robot that finds and fetches web pages from a website.
- If you want to restrict what information on your site is available to Googlebot, you can do so with a robots.txt file using allow and disallow rules.
- Googlebot only follows href links, which indicate the URLs to crawl.
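As a sketch of the robots.txt approach mentioned above, a file placed at the root of the site might look like this (the paths here are hypothetical examples, not rules from the original post):

```
# Hypothetical paths for illustration; adjust to your site's structure.
User-agent: Googlebot
Disallow: /private/
Allow: /public/

# Rules for all other crawlers: allow everything.
User-agent: *
Disallow:
```

Googlebot reads this file before crawling and skips any URL matched by a Disallow rule for its user agent.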