What is monsidobot?¶
Monsidobot is the name of the Acquia Optimize web crawler. It is the tool we use to scan websites on behalf of our clients as part of our offering.
This article answers several frequently asked questions about our crawler.
Why does monsidobot make a request even though the link is disallowed in robots.txt?¶
The request is made in order to verify the status of the link. Unless you are a client, we have no interest in the content of your website; we do not save or use anything beyond the response code, which we need to verify the link's status.
I’m not a client. Why am I seeing requests from monsidobot?¶
One or more of our clients link to your website, and we are making requests to determine whether the links on their sites are still valid.
For non-client links, the crawler first makes a HEAD request in order to be as resource-frugal as possible. Unfortunately, not all websites support HEAD, so for a range of non-2XX response codes the crawler tries to determine whether the link is actually broken or the website simply does not support HEAD requests.
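The HEAD-first behaviour described above can be sketched roughly as follows. This is an illustrative sketch only, not the actual crawler code; the exact set of status codes that triggers a GET retry is an assumption, since it is not documented here.

```python
import urllib.error
import urllib.request

# Status codes that often mean "this server does not support HEAD"
# rather than "this link is dead". Hypothetical list for illustration;
# the real crawler's retry conditions are not documented.
HEAD_UNSUPPORTED = {403, 405, 501}

def should_retry_with_get(status: int) -> bool:
    """Decide whether a non-2XX HEAD response warrants a GET retry."""
    return status in HEAD_UNSUPPORTED

def check_link(url: str, timeout: float = 10.0) -> int:
    """Return the response code for a link, preferring HEAD over GET.

    Falls back to GET only when the HEAD response suggests the server
    simply does not implement HEAD.
    """
    for method in ("HEAD", "GET"):
        request = urllib.request.Request(url, method=method)
        try:
            with urllib.request.urlopen(request, timeout=timeout) as response:
                return response.status
        except urllib.error.HTTPError as exc:
            if method == "HEAD" and should_retry_with_get(exc.code):
                continue  # server may not support HEAD; retry with GET
            return exc.code
```

Only the response code is kept, which matches the statement above that the crawler uses nothing else for non-client links.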
Additional resources¶
Monsidobot generally respects the crawl-delay directive in robots.txt as long as the value is between 0 and 60 seconds; any higher value is treated as 60. For any other questions or concerns, please contact us.
IP addresses that we use¶
- USA and Canada 35.226.117.128
- UK and Europe 130.211.65.55
- Australia and New Zealand 35.189.0.46
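The crawl-delay handling described under Additional resources, where values above 60 are capped at 60, can be sketched with Python's standard robots.txt parser. This is a minimal illustration, not the crawler's actual implementation; the function name and the treatment of a missing directive as zero delay are assumptions.

```python
from urllib.robotparser import RobotFileParser

MAX_DELAY = 60  # cap in seconds, matching the behaviour described above

def effective_crawl_delay(robots_txt: str, user_agent: str = "monsidobot") -> float:
    """Clamp a robots.txt crawl-delay into the 0-60 second range."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    delay = parser.crawl_delay(user_agent)
    if delay is None:
        return 0.0  # no directive present: no extra delay (assumption)
    return min(max(float(delay), 0.0), float(MAX_DELAY))
```

For example, a robots.txt containing `Crawl-delay: 120` would yield an effective delay of 60 seconds under this sketch.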
Ports that we use¶
- 443 for HTTPS
- 80 for HTTP
The User Agent¶
The user agent details are: