To guide bots, update the robots.txt file on your site. Ethical web crawlers follow the directives in robots.txt, so you can use the file to block parts of your site, such as the search page, from being crawled. Acquia Search powered by SearchStax cannot prevent unethical crawling, and queries issued by unethical crawlers still count toward your search query usage. Bot detection helps mitigate Distributed Denial-of-Service (DDoS) attacks, but it cannot detect a simple crawler that navigates through a site.
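For example, a minimal robots.txt entry that asks compliant crawlers to skip the search page might look like the following sketch. The /search path is an assumption; replace it with the actual path of your site's search page.

    User-agent: *
    # Ask compliant crawlers not to crawl the search page (path assumed; adjust for your site)
    Disallow: /search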