If I use Acquia Search powered by SearchStax, and experience a significant amount of bot traffic crawling through search, does it impact my tier or number of queries?
Yes, it can. To guide bots, update the robots.txt file on your site. Ethical web crawlers follow the instructions in this file, which you can use to block parts of your site, such as the search page, from being crawled. Acquia Search powered by SearchStax cannot prevent unethical site crawling, and queries from such crawlers count toward your search query usage. While bot detection helps guard against Distributed Denial-of-Service (DDoS) attacks, it cannot detect a simple crawler navigating the site.
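For example, a robots.txt rule like the following asks compliant crawlers to skip the search page. The `/search` path is an assumption for illustration; substitute the actual path of your site's search page:

```text
# Ask well-behaved crawlers not to crawl the search page
# (assumes the search page is served at /search)
User-agent: *
Disallow: /search
```

Remember that only ethical crawlers honor these rules; malicious or poorly written bots may ignore them.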
To further prevent search engines from indexing your search pages, add your search page parameters and URLs to your robots.txt exclusion file, or include a noindex robots meta tag on your search page. These directives instruct search engines not to index search result pages that match the URL patterns or contain the inline tag.
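As a sketch, the noindex robots meta tag is placed in the `<head>` of the search page template:

```html
<!-- Added to the <head> of the search results page template.
     Tells compliant search engines not to index this page. -->
<meta name="robots" content="noindex">
```

Note that for a crawler to see this tag, the page must not also be blocked in robots.txt, since a blocked page is never fetched and the tag is never read.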