Most commonly, larger search engine optimization (SEO) providers depend on regularly scraping search engine results to monitor the competitive position of their customers' websites for relevant keywords or to check their indexing status.
The first layer of defense is a captcha page[4] where the user is prompted to verify they are a real person and not a bot or tool.
This sort of block is likely triggered by an administrator and occurs only when a scraping tool sends a very high volume of requests.
PHP is a commonly used language for writing scraping scripts for websites or backend services, since it has powerful capabilities built in (DOM parsers, libcURL bindings); however, its memory usage is typically about 10 times that of comparable C/C++ code.
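A minimal sketch of this approach might combine PHP's curl extension with DOMDocument and DOMXPath; the endpoint URL and the XPath expression below are illustrative assumptions, since real result markup varies between engines and changes over time.

```php
<?php
// Minimal sketch: fetch a results page with libcURL and parse it with PHP's DOM extension.
// The URL and the XPath query are illustrative assumptions, not a real engine's markup.

$query = urlencode('example search term');
$url   = 'https://www.example-search.com/search?q=' . $query;  // hypothetical endpoint

$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,   // return the body instead of printing it
    CURLOPT_FOLLOWLOCATION => true,   // follow redirects
    CURLOPT_USERAGENT      => 'Mozilla/5.0 (compatible; ExampleScraper/1.0)',
    CURLOPT_TIMEOUT        => 15,
]);
$html = curl_exec($ch);
curl_close($ch);

if ($html === false) {
    exit("Request failed\n");
}

// Parse the HTML; libxml warnings are suppressed because real pages are rarely valid XML.
$dom = new DOMDocument();
libxml_use_internal_errors(true);
$dom->loadHTML($html);
libxml_clear_errors();

// Extract result links -- the class name "result" is an assumption for illustration.
$xpath = new DOMXPath($dom);
foreach ($xpath->query('//a[contains(@class, "result")]') as $link) {
    echo $link->getAttribute('href'), "\n";
}
```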
Additionally, bash scripting can be used together with cURL as a command-line tool to scrape a search engine.
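A sketch of this command-line variant, again against a hypothetical endpoint, might look as follows; the crude link extraction with grep stands in for whatever per-engine parsing a real script would need.

```bash
#!/usr/bin/env bash
# Minimal sketch: fetch a results page with cURL and pull out links.
# The URL is a hypothetical placeholder; result markup differs per engine.

query="example search term"
url="https://www.example-search.com/search?q=$(printf '%s' "$query" | sed 's/ /+/g')"

curl --silent --location \
     --user-agent "Mozilla/5.0 (compatible; ExampleScraper/1.0)" \
     "$url" |
  grep -oE 'href="https?://[^"]+"' |    # crude link extraction for illustration only
  sed 's/^href="//; s/"$//'
```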