A technical workshop, attended by the participants and invited experts, was held in London to discuss the use cases and agree on next steps.
ACAP rules can be considered an extension of the Robots Exclusion Standard (or "robots.txt") for communicating website access information to automated web crawlers.
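For illustration, an ACAP-aware robots.txt might look like the following minimal sketch, in which ACAP-prefixed fields sit alongside conventional robots.txt directives. The paths and the layout are hypothetical, and the exact field names shown here follow the style of the ACAP 1.0 pilot documentation rather than a definitive specification:

    # ACAP version=1.0
    # Conventional robots.txt directives, read by all crawlers
    User-agent: *
    Disallow: /private/

    # ACAP extensions (illustrative; field names per the ACAP 1.0 pilot)
    ACAP-crawler: *
    ACAP-disallow-crawl: /private/
    ACAP-allow-crawl: /news/

Crawlers that do not implement ACAP would treat the ACAP-prefixed lines as unknown fields and fall back to the standard directives, which is what allows the two vocabularies to coexist in one file.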
It has been suggested[8] that ACAP is unnecessary, since the robots.txt protocol already exists for managing search engine access to websites.
As an early priority, ACAP is intended to provide a practical and consensual solution to some of the rights-related issues that have in some cases led to litigation[11][12] between publishers and search engines.
Only one, Exalead, ever confirmed that it would adopt the standard,[15] but it has since ceased operating as a search portal to focus on the software side of its business.