[5][6][7] Anthropic has developed a family of large language models (LLMs) named Claude as a competitor to OpenAI's ChatGPT and Google's Gemini.
[16][3] In the summer of 2022, Anthropic finished training the first version of Claude but did not release it, citing the need for further internal safety testing and a desire to avoid initiating a potentially hazardous race to develop increasingly powerful AI systems.
[14] Anthropic incorporated itself as a Delaware public-benefit corporation (PBC), which requires the company to maintain a balance between private and public interests.
[53] In November 2024, Palantir announced a partnership with Anthropic and Amazon Web Services to provide U.S. intelligence and defense agencies access to Claude 3 and 3.5.
[47] For example, one rule from the UN Declaration applied in Claude 2's CAI states: "Please choose the response that most supports and encourages freedom, equality and a sense of brotherhood."
Using a compute-intensive technique called "dictionary learning", Anthropic was able to identify millions of features in Claude, including, for example, one associated with the Golden Gate Bridge.
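In broad terms, dictionary learning represents each high-dimensional activation vector as a sparse combination of learned basis directions ("features"). The following minimal sketch uses scikit-learn's MiniBatchDictionaryLearning on synthetic data to illustrate the general technique only; the library, dimensions, and data are assumptions and do not reflect Anthropic's actual implementation.

```python
# Illustrative sketch of sparse dictionary learning over synthetic
# "activation" vectors; not Anthropic's method, data, or scale.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)
# Hypothetical activation vectors (1000 samples, 64 dimensions).
activations = rng.standard_normal((1000, 64))

# Learn an overcomplete dictionary: more atoms (256) than input
# dimensions (64), with a sparsity penalty on the codes.
learner = MiniBatchDictionaryLearning(
    n_components=256,
    alpha=1.0,
    batch_size=64,
    random_state=0,
)
codes = learner.fit_transform(activations)

print(codes.shape)                # (1000, 256): sparse coefficients per sample
print(learner.components_.shape)  # (256, 64): learned dictionary atoms ("features")
```

In interpretability work of this kind, each learned atom is treated as a candidate feature, and the inputs whose codes most strongly activate it are inspected to assign it a meaning (as with the Golden Gate Bridge feature reported above).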
Of the partnership with Palantir and Amazon Web Services to provide Claude to U.S. intelligence and defense agencies,[62] Anthropic's CEO Dario Amodei said about working with the U.S. military: "The position that we should never use AI in defense and intelligence settings doesn’t make sense to me."[63]
On October 18, 2023, Anthropic was sued by Concord, Universal, ABKCO, and other music publishers for, per the complaint, "systematic and widespread infringement of their copyrighted song lyrics."
[67] In the lawsuit, the plaintiffs supported their allegations of copyright violations by citing several examples of Anthropic's Claude model outputting copied lyrics from songs such as Katy Perry's "Roar" and Gloria Gaynor's "I Will Survive".
[67] Additionally, the plaintiffs alleged that even for some prompts that did not directly name a song, the model responded with modified lyrics based on the original works.
A separate suit, brought by authors Kirk Wallace Johnson, Andrea Bartz and Charles Graeber, claims Anthropic fed its LLMs pirated copies of their work.