Amazon has announced a substantial investment of up to $4 billion in Anthropic, an artificial intelligence (AI) startup, marking its strategic foray into the burgeoning generative AI market. This decision comes amid growing competition in this sector, dominated by OpenAI's ChatGPT.
In this partnership, Anthropic will harness the power of Amazon Web Services (AWS), employing AWS Trainium and Inferentia chips for the development, training, and deployment of its future foundation models. By leveraging AWS's infrastructure, Anthropic aims to benefit from its price efficiency, high performance, scalability, and robust security features. Furthermore, the collaboration extends to the joint development of upcoming Trainium and Inferentia technology.
Amazon CEO Andy Jassy expressed enthusiasm about the collaboration, emphasizing the potential of AWS offerings such as Amazon Bedrock, a managed service that lets companies build generative AI applications on top of foundation models. AWS Trainium, Amazon's AI training chip, complements this effort. AWS will become Anthropic's primary cloud provider for mission-critical workloads, including safety research and the development of future foundation models.
Anthropic intends to rely predominantly on AWS for running its workloads, tapping into the advanced cloud technology offered by the world's leading cloud provider.
The collaboration goes beyond financial investments. Amazon's developers and engineers will have access to Anthropic models through Amazon Bedrock, allowing them to integrate generative AI capabilities into their projects, enhance existing applications, and create entirely new customer experiences across Amazon's diverse business ventures.
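For developers, access to Anthropic models through Bedrock typically takes the form of a standard AWS SDK call. The snippet below is a minimal sketch using Python's boto3 `bedrock-runtime` client with Claude 2's text-completion request format; the model ID, region, prompt, and parameter values are illustrative assumptions and may differ by account, region, and model version.

```python
import json
import boto3

# Bedrock runtime client; region availability is an assumption.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Claude 2 on Bedrock expects a prompt wrapped in the
# "\n\nHuman: ... \n\nAssistant:" format (illustrative values).
body = json.dumps({
    "prompt": "\n\nHuman: Summarize the Amazon-Anthropic partnership "
              "in one sentence.\n\nAssistant:",
    "max_tokens_to_sample": 300,
    "temperature": 0.5,
})

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",   # illustrative Claude 2 model ID
    contentType="application/json",
    accept="application/json",
    body=body,
)

# The response body is a stream containing a JSON payload with the completion.
print(json.loads(response["body"].read())["completion"])
```

Because Bedrock exposes models behind a single managed API, a team can swap model IDs or adjust generation parameters without changing how the rest of the application calls the service.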
Dario Amodei, co-founder and CEO of Anthropic, expressed excitement about using AWS's Trainium chips for developing future foundation models. This extended partnership aims to unlock fresh possibilities for organizations of all sizes by combining Anthropic's cutting-edge AI systems with AWS's state-of-the-art cloud technology.
One of Anthropic's flagship models, Claude 2, exhibits strong performance on standardized benchmarks, scoring above the 90th percentile on the GRE reading and writing exams and around the median of test takers on quantitative reasoning.
In terms of its cloud offerings, AWS continues to provide a wide array of computing options, including NVIDIA GPU-based compute instances and its own silicon: AWS Trainium for AI training and AWS Inferentia for AI inference. At the middle layer of its generative AI stack, AWS focuses on giving customers an extensive selection of foundation models from various leading providers.
As part of this partnership, customers will gain early access to features allowing them to customize Anthropic models, create private models using proprietary data, and utilize fine-tuning capabilities through a self-service feature within Amazon Bedrock.
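In practice, model customization on Bedrock is started as a managed job against a base model, with training data and outputs kept in the customer's own S3 buckets. The sketch below uses boto3's Bedrock control-plane client; the base-model identifier, role ARN, bucket paths, and hyperparameters are placeholders, and whether a given Anthropic model supports self-service fine-tuning under this API is an assumption based on the early-access features described above.

```python
import boto3

# Bedrock control-plane client (distinct from the runtime client);
# every identifier below is a placeholder, not a real resource.
bedrock = boto3.client("bedrock", region_name="us-east-1")

job = bedrock.create_model_customization_job(
    jobName="claude-customization-demo",
    customModelName="my-private-claude",
    roleArn="arn:aws:iam::123456789012:role/BedrockCustomizationRole",
    baseModelIdentifier="anthropic.claude-v2",  # assumed fine-tunable base model
    trainingDataConfig={"s3Uri": "s3://my-bucket/train.jsonl"},
    outputDataConfig={"s3Uri": "s3://my-bucket/output/"},
    hyperParameters={"epochCount": "2"},
)

# The returned ARN can be used to poll job status and, once complete,
# to invoke the resulting private model through Bedrock.
print(job["jobArn"])
```

Because the proprietary training data never leaves the customer's storage and the resulting model is private to the account, this flow is what the announcement's "private models using proprietary data" refers to.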
Inputs from IANS