Amazon adds GPT-4-beating Claude 3 to Bedrock


Big news in the world of generative AI this morning: not only did San Francisco startup Anthropic release a new large language model (LLM), Claude 3, that appears to be the most powerful in the world to date — besting previous leaders OpenAI’s GPT-4 and Google’s Gemini Advanced on common benchmark tests — but its investor and partner Amazon has already added the new Claude 3 family of models to Bedrock, the Amazon Web Services (AWS) platform for building and running AI services in the cloud.

Anthropic announced three new Claude 3 models today — named Opus, Sonnet and Haiku, in descending order of intelligence. Intriguingly, the new models were trained on synthetic data, that is, data generated by AI itself rather than primarily by human authors, and the fact that they still top the benchmarks should quell some concerns about model collapse.

Customers of Bedrock, Amazon's fully managed AI service launched in early 2023 to offer a single application programming interface (API) for accessing multiple models, will have access to the middle-tier model, Claude 3 Sonnet, starting today, with Opus and Haiku “coming soon.”

[Image: Graph of intelligence vs. cost of Anthropic Claude 3 models. Credit: Anthropic]
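For developers, the single-API pitch translates into one runtime call. The sketch below shows how a Claude 3 Sonnet request through Bedrock might look with boto3; the model identifier, version string, and response shape follow Anthropic's Messages format as documented for Bedrock at launch, and should be checked against the current AWS documentation rather than taken as definitive.

```python
# Minimal sketch: calling Claude 3 Sonnet through the Bedrock runtime API with boto3.
# The model ID and request body follow Anthropic's Messages format for Bedrock as
# documented at launch; verify both against current AWS docs before relying on them.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",  # assumed version string for Claude 3 on Bedrock
    "max_tokens": 512,
    "messages": [
        {"role": "user", "content": "Summarize the benefits of a managed model API in two sentences."}
    ],
})

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumed launch identifier for Claude 3 Sonnet
    body=body,
)

# The response body is a stream of JSON; the generated text sits in the first content block.
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```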

Pricing for Claude 3 on Bedrock

Amazon offers several pricing models to Bedrock customers, not only for Claude 3 Sonnet but for the many other foundation LLMs available through the managed service. For Claude 3 Sonnet in particular, here's a snapshot of the pricing taken directly from Amazon's site.


On a per-1,000-token basis, Claude 3 Sonnet is more expensive than Claude Instant, an older, smaller and less powerful (but also less resource-intensive) LLM, while it is less expensive than Claude 2, Anthropic's previous flagship model.

[Screenshot: Amazon Bedrock per-1,000-token pricing for Claude models]

Access to Claude 3 Sonnet can also be paid for on an hourly basis, where it is more expensive to run than both Claude Instant and Claude 2.

[Screenshot: Amazon Bedrock hourly pricing for Claude models]

In terms of pricing compared to other available foundation LLMs on Bedrock, Claude 3 Sonnet is among the most expensive on the platform.
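To make the per-1,000-token metering concrete, here is a minimal sketch of how a single request's on-demand cost could be estimated; the rates in it are hypothetical placeholders, not Amazon's published prices, which appear in the tables above.

```python
# Minimal sketch: estimating on-demand Bedrock cost from token counts.
# The rates below are hypothetical placeholders for illustration only;
# substitute the per-1,000-token prices published on the Bedrock pricing page.
HYPOTHETICAL_INPUT_RATE = 0.003    # USD per 1,000 input tokens (placeholder)
HYPOTHETICAL_OUTPUT_RATE = 0.015   # USD per 1,000 output tokens (placeholder)

def estimate_request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated on-demand cost of one model invocation in USD."""
    return (input_tokens / 1000) * HYPOTHETICAL_INPUT_RATE + \
           (output_tokens / 1000) * HYPOTHETICAL_OUTPUT_RATE

# Example: a 2,000-token prompt that produces a 500-token answer.
print(f"${estimate_request_cost(2_000, 500):.4f}")
```

Under the hourly option, by contrast, the cost is simply the number of provisioned hours multiplied by the hourly rate, independent of token volume.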

Why Claude 3 on Bedrock matters

The addition of Claude 3 is notable because Amazon announced a $4 billion investment in Anthropic last year, a deal now under scrutiny as part of a wider U.S. Federal Trade Commission investigation into anticompetitive practices in the AI industry.

Yet the Amazon and Anthropic tie-up is hardly exclusive: customers can access Claude 3 outside of Amazon natively on Anthropic’s website, and Amazon offers access to many other LLMs from Bedrock, including those offered by AI21 Labs, Cohere, Meta, Mistral, Stability AI, and Amazon itself.

In fact, Amazon announced the addition of the French startup Mistral's open-source 7B and Mixtral 8x7B models to Bedrock just two weeks ago, only to have Mistral announce a brand new, closed model, Mistral Large, along with a partnership and investment from Microsoft that will keep Mistral Large restricted to Amazon's arch-rival in the cloud wars, Microsoft Azure, and to Mistral's own website. The rapid forging of these alliances and AI offerings among the big cloud providers — AWS, Microsoft Azure, and, to a lesser extent, Google Cloud — shows just how competitive the market is becoming for attaching cutting-edge AI models and APIs to cloud services.

In a blog post today, Dr. Swami Sivasubramanian, Vice President of Data and AI at AWS, expressed enthusiasm about the collaboration with Anthropic and the potential it unlocks for AWS customers, stating:

“Our customers and partners continue to be excited by the advanced applications they can build with Claude on Amazon Bedrock, and the unmatched ability they have to quickly, securely, and responsibly deploy generative AI applications, using differentiated capabilities like knowledge bases, guardrails, and model evaluation, into production to provide new experiences for their end users. The easy access to Claude on Amazon Bedrock has led many of the world’s hottest startups, leading enterprise businesses, and government organizations to choose the managed service for deploying their generative AI applications, and we look forward to this accelerating following today’s news.”

Part of a broader strategy for gen AI leadership

This announcement is part of AWS's broader strategy to lead in generative AI by investing across all layers of the stack: infrastructure, models, and user-facing applications. The company aims to make it easier for customers to leverage AI in a more efficient, extensive, and integrated manner, thereby accelerating innovation and delivering new experiences for end users.

Amazon says more than 10,000 organizations worldwide are already using Amazon Bedrock to explore and deploy generative AI applications.

Meanwhile, rumors persist that OpenAI will soon fire back with its own answer to Claude 3, GPT-5, possibly as early as today. We’ll keep you posted on that front.



