Amazon has officially launched Bedrock, a service that provides access to a range of generative AI models from Amazon and third-party partners through an API. The offering, first announced in early April, lets AWS customers build applications on top of generative AI models and customize them with their own proprietary data. Brands and developers can also use these models to create AI “agents” that automatically carry out tasks such as booking travel, managing inventory, and processing insurance claims.
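As a rough illustration of what “through an API” means in practice, here is a minimal sketch of invoking a Bedrock-hosted model with the AWS SDK for Python (boto3). The region, model ID, and request payload shown are assumptions for illustration only; each provider’s models define their own request format.

```python
import json
import boto3

# Minimal sketch (assumed region, model ID, and payload format): sending a
# prompt to a Bedrock-hosted model through the Bedrock runtime API.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="anthropic.claude-v2",  # assumed model identifier
    body=json.dumps({
        "prompt": "\n\nHuman: Draft a confirmation email for a hotel booking.\n\nAssistant:",
        "max_tokens_to_sample": 300,
    }),
)

print(json.loads(response["body"].read()))
```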
In the coming weeks, Llama 2, Meta's open-source large language model, will be integrated into Bedrock, joining a lineup that already includes models from AI21 Labs, Anthropic, Cohere, and Stability AI. Amazon claims Bedrock will be the first fully managed service to offer the Llama 2 models, specifically the 13-billion- and 70-billion-parameter variants. (Parameters are the parts of a model learned from training data and essentially determine its skill at tasks like generating text.) It should be noted, however, that Llama 2 is already available on several other cloud-based generative AI platforms, including Google’s Vertex AI.
In many respects, Bedrock parallels Vertex AI, which also offers its selection of both first-party and third-party models for developing generative AI applications. Nevertheless, Swami Sivasubramanian, VP of Data and AI at AWS, posits that Bedrock holds an edge due to its seamless integration with existing AWS services, such as AWS PrivateLink, which enables secure connectivity between Bedrock and a company's virtual private cloud.
While the claim has some merit, it may be more perception than objective fact: the size of that advantage depends on the specific customer and their existing cloud setup. Naturally, that isn't something Sivasubramanian is likely to concede.
“Over the past year, the rapid growth of data, scalable computing, and advancements in machine learning have significantly heightened interest in generative AI, leading to innovations that may revolutionize industries and reshape workflows,” stated Sivasubramanian in a press release. “Today’s launch marks a pivotal moment, bringing generative AI within reach of businesses of all sizes, from startups to enterprises, and catering to every role, from developers to data analysts.”
Also launched today is Titan Embeddings, Amazon's proprietary model that converts text into numerical representations known as embeddings, which power search and personalization applications. Titan Embeddings supports roughly 25 languages and handles text chunks or entire documents of up to 8,192 tokens (around 6,000 words), putting it on par with OpenAI's latest embeddings model.
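To make the embeddings use case concrete, here is a hedged sketch of generating a vector with Titan Embeddings through the same Bedrock runtime API; the model ID and response field name are assumptions for illustration.

```python
import json
import boto3

# Sketch (model ID and response field assumed): converting a piece of text
# into an embedding vector with Amazon's Titan Embeddings model.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="amazon.titan-embed-text-v1",  # assumed Titan Embeddings model ID
    body=json.dumps({"inputText": "Find pet-friendly hotels near the venue"}),
)

embedding = json.loads(response["body"].read())["embedding"]
print(len(embedding))  # dimensionality of the returned vector
```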
Bedrock's path to launch has not been entirely smooth. Bloomberg reported in May that, six weeks after Amazon showcased the technology in a somewhat vague presentation with few customer testimonials, many cloud customers still didn't have access. With today's announcements, along with a recent multibillion-dollar investment in AI startup Anthropic, Amazon is clearly positioning itself to make significant strides in the rapidly growing and lucrative generative AI market.