TECHnalysis Research Blogs
TECHnalysis Research president Bob O'Donnell publishes commentary on current tech industry trends every week in the TECHnalysis Research Insights Newsletter on LinkedIn.com, and those blog entries are reposted here. The columns are also reprinted on Techspot and SeekingAlpha.

He also writes a regular column in the Tech section of USAToday.com, and those columns are posted here. Some of the USAToday columns are also published on partner sites, such as MSN.

In addition, he writes a 5G-focused column for Forbes, which is archived here, and occasionally contributes guest columns to various publications, including RCR Wireless, Fast Company and Engadget. Those columns are reprinted here.

April 23, 2024
Amazon Web Services Expands Bedrock GenAI Service

By Bob O'Donnell

One of the many interesting aspects of the Generative AI phenomenon is the huge range and diversity of vendors that are offering solutions to leverage this impressive new technology. Companies eager to deploy GenAI have to wade through a morass of foundation model suppliers, AI platform companies, data management vendors, model customization tool purveyors and more.

Surprisingly, the big cloud computing companies—which have dominated the IT landscape for the last decade—haven’t played as central a role as many initially expected. At least, not yet. But there are certainly signs that situation could be changing. Google recently held an impressive Cloud Next event (see “Google Integrates Gemini GenAI Into Workspace” for more) where the company unveiled a wide range of GenAI-powered capabilities.

Now, Amazon’s AWS division is taking the wraps off a host of new features and improvements for Bedrock, its fully managed GenAI service, all designed to make selecting and deploying the right tools for GenAI applications much easier. Specifically, Amazon is adding the ability to import customized foundation models into the service, allowing companies to leverage Bedrock’s capabilities across those custom models.

For example, companies that have trained an open-source model like Llama or Mistral on their own data—potentially with Amazon’s own SageMaker model development tool—can now integrate that customized model alongside the existing standardized models within Bedrock. As a result, organizations can use a single API to build applications that tap into their customized models as well as existing Bedrock model options, including the latest from AI21 Labs, Anthropic, Cohere, Meta and Stability AI, as well as Amazon’s own Titan models. In fact, Amazon also introduced version 2 of its Titan Text Embeddings model, which has been specifically optimized for RAG (Retrieval Augmented Generation) applications. The company also announced the general availability of its Titan Image Generator model.
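The single-API idea can be sketched as follows. This is a minimal illustration assuming the boto3 SDK; the model IDs and the custom-model ARN below are hypothetical examples, not values from AWS documentation, and the request body schema shown is the Anthropic-family format (body schemas vary by model family even though the call shape stays the same).

```python
import json

# Hypothetical identifiers for illustration only.
ANTHROPIC_MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"
CUSTOM_MODEL_ARN = "arn:aws:bedrock:us-east-1:123456789012:imported-model/example"

def build_invoke_kwargs(model_id: str, prompt: str, max_tokens: int = 256) -> dict:
    """Build the arguments for a Bedrock runtime invoke_model call.

    The same call shape works whether model_id names a first-party
    Bedrock model or the ARN of an imported custom model.
    """
    body = {
        "anthropic_version": "bedrock-2023-05-31",  # Anthropic-family body schema
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }
    return {
        "modelId": model_id,
        "contentType": "application/json",
        "accept": "application/json",
        "body": json.dumps(body),
    }

kwargs = build_invoke_kwargs(ANTHROPIC_MODEL_ID, "Summarize our Q1 results.")

# With AWS credentials configured (not executed here):
# import boto3
# client = boto3.client("bedrock-runtime")
# response = client.invoke_model(**kwargs)
```

Swapping `ANTHROPIC_MODEL_ID` for `CUSTOM_MODEL_ARN` is the point of the announcement: the application code doesn't change when the model does.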

Speaking of RAG, one of the other benefits of importing custom models into Bedrock is the ability to use the service’s integrated RAG functions. This lets companies keep their custom models’ responses grounded in fresh data via this increasingly popular technique. Because it’s serverless, Bedrock also includes built-in features for seamlessly scaling model performance across AWS instances, allowing companies to more easily manage real-time demand as the situation requires. In addition, for organizations looking into building AI-powered agents capable of performing multi-step tasks, Bedrock features tools that let developers create them and tap into their customized models as they do so. Agents are currently one of the hottest discussion topics in GenAI, so these kinds of capabilities are bound to be of interest to organizations that want to stay on the cutting edge.
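For readers new to the technique, the RAG pattern that Bedrock manages can be shown in miniature. This is a toy sketch, not Bedrock’s actual API: the keyword-overlap retriever stands in for the service’s managed retrieval, and in a real deployment the assembled prompt would be sent to a foundation model.

```python
def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Toy retriever: rank documents by keyword overlap with the query."""
    q_terms = set(query.lower().split())
    ranked = sorted(
        documents,
        key=lambda d: len(q_terms & set(d.lower().split())),
        reverse=True,
    )
    return ranked[:top_k]

def build_rag_prompt(query: str, documents: list[str]) -> str:
    """Augment the user's question with retrieved context before generation."""
    context = "\n".join(retrieve(query, documents))
    return (
        "Use only the context below to answer.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )

docs = [
    "Bedrock is a managed service for foundation models.",
    "Titan Image Generator creates images from text.",
    "RAG augments prompts with retrieved enterprise data.",
]
prompt = build_rag_prompt("What does RAG add to prompts?", docs)
```

The appeal for enterprises is that new documents added to the corpus show up in responses immediately, with no retraining of the underlying model.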

On top of these capabilities, Amazon announced two additional ones, both of which can be extended to existing Bedrock models and custom imported models alike. The new Guardrails for Amazon Bedrock adds a set of filtering features to prevent the creation and release of inappropriate or harmful content, as well as personally identifiable and/or sensitive information. As Amazon notes, virtually all models already incorporate some degree of content filtering, but the new Guardrails provide an additional layer of customizable protection to help companies further guard against these kinds of issues and ensure that generated content conforms with the customer’s guidelines.
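In practice, attaching a guardrail is a matter of referencing it when invoking a model. The sketch below assumes the boto3 runtime API accepts guardrail parameters on `invoke_model`; the guardrail ID and version shown are hypothetical placeholders for values you would get when creating a guardrail in the Bedrock console.

```python
import json

def build_guarded_invoke_kwargs(model_id: str, prompt: str,
                                guardrail_id: str, guardrail_version: str) -> dict:
    """Build an invoke_model request that routes through a Guardrails policy.

    The guardrail filters both the incoming prompt and the generated
    output against the policies configured for that guardrail.
    """
    return {
        "modelId": model_id,
        "guardrailIdentifier": guardrail_id,
        "guardrailVersion": guardrail_version,
        "contentType": "application/json",
        "accept": "application/json",
        "body": json.dumps({"inputText": prompt}),  # Titan-family body schema
    }

kwargs = build_guarded_invoke_kwargs(
    "amazon.titan-text-express-v1",
    "Draft a product announcement.",
    guardrail_id="gr-example123",   # hypothetical ID
    guardrail_version="1",
)
# With credentials configured (not executed here):
# boto3.client("bedrock-runtime").invoke_model(**kwargs)
```

Because the guardrail is specified per request rather than baked into the model, the same policy can be applied uniformly across first-party and custom imported models.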

In addition, Amazon’s Model Evaluation tool within Bedrock is now generally available. This tool helps organizations find the best foundation model for the particular task they’re trying to achieve or application they’re trying to build. The evaluator compares standard characteristics such as accuracy and robustness of responses across different models, and it allows for customization of several key criteria. Companies can, for example, upload their own data or a set of custom prompts and then generate a report comparing how different models fared against their specific needs. Amazon also offers a mechanism for letting humans evaluate different models’ outputs on subjective measurements such as brand voice, style, etc. Model Evaluation is an important capability because, while many companies may initially be attracted to an open model platform like Bedrock thanks to the range of choices it offers, those same choices can quickly become confusing and overwhelming.
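The core of such an evaluation is simple to illustrate. This is a stand-in sketch, not Bedrock’s Model Evaluation API: the "models" here are local callables rather than foundation models invoked through the service, and exact-match accuracy stands in for the richer automatic and human-review metrics Bedrock offers.

```python
from typing import Callable

def evaluate(models: dict[str, Callable[[str], str]],
             dataset: list[tuple[str, str]]) -> dict[str, float]:
    """Score each model by exact-match accuracy on (prompt, expected) pairs."""
    scores = {}
    for name, model in models.items():
        correct = sum(1 for prompt, expected in dataset
                      if model(prompt) == expected)
        scores[name] = correct / len(dataset)
    return scores

# A tiny custom-prompt dataset, like one a company might upload.
dataset = [("2+2", "4"), ("capital of France", "Paris")]

# Stand-in "models" with canned answers, for illustration only.
models = {
    "model-a": lambda p: {"2+2": "4", "capital of France": "Paris"}.get(p, ""),
    "model-b": lambda p: {"2+2": "4"}.get(p, ""),
}

report = evaluate(models, dataset)  # e.g. {"model-a": 1.0, "model-b": 0.5}
```

The value of the managed service is that it runs this comparison across real foundation models at scale and packages the results into a report, so teams don’t have to wire up the harness themselves.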

We’re still in the early days of GenAI deployments and many organizations are just starting to figure out what their strategy and implementation approach is going to be. One point that’s become clear, however, is that many companies have begun to recognize how important it is to have their GenAI software and services near their data sources. Given how much data is located within the AWS cloud, a lot of those organizations are going to look kindly at these new enhanced AWS capabilities as a result. For companies that have a lot of data stored within AWS (for use in training or fine-tuning their GenAI models with technologies like RAG), these new Bedrock features could help entice them to jumpstart their GenAI application journeys.

We may also start to see the growth of multi-platform GenAI deployments. Just as companies learned that using multiple cloud providers made the most economic, logistical and technical sense, we’re likely to see them take a similar approach and leverage multiple GenAI platforms for different types of applications. The race is still on, but it’s clear that all the major cloud computing providers want to be (and will be) important entrants as well.

Here’s a link to the original column: https://seekingalpha.com/article/4685492-amazon-web-services-expands-bedrock-genai-service

Bob O’Donnell is the president and chief analyst of TECHnalysis Research, LLC, a market research firm that provides strategic consulting and market research services to the technology industry and professional financial community. You can follow him on LinkedIn at Bob O’Donnell or on Twitter @bobodtech.
