AWS targets easier AI adoption at AWS Summit London
Through its Bedrock and Amazon Q services, AWS has committed itself to AI accessibility at the enterprise level


AWS is shoring up its position in the AI landscape by setting its sights on making AI adoption easier for enterprises, acting as both a supplier and a developer of enterprise AI tools.
The keynote talks at AWS Summit London made this mission clear, with executives from the company taking to the stage to tout AWS’ impressive AI-ready infrastructure.
At just an hour long, the keynote address was short but sweet, and the firm illuminated some of the ways it's speeding up the process of AI adoption.
“All these tools are about accelerating adoption of technology,” Tanuja Randery, VP and managing director of EMEA at AWS Europe, tells the event’s crowded auditorium.
How AWS is preparing to accelerate AI adoption can be understood through two of its key offerings: the foundational support of its AI model platform, Amazon Bedrock, and its enterprise-grade chatbot, Amazon Q.
While the firm positions itself as the industry middleman with Bedrock, granting customers access to big-name models, it also displays its independence by showing off a powerful homegrown solution.
While there was little in the way of a fresh release at AWS Summit London, the audience certainly left with a better sense of the AWS products already making headway in the market.
Speaking to a packed auditorium – so packed that many attendees weren’t even able to make it inside for the keynote – AWS makes a subtle but effective show of strength, leveraging its immense infrastructure to show how readily it can capitalize on the tangible business interest in generative AI.
AWS wants to be the industry middleman
As the largest public cloud provider on the planet and a firm well-equipped to offer its customers powerful solutions, AWS could rely on simply reminding those in attendance of its own cloud prestige.
“We were a pioneer of the cloud, turning technologies like networking and storage, databases, and computing into programmable resources,” Francessca Vasquez, VP of professional services and the generative AI Innovation Center, reminds the audience.
It’s in this vein that Vasquez begins to talk about Amazon Bedrock, AWS’ managed service platform for AI which comes equipped with foundation models from the likes of Mistral and Meta.
The attraction of Bedrock is obvious. It makes the process of using generative AI far simpler for enterprises, as companies gain access to a huge range of AI models for them to choose from, depending on the use case.
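That model catalogue is queryable programmatically. The sketch below is an illustration rather than AWS sample code: it assumes boto3 is installed and that valid AWS credentials with Bedrock access are configured, and it uses the real `ListFoundationModels` operation to narrow the catalogue to one provider's models.

```python
# Hedged sketch: browsing Bedrock's model catalogue and filtering it by
# provider, as an enterprise might when picking a model for a use case.
from typing import Dict, List


def models_by_provider(summaries: List[Dict], provider: str) -> List[str]:
    """Filter a ListFoundationModels response down to one provider's model IDs."""
    return [m["modelId"] for m in summaries if m.get("providerName") == provider]


def list_bedrock_models(provider: str) -> List[str]:
    # boto3 is imported lazily so the pure filtering helper above can be
    # used (and tested) without the AWS SDK installed.
    import boto3

    client = boto3.client("bedrock")
    response = client.list_foundation_models()
    return models_by_provider(response["modelSummaries"], provider)
```

Calling `list_bedrock_models("Meta")`, for instance, would return the Llama model IDs enabled in the account, illustrating the "huge range of models" point in concrete terms.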
“It is the easiest way to build and scale generative AI applications with large language models (LLMs) and other foundation models,” Vasquez says.
“Customers in virtually every single industry are using Amazon Bedrock to reinvent their user experiences, products, and processes,” she adds.
AWS has put a lot of effort into making Amazon Bedrock as appealing a package as possible. Since the platform was first unveiled AWS has emphasized customer choice and stressed the importance of providing a wide range of third-party LLMs including several high-profile open LLMs.
Amazon Bedrock includes models from the likes of Anthropic, in which AWS has invested $4 billion, including the startup’s new ChatGPT-challenging flagship Claude 3. Just before AWS Summit London, the firm announced that Meta’s Llama 3, one of the strongest open models on the market, had been added to Bedrock.
“We at AWS believe no one model will rule them all - we're still in the early days of generative AI, and these models will continue to evolve at unprecedented speed,” Vasquez says.
“That's why customers need the flexibility to use different models at different times,” she says.
Here, flexibility gives AWS the edge, allowing both the firm and its customers to hedge their bets in the generative AI race. Neither AWS nor the enterprises using it need to commit to any single AI startup.
AWS has also tapped into one of the fundamental pain points of generative AI in the enterprise, namely that leaders often don’t know how to implement it or which AI models would work best for them. That’s why Bedrock is so attractive: it offers a solution to that anxiety.
“In 2023, I saw a lot of companies really grappling with generative AI,” Vasquez says. “They could see the potential and began experimenting. But it's hard. It's hard to move experimental proof of concepts into production.”
“We can build services so that you can make this leap,” she adds. “We are making it easy for you to be able to build and scale generative AI.”
AWS shores up its own AI offerings
It’s important to remember that AWS was also very much showcasing its own generative AI chatbot, Amazon Q, which it celebrated in tandem with the offerings of Bedrock.
AWS’ approach, Vasquez explains, is to look at the three key layers of the generative AI stack, which comprise the bottom layer of training infrastructure and the middle layer occupied by Amazon Bedrock.
“At the top layer, we build applications by leveraging foundation models and LLMs so that you can take advantage of generative AI quickly and without any specialized knowledge,” Vasquez says.
This want of “specialized knowledge” is key, as the firm commits itself to breaking down this barrier to AI adoption. AWS’ customers want to reap the rewards of generative AI “quickly and easily”, Vasquez says, but “without the need for any machine learning (ML) expertise”.
Offering aid at “every single step of the development lifecycle”, Amazon Q is the firm’s answer to enterprise AI. It combines data analysis and natural language input in a single source of information that suits any level of AI expertise.
After illuminating some of the tool's possible use cases, Vasquez is also eager to mention just how well Amazon Q works with Bedrock, which allows generative AI to be added to any application through its unified application programming interface (API).
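That unified API claim can be made concrete with a short sketch. The snippet below is an assumption-laden illustration rather than AWS sample code: the model ID is illustrative, the request body follows Anthropic’s Claude messages schema (other providers use different schemas behind the same `InvokeModel` call), and running the full request requires AWS credentials with Bedrock model access.

```python
# Hedged sketch of calling a Bedrock-hosted model through the unified
# bedrock-runtime API in boto3. Model ID and schema are illustrative.
import json


def build_request(prompt: str) -> str:
    """Serialize an Anthropic-messages request body for invoke_model."""
    return json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": prompt}],
    })


def ask_model(prompt: str,
              model_id: str = "anthropic.claude-3-sonnet-20240229-v1:0") -> str:
    # Requires AWS credentials and Bedrock model access; boto3 is imported
    # lazily so build_request stays usable without the SDK installed.
    import boto3

    client = boto3.client("bedrock-runtime")
    resp = client.invoke_model(modelId=model_id, body=build_request(prompt))
    payload = json.loads(resp["body"].read())
    return payload["content"][0]["text"]
```

Swapping in a different `model_id` is, in principle, all it takes to route the same application to a different provider’s model, which is the flexibility Vasquez describes.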
In the first instance, AWS is making AI more accessible through Bedrock. In the second, it is turning its attention to accessibility at the level of generative AI use through Amazon Q.
How successful this steady approach proves to be will rest entirely with its customers. Landing somewhere between the focused, OpenAI-reliant approach of Microsoft and the ‘AI everywhere’ approach of Google Cloud, AWS has fallen back on its established cloud dominance and wide model garden.

George Fitzmaurice is a former Staff Writer at ITPro and ChannelPro, with a particular interest in AI regulation, data legislation, and market development. After graduating from the University of Oxford with a degree in English Language and Literature, he undertook an internship at the New Statesman before starting at ITPro. Outside of the office, George is both an aspiring musician and an avid reader.