AWS ups its cost-visibility game with free-to-use offering for Kubernetes clusters
The new tool should help users make better cost-based decisions in their cloud environments


AWS has unveiled a new offering that allows customers to allocate application costs to individual business units or teams based on the resources used by their Kubernetes apps.
The tool, dubbed Split Cost Allocation Data, will provide “granular cost visibility” for the firm’s Elastic Kubernetes Service (EKS), enabling users to analyze and optimize Kubernetes applications, as well as chargeback cost and usage.
“You can’t simply allocate the cost of a resource, such as an EC2 instance, to a tag or label, because the EC2 instance may be running multiple containers, with each supporting a different application,” Shubir Kapoor and Mihir Surani said in a post last week. “The resources also may be attached to different cost centers around the organization.”
Citing a 2023 Cloud Native Computing Foundation (CNCF) report, AWS says the new capability will address the fact that almost 40% of businesses are forced to estimate their Kubernetes costs for lack of effective data.
“Estimating how many resources your clusters will need is not an exact science,” Edith Puclla, technology evangelist at Percona, told ITPro. “The problem comes when you either try to apply a one-size-fits-all policy for each container, which can be wasteful, or when you don’t go back and check that your initial assumptions were correct,” she added.
“Incorporating monitoring and automation tools can help dynamically adjust resources based on real-time demand, reducing the necessity for constant manual adjustments.”
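The feedback loop Puclla describes can be sketched in a few lines: compare each container's observed peak usage against its resource request and flag the containers whose initial sizing assumptions no longer hold. The function name, data shape, and 50% threshold below are illustrative assumptions, not taken from any specific tool.

```python
# Hypothetical sketch of a right-sizing check: flag containers whose
# observed peak CPU usage is well below their CPU request, suggesting
# the initial one-size-fits-all sizing was wasteful.
# All names and the threshold are illustrative assumptions.

def flag_overprovisioned(containers, threshold=0.5):
    """containers: list of dicts with 'name', 'cpu_request' (vCPU), and
    'cpu_peak' (observed peak vCPU). Returns names of containers using
    less than `threshold` of their request even at peak."""
    return [
        c["name"]
        for c in containers
        if c["cpu_peak"] < threshold * c["cpu_request"]
    ]

flagged = flag_overprovisioned([
    {"name": "frontend", "cpu_request": 2.0, "cpu_peak": 0.4},   # over-provisioned
    {"name": "batch-job", "cpu_request": 1.0, "cpu_peak": 0.9},  # well sized
])
# flagged == ["frontend"]
```

In practice this kind of check would be driven by monitoring data (e.g. usage percentiles over a window) rather than a single peak figure, but the principle, revisiting requests against observed demand, is the same.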
Split Cost Allocation Data is available to customers in all AWS commercial regions, excluding China, for no additional cost, the company has confirmed.
Cost visibility at the pod level
With AWS’ new platform, costs can be distributed at the “Kubernetes pod level,” meaning costs more accurately reflect the CPU and memory actually consumed by individual pods.
It also provides granular information at the container level, helping users find cost efficiencies within specific containerized applications and giving them a way to charge costs back to business entities.
Once enabled, the tool scans a customer’s entire suite of Kubernetes pods, noting attributes such as namespace, node, and cluster, as well as CPU and memory requests.
Alternatively, the tool can pull metrics from a user's Amazon Managed Service for Prometheus workspaces to assess pod-level costs; the metrics it assesses include split cost, unused cost, actual usage, and reserved usage.
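To make the "split cost" and "unused cost" ideas concrete, here is a minimal sketch of proportional cost allocation: an EC2 instance's hourly cost is split across the pods it runs in proportion to their CPU and memory reservations, and the cost of idle capacity is redistributed the same way. This is an illustrative model only, not AWS's exact formula, and all names and figures are assumptions.

```python
# Illustrative sketch (not AWS's exact formula): split an EC2 instance's
# hourly cost across its pods proportional to each pod's CPU and memory
# reservations, then spread the unused-capacity cost the same way.

def split_instance_cost(instance_cost, instance_cpu, instance_mem, pods):
    """pods: list of dicts with 'name', 'cpu' (vCPU request), 'mem' (GiB request).
    Returns {pod_name: {'split_cost': ..., 'unused_cost': ...}}."""
    # Weight each pod by the average of its CPU and memory share of the instance.
    weights = {
        p["name"]: (p["cpu"] / instance_cpu + p["mem"] / instance_mem) / 2
        for p in pods
    }
    reserved_share = sum(weights.values())         # fraction of instance reserved
    unused_share = max(0.0, 1.0 - reserved_share)  # idle capacity fraction
    result = {}
    for name, w in weights.items():
        split = instance_cost * w
        # Redistribute the idle-capacity cost proportionally to reservations.
        unused = instance_cost * unused_share * (w / reserved_share)
        result[name] = {"split_cost": round(split, 4),
                        "unused_cost": round(unused, 4)}
    return result

# Example: a $0.40/hr instance with 4 vCPU / 16 GiB running two pods.
costs = split_instance_cost(
    0.40, 4, 16,
    [{"name": "api", "cpu": 1.0, "mem": 4.0},
     {"name": "worker", "cpu": 1.0, "mem": 2.0}],
)
```

The key property is that split cost plus unused cost for all pods sums back to the full instance cost, which is what makes per-team chargeback add up cleanly.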
“The visibility of EKS cost data provides insights to improve efficiency and cost optimization,” Kapoor and Surani said.
Other players in the Kubernetes cost game
AWS’ announcement comes at a time when other market players are beginning to offer similar cost analysis tools.
Kubecost, an alternative Kubernetes cost-monitoring platform, released Kubecost 2.0 earlier this year. The company, which aims to put open source “at the center” of its business model, unveiled a raft of new features with the release, including machine learning (ML)-based spending forecasts and automated workflows dubbed Kubecost Actions.
While both Kubecost 2.0 and AWS’ new cost allocation platform have comparable features, Kubecost does have the added benefit of integrating with other major cloud provider environments such as Microsoft’s Azure or Google Cloud Platform (GCP).

George Fitzmaurice is a former Staff Writer at ITPro and ChannelPro, with a particular interest in AI regulation, data legislation, and market development. After graduating from the University of Oxford with a degree in English Language and Literature, he undertook an internship at the New Statesman before starting at ITPro. Outside of the office, George is both an aspiring musician and an avid reader.