Google Cloud aims to remove data limits through BigLake
Tech giant also announced the formation of the Data Cloud Alliance initiative to help global businesses undergo digital transformation


Google Cloud has introduced a preview of BigLake, a data lake storage engine it said would remove data limits by unifying data lakes and warehouses.
The tech giant said that managing data across disparate lakes and warehouses creates silos and increases risk and cost, especially when data needs to be moved. Through BigLake, it hopes to allow companies to unify their data warehouses and lakes to analyse data without worrying about the underlying storage format or system. This, it said, eliminates the need to duplicate or move data from a source, reducing cost and inefficiency.
Google Cloud customers will gain access controls through BigLake, with an API interface spanning Google Cloud and open file formats like Parquet, as well as open-source processing engines like Apache Spark. The company said these capabilities extend a decade's worth of BigQuery innovations to data lakes on Google Cloud Storage, enabling a flexible and cost-effective open lakehouse architecture.
BigLake is set to be at the centre of the company's overall storage strategy, revealed Sudhir Hasbe, Google Cloud's data analytics product manager.
“We are going to make sure all of our tools and capabilities work seamlessly with BigLake,” he said. “Similarly, all of our analytics engines, whether it's BigQuery, whether it's our Spark engine, or whether it's our data flow engine, all of these engines will seamlessly integrate out of the box with BigLake.”
The company also announced the formation of the Data Cloud Alliance, a new initiative to ensure global businesses have more seamless access and insights into the data required for digital transformation.
The alliance comprises Google Cloud, Accenture, Confluent, Databricks, Dataiku, Deloitte, Elastic, Fivetran, MongoDB, Neo4j, Redis, and Starburst. The companies said they are committed to making data more portable and accessible across disparate business systems, platforms, and environments.
Google Cloud said that the proliferation of data means businesses increasingly need common digital data standards and a commitment to open data if they are to use data effectively in their digital transformation.
The Data Cloud Alliance is committed to accelerating the adoption of data analytics, artificial intelligence, and machine learning best practices across industries through common industry data models, open standards, and integrated processes.
The alliance's members will work together to reduce customer challenges and complexity around data governance, data privacy, data loss prevention, and global compliance. They will provide infrastructure, APIs, and integration support to ensure data portability and accessibility between multiple platforms and products across multiple environments, and will collaborate on new, common industry data models, processes, and platform integrations to further increase data portability.
Zach Marzouk is a former ITPro, CloudPro, and ChannelPro staff writer, covering topics like security, privacy, worker rights, and startups, primarily in the Asia Pacific and the US regions. Zach joined ITPro in 2017 where he was introduced to the world of B2B technology as a junior staff writer, before he returned to Argentina in 2018, working in communications and as a copywriter. In 2021, he made his way back to ITPro as a staff writer during the pandemic, before joining the world of freelance in 2022.