BBC: Making use of the cloud to transcode video and reduce costs


The BBC’s iPlayer service has become a firm favourite since it launched in 2007, and it has had to evolve as demand has grown. It is now the most popular video-on-demand service in the UK: iPlayer served up 36.5 billion minutes of content, with help from the cloud, in 2012 alone.

One of the most notable changes during this time has been the increase in viewers using tablets and smartphones to consume media. Add to this special events such as Glastonbury and the Olympics, and the service soon began to run into problems on its traditional infrastructure.

The organisation was using a ground-based hardware system that was put together around five years ago, long before the rise in popularity of mobile devices, according to Stephen Godwin, senior technical architect at the BBC.

“That system was built at a time we weren’t [focused on] smartphones or tablets. We were constantly running into limitations of the system as we added these new things in,” he says.

He says that the old system would run out of disk storage and run out of I/O on the disk or on the network. “Those are really difficult things to upgrade and really difficult limits to remove,” Godwin adds.

The cloud gave the BBC an opportunity to scale up to meet demand as well as being able to effectively and efficiently serve these new form factors. While scalability was a key factor, it was also important to have transparency around the costs involved so that the organisation could anticipate exactly how much it would have to pay to serve greater volumes of content.

In late 2012, Godwin’s team embarked on a project that looked to cloud computing to help it achieve its aims. The project, dubbed Video Factory, focused on the cloud as it was believed it could provide both the levels of scalability and reliability that would enable it to handle spikes in content loading and demand without issue.

“The transcode infrastructure became the bottleneck for what we could make available online. We were making decisions about what we made available online based on the transcode system, which is the wrong way around,” Godwin says.

“The old system has single points of failure. We wanted to move to a model where we had the resiliency of the broadcast chain.”

With these goals in mind, the BBC kicked off the project with an 18-strong development team. The project used Amazon EC2 instances alongside Amazon’s Simple Queue Service (SQS) and S3, the latter of which is mainly used for media storage. Elemental Cloud was also deployed for video transcoding in the project, which took less than a year to complete. The old system was switched off in September 2013.

Video Factory comprises four major components: Mezzanine Capture, the Time-Addressable Media Store, Playout Data, and Transcoding.

The Mezzanine Capture component uses a broadcast-grade encoder to take in 24 channels. This is passed through something called a “chunker” to create 80MB files. Content is stored at 10Mbps for standard definition broadcasts and 30Mbps for HD.
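As an illustration only, a chunker at this stage might look something like the following Python sketch, which splits a captured file into fixed 80MB segments and writes each one to S3. The bucket, key layout and function names are hypothetical, not the BBC’s actual implementation.

```python
import boto3

CHUNK_SIZE = 80 * 1024 * 1024  # 80MB chunks, as described above

s3 = boto3.client("s3")

def chunk_capture(capture_path, bucket, channel, programme_id):
    """Split a captured mezzanine file into fixed-size chunks and store them in S3.

    `bucket`, `channel` and `programme_id` are hypothetical names used purely
    for illustration.
    """
    with open(capture_path, "rb") as capture:
        index = 0
        while True:
            chunk = capture.read(CHUNK_SIZE)
            if not chunk:
                break
            key = f"mezzanine/{channel}/{programme_id}/chunk-{index:06d}.ts"
            s3.put_object(Bucket=bucket, Key=key, Body=chunk)
            index += 1
```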

The Time-Addressable Media Store pulls these chunks together to create a file that covers an entire programme. This is all done within the S3 infrastructure. It takes less than a minute to assemble a full programme, according to Godwin.
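Because S3 supports multipart uploads whose parts are copied server-side, chunks of this size can be stitched into a single object without ever downloading them, which is one plausible way to assemble a programme that quickly. The sketch below assumes hypothetical bucket and key names and is not the BBC’s actual code.

```python
import boto3

s3 = boto3.client("s3")

def assemble_programme(bucket, chunk_keys, programme_key):
    """Stitch stored chunks into a single programme file without downloading
    them, using an S3 multipart upload with server-side part copies.

    `bucket`, `chunk_keys` and `programme_key` are illustrative names.
    """
    upload = s3.create_multipart_upload(Bucket=bucket, Key=programme_key)
    parts = []
    for number, chunk_key in enumerate(chunk_keys, start=1):
        part = s3.upload_part_copy(
            Bucket=bucket,
            Key=programme_key,
            UploadId=upload["UploadId"],
            PartNumber=number,
            CopySource={"Bucket": bucket, "Key": chunk_key},
        )
        parts.append({"PartNumber": number, "ETag": part["CopyPartResult"]["ETag"]})
    s3.complete_multipart_upload(
        Bucket=bucket,
        Key=programme_key,
        UploadId=upload["UploadId"],
        MultipartUpload={"Parts": parts},
    )
```

S3 multipart parts must be at least 5MB (except the last), so 80MB chunks fit comfortably within that constraint, and the copy work stays inside S3 rather than passing through the assembling component.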

The Playout Data component integrates with the playout system so the project heads can see what is being broadcast. This means programmes played via iPlayer start and end at the right times, using time codes from the broadcasts.

The Transcoding module uses profiles so that the system can send the right streams to the right devices. While it uses Elemental Cloud to transcode streams in the cloud, the system can also use transcoders “on the ground”. Godwin says that transcoding in the cloud is particularly handy when BBC One splits into, in effect, 19 separate TV stations to broadcast local news.
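A per-device profile lookup and job dispatch along these lines might look like the sketch below. The profile values, queue URL and message shape are assumptions made for illustration, not the BBC’s or Elemental’s actual formats.

```python
import json
import boto3

sqs = boto3.client("sqs")

# Hypothetical profiles: the real set of codecs and bitrates is not public here.
PROFILES = {
    "smartphone": {"codec": "h264", "height": 480, "bitrate_kbps": 800},
    "tablet": {"codec": "h264", "height": 720, "bitrate_kbps": 1500},
    "tv": {"codec": "h264", "height": 1080, "bitrate_kbps": 5000},
}

def request_transcode(queue_url, programme_key, device_family):
    """Queue one transcode job for a given device family (illustrative message shape)."""
    message = {
        "source": programme_key,
        "device_family": device_family,
        "profile": PROFILES[device_family],
    }
    sqs.send_message(QueueUrl=queue_url, MessageBody=json.dumps(message))
```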

“This news hour is a perfect pattern for elastic architecture,” says Godwin, who adds that the old system would take eight to 10 hours to process the data. Now, that process takes around 20 minutes, meaning post-news programmes are generally available much more quickly.

“It is an ideal scenario for using burst capacity in the cloud,” he says.
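One common way to use burst capacity like this is to scale the pool of transcode workers with the depth of the job queue. The sketch below shows that pattern using an EC2 Auto Scaling group; the group name, queue URL and scaling ratio are hypothetical, and this is not the BBC’s actual scaling policy.

```python
import boto3

sqs = boto3.client("sqs")
autoscaling = boto3.client("autoscaling")

def scale_for_backlog(queue_url, group_name, jobs_per_worker=4, max_workers=100):
    """Scale transcode workers in line with the queue backlog (illustrative policy)."""
    attrs = sqs.get_queue_attributes(
        QueueUrl=queue_url,
        AttributeNames=["ApproximateNumberOfMessages"],
    )
    backlog = int(attrs["Attributes"]["ApproximateNumberOfMessages"])
    desired = min(max_workers, max(1, -(-backlog // jobs_per_worker)))  # ceiling division
    autoscaling.set_desired_capacity(
        AutoScalingGroupName=group_name,
        DesiredCapacity=desired,
    )
```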

In total there are about 20 components in Video Factory, and it is completely message-driven using SQS.

“Message-oriented architectures have been around more than ten years,” says Godwin. “And it’s very well understood.”
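In a message-driven design like this, each of the roughly 20 components can be reduced to the same loop: long-poll a queue, handle each message, and delete it only once the work has succeeded so that failures are retried. A minimal Python sketch of such a worker, with an assumed queue URL and handler, might look like this:

```python
import json
import boto3

sqs = boto3.client("sqs")

def run_worker(queue_url, handle):
    """Generic message-driven component: long-poll a queue, process each
    message, and delete it only after handling succeeds (illustrative loop)."""
    while True:
        response = sqs.receive_message(
            QueueUrl=queue_url,
            MaxNumberOfMessages=10,
            WaitTimeSeconds=20,  # long polling keeps request volume low
        )
        for message in response.get("Messages", []):
            handle(json.loads(message["Body"]))
            sqs.delete_message(
                QueueUrl=queue_url,
                ReceiptHandle=message["ReceiptHandle"],
            )
```

Because SQS redelivers messages whose visibility timeout expires, a worker that fails mid-job simply leaves the message to be picked up again, which is one way to avoid the kind of single points of failure Godwin describes.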

The project was broken down into chunks, and the Transcoding abstraction layer was deployed first. “Glastonbury was the first big test, with 170-odd hours of video made available,” Godwin says. Next came the Mezzanine systems, which acted as a safety net to capture anything broadcast by the BBC.

The new system means the BBC can put more content online and deliver live broadcasts more quickly. It also allows iPlayer Premieres: episodes of programmes that go online up to seven days before they are broadcast.

Glastonbury also produced far more content than the BBC could ever broadcast, so a lot of this video also went online for people to download and watch.

"We now offer iPlayer Exclusive, offering some Glastonbury content only online," says Godwin. "The new system gets out of the way and we can put as much video and content through as we’d like. It scales not only in technology terms but also in price."

Going forward, the BBC is mulling over the idea of integrating Video Factory with its simulcast chain. This would allow the broadcaster to stream BBC TV channels at the same time to both TVs and devices.

Rene Millman

Rene Millman is a freelance writer and broadcaster who covers cybersecurity, AI, IoT, and the cloud. He also works as a contributing analyst at GigaOm and has previously worked as an analyst for Gartner covering the infrastructure market. He has made numerous television appearances to give his views and expertise on technology trends and companies that affect and shape our lives. You can follow Rene Millman on Twitter.
