BBC: Making use of the cloud to transcode video and reduce costs
Broadcaster uses cloud to aid flexibility and boost Winter Olympics coverage
The BBC’s iPlayer service has become a firm favourite since it was launched in 2007. In that time, the service has had to evolve as demand for it has grown, and it is now the most popular video-on-demand service in the UK. Indeed, iPlayer served up 36.5 billion minutes of content, with help from the cloud, in 2012 alone.
One of the most notable changes over this time has been the increase in viewers using tablets and smartphones to consume media. Add to this special events such as Glastonbury and the Olympics, and the service soon began to run into problems on its traditional infrastructure.
The organisation was using a ground-based hardware system that was put together around five years ago, long before the rise in popularity of mobile devices, according to Stephen Godwin, senior technical architect at the BBC.
“That system was built at a time we weren’t [focused on] smartphones or tablets. We were constantly running into limitations of the system as we added these new things in,” he says.
He says that the old system would run out of disk storage and run out of I/O on the disk or on the network. “Those are really difficult things to upgrade and really difficult limits to remove,” Godwin adds.
The cloud gave the BBC an opportunity to scale up to meet demand, as well as to serve these new form factors effectively and efficiently. While scalability was a key factor, it was also important to have transparency around costs so that the organisation could anticipate exactly how much it would have to pay to serve greater volumes of content.
In late 2012, Godwin’s team embarked on a project, dubbed Video Factory, that turned to cloud computing for the scalability and reliability needed to handle spikes in content loading and demand without issue.
“The transcode infrastructure became the bottleneck for what we could make available online. We were making decisions about what we made available online based on the transcode system, which is the wrong way around,” Godwin says.
“The old system had single points of failure. We wanted to move to a model where we had the resiliency of the broadcast chain.”
With these goals in mind, the BBC kicked off the project with an 18-strong development team. The project used Amazon EC2 instances as well as SQS and S3, the latter of which is mainly used for media storage. Elemental Cloud was also deployed for video transcoding in the project, which took less than a year to complete. The old system was switched off in September 2013.
Video Factory comprises four major components: Mezzanine, Time-Addressable Media Store, Playout Data, and Transcoding.
The Mezzanine Capture component uses a broadcast-grade encoder to take in 24 channels. The output is passed through something called a “chunker” to create 80MB files. Content is stored at 10Mbps for standard-definition broadcasts and 30Mbps for HD.
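As a rough illustration of what those figures imply (this is simple arithmetic based on the numbers above, not anything from the BBC’s codebase), an 80MB chunk holds just over a minute of standard-definition video and around 20 seconds of HD:

```python
# Illustrative arithmetic only: how much video fits in one 80MB chunk
# at the bitrates quoted above (10Mbps for SD, 30Mbps for HD).
CHUNK_SIZE_BITS = 80 * 8 * 10**6  # 80MB expressed in bits

def chunk_duration_seconds(bitrate_mbps: float) -> float:
    """Seconds of video that one 80MB chunk can hold at a given bitrate."""
    return CHUNK_SIZE_BITS / (bitrate_mbps * 10**6)

print(f"SD at 10Mbps: ~{chunk_duration_seconds(10):.0f}s per chunk")  # ~64s
print(f"HD at 30Mbps: ~{chunk_duration_seconds(30):.0f}s per chunk")  # ~21s
```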
The Time-Addressable Media Store pulls these chunks together to create a file that covers an entire programme. This is all done within the S3 infrastructure. It takes less than a minute to assemble a full programme, according to Godwin.
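The article doesn’t describe how this assembly is implemented, but S3’s multipart upload API can stitch existing chunk objects into a single object entirely server-side, which gives a sense of how such a step could work. The bucket and key names below are invented for illustration:

```python
# Sketch: stitching chunk objects in S3 into one programme file using
# multipart upload with server-side copies, so no data leaves S3.
# Bucket and key names are hypothetical, not the BBC's.
# Note: 80MB chunks comfortably exceed S3's 5MB minimum part size.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-media-store"
chunk_keys = [f"chunks/prog-123/part-{i:04d}.ts" for i in range(60)]
target_key = "programmes/prog-123.ts"

upload = s3.create_multipart_upload(Bucket=BUCKET, Key=target_key)
parts = []
for number, key in enumerate(chunk_keys, start=1):
    result = s3.upload_part_copy(
        Bucket=BUCKET,
        Key=target_key,
        UploadId=upload["UploadId"],
        PartNumber=number,
        CopySource={"Bucket": BUCKET, "Key": key},
    )
    parts.append({"PartNumber": number,
                  "ETag": result["CopyPartResult"]["ETag"]})

s3.complete_multipart_upload(
    Bucket=BUCKET,
    Key=target_key,
    UploadId=upload["UploadId"],
    MultipartUpload={"Parts": parts},
)
```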
The Playout Data component integrates with the playout system so the project heads can see what is being broadcast. This means programmes played via iPlayer start and end at the right times, using time codes from broadcasts.
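A sketch of the underlying idea, using a purely hypothetical data model and the SD chunk duration worked out above: broadcast start and end times are mapped onto the indices of the captured chunks that cover a programme.

```python
# Sketch: using broadcast start/end times to pick which captured chunks
# cover a programme. The chunk duration assumes the SD figure above;
# the function and data model are illustrative, not the BBC's.
from datetime import datetime

CHUNK_SECONDS = 64  # roughly one 80MB chunk at 10Mbps

def chunks_for_programme(capture_start: datetime,
                         prog_start: datetime,
                         prog_end: datetime) -> range:
    """Return the indices of the capture chunks spanning the programme."""
    first = int((prog_start - capture_start).total_seconds() // CHUNK_SECONDS)
    last = int((prog_end - capture_start).total_seconds() // CHUNK_SECONDS)
    return range(first, last + 1)

capture_start = datetime(2013, 9, 1, 0, 0, 0)
needed = chunks_for_programme(capture_start,
                              datetime(2013, 9, 1, 18, 0, 0),
                              datetime(2013, 9, 1, 18, 30, 0))
print(f"Chunks {needed.start} to {needed.stop - 1} cover the programme")
```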
The Transcoding module uses profiles so that the system can send the right streams to the right devices. While it uses Elemental Cloud to transcode streams in the cloud, the system can also use transcoders “on the ground”. Godwin says cloud transcoding is particularly handy when BBC One splits into, in effect, 19 separate TV stations to broadcast local news.
“This news hour is a perfect pattern for elastic architecture,” says Godwin, who adds that it previously took eight to 10 hours to process the data. Now, that process takes around 20 minutes, meaning post-news programmes are generally available much more quickly.
“It is an ideal scenario for using burst capacity in the cloud,” he says.
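Returning to the profiles the Transcoding module relies on: the article doesn’t list the actual profiles, but the idea boils down to a lookup from device class to output settings, something along these lines (all values invented for illustration):

```python
# Illustrative only: the kind of profile table a transcode step might use
# to send the right stream to the right device. Values are invented,
# not the BBC's actual iPlayer profiles.
PROFILES = {
    "smartphone": {"resolution": "640x360",   "video_kbps": 800,  "audio_kbps": 96},
    "tablet":     {"resolution": "1280x720",  "video_kbps": 1500, "audio_kbps": 128},
    "tv_hd":      {"resolution": "1920x1080", "video_kbps": 5000, "audio_kbps": 192},
}

def outputs_for(devices):
    """Return the transcode settings needed to cover the requested devices."""
    return {d: PROFILES[d] for d in devices if d in PROFILES}

print(outputs_for(["smartphone", "tablet"]))
```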
In total, there are about 20 components in Video Factory, and the system is completely message-driven, using Amazon’s Simple Queue Service (SQS).
“Message-oriented architectures have been around for more than ten years,” says Godwin. “And it’s very well understood.”
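To make the pattern concrete, here is a minimal boto3 sketch of one component posting a transcode request to an SQS queue and another polling for it; the queue name and message fields are hypothetical:

```python
# Minimal sketch of the SQS-driven pattern: one component publishes a
# "transcode this programme" message, another polls for work.
# Queue name and message fields are hypothetical, not the BBC's.
import json
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.create_queue(QueueName="example-transcode-requests")["QueueUrl"]

# Producer: ask for a programme to be transcoded with a given profile.
sqs.send_message(
    QueueUrl=queue_url,
    MessageBody=json.dumps({"programme_id": "prog-123", "profile": "tablet"}),
)

# Consumer: poll the queue, process the job, then delete the message.
response = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1,
                               WaitTimeSeconds=10)
for message in response.get("Messages", []):
    job = json.loads(message["Body"])
    print(f"Transcoding {job['programme_id']} for {job['profile']}")
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```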
The project was broken down into chunks, and the Transcoding abstraction layer was deployed first. “Glastonbury was the first big test, with 170-odd hours of video made available,” Godwin says. Next came the Mezzanine systems, which acted as a safety net to capture anything broadcast by the BBC.
The benefits of the new system mean the BBC can now put more content online and deliver live broadcasts more quickly. The system also allows iPlayer Premieres: episodes of programmes that go online up to seven days before they are broadcast.
Glastonbury also produced far more content than the BBC could ever broadcast, so a lot of this video also went online for people to download and watch.
"We now offer iPlayer Exclusive, offering some Glastonbury content only online," says Godwin. "The new system gets out of the way and we can put as much video and content through as we’d like. It scales not only in technology terms but also in price."
Going forward, the BBC is mulling over the idea of integrating Video Factory with its simulcast chain. This would allow the broadcaster to stream BBC TV channels at the same time to both TVs and devices.
Rene Millman is a freelance writer and broadcaster who covers cybersecurity, AI, IoT, and the cloud. He also works as a contributing analyst at GigaOm and has previously worked as an analyst for Gartner covering the infrastructure market. He has made numerous television appearances to give his views and expertise on technology trends and companies that affect and shape our lives. You can follow Rene Millman on Twitter.