Anthem Changes Channel to Meet Data Storage Requirements
As Anthem Sports & Entertainment's video content grew, so did its data storage requirements — to the point where the broadcasting and production company was forced to make drastic changes. And now it is seeing a performance difference that has been like "night and day."
If you're a big sports or game show fan, chances are that you have encountered programming from Anthem Sports & Entertainment. The Toronto-based company owns 10 national, international and global specialty channels, including the Fight Network, IMPACT Wrestling, GameTV and HDNet Movies.
All of those channels create a lot of content — hundreds of terabytes' worth, according to Ryan Barnes, the company’s supervisor of technical implementations and maintenance. About 90% of Anthem’s data is video content, and the rest is graphic images and text documents.
As Anthem grew, its data storage requirements grew with it. The company picked up different storage technologies through the years, resulting in a disaggregated, confusing and sometimes unreliable mix that just wasn’t working well. It included an aging media server consisting of two mirrored RAID 6 arrays, JBOD media, external hard drives and even thumb drives. Anthem was also using LTO-6 tape for short-term storage, even though the format is designed for archival use.
At its breaking point, Anthem had more than 200 terabytes of data and was still running out of room. The media server alone had 160 usable terabytes, and it was always full. Another siloed unit had 30 terabytes of capacity, and it was nearly full. The company was using storage as fast as it could add more hard drives and optical media. It simply couldn’t stay on top of capacity needs.
Data access was also unacceptably slow. The company’s media server was limited to about 10 Gbps of throughput “if we were lucky,” Barnes said — not nearly enough to handle the volume of files its automation required at that point. “We couldn’t stream data directly from our storage, move files in or out of the storage at a reasonable pace, or transfer files from one to another efficiently,” he said. “When you have eight editors on a Friday night trying to submit 100G files for the weekend through a 1G connection each, it was chaos. Transfer times that should have taken a few minutes would take the better part of an hour.”
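Back-of-the-envelope arithmetic bears that out. Reading “100G” as a 100 GB file and “1G” as a 1 Gbps link, a single upload takes roughly 13 minutes even under ideal conditions, before protocol overhead or contention against the server’s 10 Gbps ceiling. A quick sketch of the math (illustrative figures, not Anthem’s measurements):

```python
# Rough transfer-time arithmetic for the situation Barnes describes.
# Illustrative figures only; real throughput depends on protocol
# overhead, disk speed and network contention.

FILE_SIZE_GB = 100      # one editor's weekend submission (assumed 100 GB)
LINK_SPEED_GBPS = 1     # per-editor connection (assumed 1 Gbps)
EDITORS = 8             # editors submitting at once
SERVER_CAP_GBPS = 10    # media server's total throughput ceiling

file_bits = FILE_SIZE_GB * 8                 # gigabits in one file
ideal_seconds = file_bits / LINK_SPEED_GBPS  # one file, no contention
print(f"Ideal single transfer: {ideal_seconds / 60:.1f} minutes")

# Eight 1 Gbps uploads fit under a 10 Gbps ceiling on paper, but
# protocol overhead and contention on the mirrored RAID 6 arrays
# stretched real-world times toward an hour.
print(f"Aggregate demand: {EDITORS * LINK_SPEED_GBPS} Gbps "
      f"against a {SERVER_CAP_GBPS} Gbps ceiling")
```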
Because there were so many different, unconnected data repositories, it also wasn’t unusual to end up with multiple copies of the same data. In addition to causing storage bloat, the copies weren’t always exactly the same.
“Because we had so many data silos, like USB hard drives on an employee’s desk or a USB stick stuck in a drawer, it was very hard to find the files you needed,” Barnes said. “We could never be 100% sure that we had the latest file to go to air with. … Even if you could find the file, you had no way of knowing whether it was actually the latest version of that file or an older version.”
Matters finally came to a head in 2019, when the vendor ended support for the media server. “We knew that eventually something would break and we wouldn’t be able to replace it,” he said.
Night and Day
After dealing with performance, scalability and reliability issues for years, Barnes wanted to go in a different direction to meet the company’s data storage requirements. Reliability was key, ruling out anything but name-brand storage arrays. Barnes was also committed to consolidating data silos, which required a flexible, scalable system. In addition, he wanted a system that was as open as possible so Anthem wouldn’t be locked into any one vendor’s ecosystem. It also had to meet the performance requirements Barnes knew were coming from an upcoming expansion and the consolidation of all of the company’s channels from multiple locations into its Toronto headquarters.
Eventually, Barnes settled on a seven-node Qumulo file data platform optimized for media and entertainment. The data platform also included an API to automate provisioning and application integration. Instead of six large data silos, Anthem is down to three and will soon be down to two. The only outlier will be a small amount of corporate data storage.
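The article doesn’t detail Anthem’s provisioning scripts, but the automation pattern is a standard one: a script calls the platform’s REST API instead of an administrator clicking through a console. Below is a minimal sketch of that pattern; the host name, endpoint path and payload fields are hypothetical placeholders, not Qumulo’s documented API.

```python
# Hypothetical sketch of share provisioning over a storage REST API.
# The host, endpoint and payload fields are illustrative placeholders,
# not Qumulo's documented interface.
import requests

STORAGE_HOST = "https://storage.example.internal"  # placeholder host
API_TOKEN = "replace-with-real-token"              # placeholder credential

def provision_share(path: str, quota_tb: int) -> dict:
    """Create storage for a new channel or project programmatically."""
    response = requests.post(
        f"{STORAGE_HOST}/api/v1/shares",  # hypothetical endpoint
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"path": path, "quota_tb": quota_tb},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

# e.g. provision_share("/media/gametv/2020-q3", quota_tb=20)
```

The point of scripting it is repeatability: every new channel or project gets storage with consistent paths and quotas, with no console clicks to forget.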
The performance difference has been like “night and day,” Barnes said. “When our automation system was tied to our legacy storage, we kept running into problems around file access, files not being transferred properly, file corruption and slow playback. Now, I have yet to have a single file, performance or transfer issue.” The new system has also freed up more than 500 person-hours annually that were previously spent on troubleshooting and patching.
Today, the platform stores 225 terabytes of raw data, consolidating content that was previously scattered across silos. Because of the COVID-19 pandemic, Anthem staff are currently working from home. Editors can pull a video file and, after working on it, submit it back to Anthem’s network operations center (NOC), where a file transfer program moves it to Qumulo storage. When Anthem’s automation system identifies that a new file has been placed in Qumulo’s temporary watch folder, an automated quality control process is triggered. When that check is complete, the file is moved to another segment within Qumulo for a quality check by a human. When files are ready to air, they are streamed to a playout server directly from Qumulo. After playback, files move into the section of Qumulo that stores HD files.
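Anthem’s automation code isn’t published, but the watch-folder pattern this workflow describes is simple to sketch. The folder paths, file extension and QC check below are assumptions for illustration, not Anthem’s actual system:

```python
# Minimal watch-folder sketch of the pattern described above:
# poll an ingest folder, run an automated QC pass, then move the
# file on for human review. Paths and the QC check are assumptions,
# not Anthem's actual automation system.
import shutil
import time
from pathlib import Path

WATCH_DIR = Path("/qumulo/ingest/watch")      # assumed temporary watch folder
REVIEW_DIR = Path("/qumulo/qc/human-review")  # assumed next-stage folder
FAILED_DIR = Path("/qumulo/qc/failed")        # assumed quarantine folder

def automated_qc(video: Path) -> bool:
    """Stand-in for the real QC pass (codec, duration, loudness checks)."""
    return video.stat().st_size > 0  # trivially: reject empty files

def run_forever(poll_seconds: int = 10) -> None:
    for d in (WATCH_DIR, REVIEW_DIR, FAILED_DIR):
        d.mkdir(parents=True, exist_ok=True)
    while True:
        for video in WATCH_DIR.glob("*.mxf"):  # assumed broadcast format
            target = REVIEW_DIR if automated_qc(video) else FAILED_DIR
            shutil.move(str(video), str(target / video.name))
        time.sleep(poll_seconds)

if __name__ == "__main__":
    run_forever()
```

A production version would also verify that a file has finished transferring before touching it, typically by waiting until its size stops growing between polls.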
Barnes’ team uses Qumulo’s real-time operational analytics to monitor storage trends within the company. If the system is trending toward using more space than expected, the team can catch it early, while it is still an easy fix. If an editor has trouble connecting to storage, Barnes can use the analytics function to determine whether the editor is actually connected or is having issues accessing a specific folder. “I can get right down to the file level they might be having issues with trying to access or do something with. Before, it used to be guessing,” he said.
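As a sketch of what that kind of monitoring looks like when scripted rather than viewed in a dashboard, the snippet below polls a capacity figure and flags growth beyond plan. The endpoint and response shape are hypothetical stand-ins, not Qumulo’s documented analytics API:

```python
# Hypothetical sketch of scripting a capacity-trend check against a
# storage analytics API. The endpoint and response shape are
# illustrative assumptions, not Qumulo's documented analytics API.
import requests

STORAGE_HOST = "https://storage.example.internal"  # placeholder host
API_TOKEN = "replace-with-real-token"              # placeholder credential

def capacity_used_tb() -> float:
    """Return the cluster's current used capacity in terabytes."""
    resp = requests.get(
        f"{STORAGE_HOST}/api/v1/analytics/capacity",  # hypothetical endpoint
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["used_tb"]

PLANNED_TB = 225     # current raw footprint cited in the article
ALERT_MARGIN = 1.10  # flag anything 10% over plan

if capacity_used_tb() > PLANNED_TB * ALERT_MARGIN:
    print("Capacity trending above plan; investigate before it fills.")
```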
Over the next few years, Barnes will be rebuilding Anthem’s network operations center from the ground up. When that happens, he will be able to move the company’s last siloed storage array to Qumulo, and he plans to double the company’s storage capacity at the same time.