How RBC Built Its Own GPU Farm for an Artificial Intelligence-Powered Banking Platform
Its vendors, Nvidia and Red Hat, expect lessons from the collaborative project will benefit the broader fintech space, as well as other industries.
August 10, 2020
The financial services sector's cautious nature has given it a reputation for being behind the curve on tech. While businesses in other industries take advantage of the latest cloud-native technologies, banks are often depicted as still dependent on old-fashioned monolithic apps running entirely in on-premises data centers.
That's not so much the case anymore, as the fintech space continues growing and big banks rush to catch up with the times. Some, most famously Capital One, have moved to the public cloud entirely, and many have been running their own private clouds while taking advantage of all the modern trappings, such as containers, microservices, and service meshes. Also growing in fintech is machine learning, a category of artificial intelligence technology.
Royal Bank of Canada, the country's largest financial institution, had been running a private OpenShift-based cloud on premises, in its own data center, for some time before launching Borealis AI, a fintech-focused AI research and development center, in 2016. This July, the company announced that it had worked with Red Hat, Nvidia, and others to create a new AI computing platform. The platform's machine learning infrastructure is powered by an Nvidia GPU farm, and all the hardware was designed in-house.
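RBC hasn't published the details of its cluster configuration, but on an OpenShift (Kubernetes) cluster, GPU-backed workloads are typically scheduled by requesting the nvidia.com/gpu extended resource exposed by Nvidia's device plugin. The sketch below uses the Kubernetes Python client and is purely illustrative; the pod name, container image, and namespace are hypothetical.

```python
# Minimal sketch: scheduling a GPU-backed training pod on an OpenShift/Kubernetes
# cluster with the Kubernetes Python client. Pod name, image, and namespace are
# hypothetical placeholders, not RBC's actual configuration.
from kubernetes import client, config


def launch_gpu_training_pod(namespace: str = "ml-research") -> None:
    # Use config.load_incluster_config() instead when running inside the cluster.
    config.load_kube_config()

    pod = client.V1Pod(
        metadata=client.V1ObjectMeta(name="train-job", labels={"app": "model-training"}),
        spec=client.V1PodSpec(
            restart_policy="Never",
            containers=[
                client.V1Container(
                    name="trainer",
                    image="registry.example.com/ml/trainer:latest",  # hypothetical image
                    command=["python", "train.py"],
                    resources=client.V1ResourceRequirements(
                        # The Nvidia device plugin exposes GPUs as the extended
                        # resource "nvidia.com/gpu"; request one per pod here.
                        limits={"nvidia.com/gpu": "1", "cpu": "4", "memory": "16Gi"},
                    ),
                )
            ],
        ),
    )
    client.CoreV1Api().create_namespaced_pod(namespace=namespace, body=pod)


if __name__ == "__main__":
    launch_gpu_training_pod()
```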
In an interview with Data Center Knowledge, Mike Tardif, RBC's senior VP of global technology infrastructure, said the company initially deployed GPUs for its machine learning efforts in off-the-shelf GPU servers but eventually realized that approach didn't offer enough flexibility and decided to start buying chips for AI computing directly from Nvidia.
"We felt leaving out the middle person a little bit, going directly there, made sense," he said. "Then we could start keeping up on where they were going with their chips, and what they're building for automation and software."
Foteini Agrafioti, RBC's chief science officer who heads Borealis AI, told us that the AI platform was designed to transform the customer banking experience and help keep pace with rapid technology changes and evolving customer expectations.
"What's amazing with this new technology is that we can now process things extremely fast," she said. "When we're analyzing client records on our personal-banking side, which is our largest business with millions and millions of client records, you can perform an analysis of a model within 20 minutes or an hour of an entire client base. It would take us weeks to do that using CPUs."
She said the applications they're developing are wide-ranging, but pointed to a natural language processing project that performs real-time analysis of text from news articles and blogs as they become available to derive relevant insights for the company's analysts and financial advisors. Not only must the software cut through the noise of the internet, she said, it must determine what would be useful to the likes of equity research analysts and advisors who help clients stay on top of their portfolios.
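The article doesn't name the models behind that project, but a minimal, hypothetical version of such a pipeline might score incoming headlines with an off-the-shelf sentiment model and surface only the items that touch a watchlist. The model choice, tickers, and threshold below are assumptions for illustration, not RBC's system.

```python
# Hypothetical sketch of real-time news analysis of the kind described above:
# score incoming headlines with an off-the-shelf sentiment model and flag strongly
# negative items that mention a tracked ticker. Model, watchlist, and threshold
# are illustrative assumptions.
from transformers import pipeline

# A general-purpose sentiment model; a production system would use a
# finance-tuned model and far richer relevance logic. device=0 runs on a GPU.
classifier = pipeline("sentiment-analysis", device=0)

TRACKED_TICKERS = {"RY", "TD", "BNS"}  # hypothetical watchlist


def flag_relevant(headlines):
    alerts = []
    for headline, result in zip(headlines, classifier(headlines)):
        mentioned = {t for t in TRACKED_TICKERS if t in headline}
        # Surface only strongly negative items that touch the watchlist.
        if mentioned and result["label"] == "NEGATIVE" and result["score"] > 0.9:
            alerts.append({"headline": headline, "tickers": mentioned, **result})
    return alerts


if __name__ == "__main__":
    print(flag_relevant([
        "RY posts record quarterly earnings",
        "Regulators open probe into TD trading desk",
    ]))
```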
The AI platform is harnessed for more traditional banking uses as well, such as fraud detection, data analytics, and day-to-day banking.
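As a rough illustration of the fraud-detection use case, and not of RBC's actual system, a GPU-accelerated gradient-boosted classifier can be trained on labeled transactions; the features, labels, and parameters below are invented for the example.

```python
# Rough illustration of a GPU-accelerated fraud-detection workflow, not RBC's
# actual system: a gradient-boosted classifier trained on labeled transactions.
# Features, labels, and parameters are invented for the example.
import numpy as np
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(100_000, 20))             # stand-in transaction features
y = (rng.random(100_000) < 0.01).astype(int)   # ~1% of transactions labeled fraud

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = XGBClassifier(
    n_estimators=300,
    max_depth=6,
    tree_method="gpu_hist",   # train on an Nvidia GPU; use "hist" for CPU-only runs
    scale_pos_weight=99,      # offset the heavy class imbalance
)
model.fit(X_train, y_train)
print("fraud probability of first test row:", model.predict_proba(X_test)[0, 1])
```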
While the value of this new GPU farm and the fintech AI platform it powers is enormous to RBC, partners expect the project will create even more value for the enterprise tech community at large.
"I think it's been a win, win, win," Tushar Katarki, senior manager of OpenShift product management at Red Hat, told DCK. "I mean, Red Hat has certainly benefited."
Especially important for Red Hat, he said, was getting a look at fintech-specific issues and challenges, like security and compliance concerns, that needed to be addressed.
"Because guess what? If it is applicable for RBC, it's probably applicable for other banks too, because they're all regulated in similar ways by the various governments," he said.
He added that the collaboration required between Red Hat and Nvidia was also beneficial to both companies and their customers, and said that the two companies learned from each other in the process.
"When we started thinking about AI as a workload, Red Hat didn't have a whole lot of exposure to AI and machine learning, and even HPC for that matter, whereas Nvidia had been doing it for many, many more years," he said. "So we learned that business and that technology, whereas they learned containers, CI/CD, Kubernetes, cloud, and everything from us, as they didn't have that kind of background. There was a very symbiotic relationship that continues to thrive even today and going into the future."
Katarki pointed out that the expertise in AI and ML that Red Hat and Nvidia gained from this project will benefit industries outside fintech as well, starting with insurance, but eventually also retail, government, manufacturing, and others.
"Now, we have so many dozens and dozens of customers using OpenShift as an open AI platform with GPUs," he said. "It's just great to see that this work has paid off and continues to do so. This is really early-stages for everybody anyway, so we have a long way to go, but it's very encouraging."