Intel Outlines Tech for Mobile, HPC, Big Data
September 20, 2012
The 2012 Intel Developer Forum (IDF) last week delivered an immense amount of information, infused with a healthy dose of geekiness and spanning a broad spectrum of research, markets and products. Here are a couple of highlights:
Wireless Opportunities
Intel CTO Justin Rattner delivered the keynote on the last day of the event and highlighted several technology developments. Eliminating wires on laptops is a focus for the company, through WiGig (Wireless Gigabit Alliance) and Rosepoint, an experimental 32nm SoC (System on Chip) that puts a WiFi transceiver and two Intel Atom cores on the same die. Citing Cisco data showing mobile data and consumer Internet traffic increasing exponentially over the next several years, Rattner discussed content-aware video adaptation and an industry/university research collaboration for Video Aware Wireless Networks (VAWN).
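Rattner did not detail the adaptation logic, but the core idea of content-aware video adaptation is that the delivered bitrate should track both scene complexity and current link conditions. Here is a minimal sketch of that idea; the function name, thresholds and safety margin are all illustrative assumptions, not Intel's design:

```python
# Hypothetical sketch of content-aware video adaptation: pick a bitrate based
# on how visually complex a scene is and how much the wireless link can carry.
# All numbers and names here are illustrative, not from Intel or VAWN.

def choose_bitrate_kbps(scene_complexity: float, link_capacity_kbps: float) -> float:
    """scene_complexity in [0, 1]: 0 = static talking head, 1 = fast action."""
    # Simple scenes tolerate aggressive compression; complex scenes need headroom.
    demand = 300 + scene_complexity * 2700   # 300 kbps floor, 3000 kbps ceiling
    # Never request more than the network can currently deliver.
    return min(demand, 0.8 * link_capacity_kbps)  # keep a 20% safety margin

print(choose_bitrate_kbps(0.2, 4000))  # low-motion clip on a healthy link
print(choose_bitrate_kbps(0.9, 1500))  # action scene on a congested link
```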
To support that mobile demand, Rattner discussed building the next-generation wireless infrastructure. To address the challenges of traditional radio access networks and of running IT infrastructure at cell tower sites, Intel described its research collaboration with China Mobile on a Cloud Radio Access Network (C-RAN). Instead of housing IT infrastructure at each cell tower site, a centralized cloud data center provides a resource pool for base stations that consolidates processing and can support many cell towers with software-defined radio. The benefits to China Mobile are lower CAPEX and OPEX, faster system rollout, and significantly lower energy consumption compared to a traditional RAN. China Mobile's Chief Scientist Dr. Chih-Lin I said the company consumed 30 billion kilowatt-hours last year, 70 percent of which went to base stations.
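Those figures put the scale of the opportunity in perspective. A quick back-of-the-envelope calculation on the reported numbers (the C-RAN savings rate below is purely illustrative, not a China Mobile figure):

```python
# Back-of-the-envelope on China Mobile's reported numbers: 30 billion kWh/year
# total, 70% of it at base stations. The assumed savings rate is hypothetical.

total_kwh = 30e9                      # reported annual consumption
base_station_kwh = 0.70 * total_kwh   # share attributed to base stations

assumed_cran_savings = 0.50           # hypothetical: C-RAN halves base-station energy
saved_kwh = base_station_kwh * assumed_cran_savings

print(f"Base station energy: {base_station_kwh / 1e9:.1f} billion kWh/year")
print(f"Hypothetical C-RAN savings: {saved_kwh / 1e9:.1f} billion kWh/year")
```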
Big Data Meets HPC
Big data is certainly a big topic these days, and when it is combined with high performance computing power, the possibilities are exciting. Intel's Director of Technical Computing, John Hengeveld, hosted the IDF session Big Data Meets High Performance Computing.
Intel defines big data as a class of insight opportunities brought about by increased capabilities in data and capitalized on by improved computational power. The shift in data occurs in volume (gigabytes to terabytes), velocity (occasional batch processing to complex event processing) and variety (centralized, structured data to distributed, unstructured data). HPC is also different today: where it was once a silo, it is now interconnected, agile and able to source data from many places.
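The velocity shift is the easiest of the three to picture in code: instead of aggregating a log after the fact in one batch job, a complex-event-processing pipeline reacts to each record as it arrives. A minimal sketch of the contrast, with an invented event stream and alert rule:

```python
# Contrast between batch aggregation and complex event processing.
# The event stream and the rising-trend alert rule are invented for illustration.

from collections import deque

events = [("sensor-1", 71), ("sensor-1", 74), ("sensor-1", 80), ("sensor-2", 65)]

# Batch style: process everything after the fact.
batch_avg = sum(value for _, value in events) / len(events)
print(f"batch average: {batch_avg:.1f}")

# Complex-event-processing style: react to each event as it arrives,
# firing when a sensor's last three readings trend upward.
windows: dict[str, deque] = {}
for sensor, value in events:
    w = windows.setdefault(sensor, deque(maxlen=3))
    w.append(value)
    if len(w) == 3 and w[0] < w[1] < w[2]:
        print(f"alert: {sensor} rising trend {list(w)}")
```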
One of the exciting forays into combining big data and HPC is UC Berkeley's AMPLab, whose vision is to integrate algorithms, machines and people to make sense of big data. Backed by founding sponsors Amazon, Google and SAP, the group has already established a big data architecture framework and released several applications to open source. AMPLab also benefits from real-life data center workloads, analyzing the activity logs of front-line production systems of up to thousands of nodes servicing hundreds of petabytes of data. Partner companies that have contributed this big data for AMPLab to crunch include Google, Microsoft, Facebook, NetApp, Cloudera and Twitter.
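AMPLab's best-known open-source release, Spark, is representative of the kind of log analysis the group runs on those contributed workloads. A minimal sketch using Spark's Python API (which arrived in releases later than the period covered here); the log format and file path are made up for illustration:

```python
# Sketch of a Spark job counting error events per service in activity logs.
# The log format ("timestamp service level message") and the HDFS path are
# hypothetical; Spark's Python API postdates the event described above.

from pyspark import SparkContext

sc = SparkContext("local[*]", "log-error-counts")

lines = sc.textFile("hdfs:///logs/frontline/*.log")  # hypothetical path

error_counts = (
    lines.map(lambda line: line.split(" ", 3))        # -> [ts, service, level, msg]
         .filter(lambda f: len(f) == 4 and f[2] == "ERROR")
         .map(lambda f: (f[1], 1))                    # key by service name
         .reduceByKey(lambda a, b: a + b)
)

for service, count in error_counts.collect():
    print(service, count)

sc.stop()
```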
HPC moves big data from a batch orientation to real-time, mission-critical work, helping deliver real-time answers to real-time data. It is also assisted by intelligence built into storage and by Intel True Scale Fabric, which optimizes the interconnect for big data.