Optimizing Unused Data Center Capacity

Open source lets data center operators buy and sell compute resources.

Soni Brown, Contributor, Data Center Knowledge

February 14, 2023


With demand for the public cloud increasing, one company is banking on its open-source, decentralized “super” cloud to provide more server space for data center operators.

Akash Network’s cloud computing marketplace allows data centers, and anyone with a computer, to become a cloud provider by offering unused compute cycles. Data center leaders are contending with shortages of critical chip materials, which are expected to drive cloud compute costs up by as much as 30 percent this year. A decentralized marketplace for trading excess compute offers a hedge against those rising costs and against shocks to the GPU supply chain.

For operators looking to accelerate scale, efficiency, hyper-convergence, and price performance, trading cloud-grade compute can help reduce capital and operational expenditures. The marketplace is also a right-sizing strategy and part of a wider trend to integrate “on-premises, co-location, cloud, and the edge,” according to a Gartner report.

Data center operators who’ve made steep investments in private clouds may want to look at other approaches before migrating to a decentralized system. 

“Although the cloud revolutionized the way computing resources are provisioned and consumed, it is not a panacea for all IT issues,” writes Sabo Tyler Diab, co-founder of Krystal Eight. “The public cloud is not a monolithic creature. Many organizations distribute their compute loads between cloud vendors – a model known as multi-cloud – in order to avoid ‘cloud lock-in’ and to take advantage of best-of-breed services.”


Akash Network CEO Greg Osuri says running open-source software developed by a community creates a competitive marketplace that lowers costs and increases compute availability.

“If you are a data center, more often than not, your space is underutilized,” Osuri tells Data Center Knowledge. “Intuit has TurboTax and once a year they're super busy during the tax season. You're essentially burning cash at that point because there's a three-year shelf-life for a lot of servers.”

Open Source Lowers Data Center Operating Costs

Cloud-native, containerized applications that run on a centralized cloud can run faster and at lower cost on the decentralized cloud because its software is open source. This benefits both developers and users of decentralized applications: for developers, it lowers the barriers to market entry and makes it easier to launch new applications; for users, it provides greater choice and flexibility in how they use applications. Having a community develop nodes encourages competition while shifting computing resources away from the three large cloud companies – Amazon, Google, and Microsoft.


“It’s 75 to 85 percent cheaper than the Amazons of the world,” said Osuri. The marketplace gives data center operators control over pricing and amenities through reverse auctions: users, or tenants as Akash calls them, state their application’s needs and maximum budget, and providers bid on the order, with the lowest bid winning.

Because the tenant sets what they pay, Osuri believes the model becomes a key element for innovation. “It's like pricing where you can only price for the resources that you want, unlike your traditional cloud where you have to buy an entire instance,” he says.
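
In outline, the matching works like a sealed-bid reverse auction. The sketch below is illustrative only, not Akash’s actual implementation; the class names, fields, and prices are hypothetical, and real orders on the network are settled on-chain.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Order:
    """A tenant's request: only the resources the workload needs, plus a budget cap."""
    cpu_units: float   # fractional CPUs, not a whole instance
    memory_mb: int
    storage_mb: int
    max_price: float   # the tenant's maximum budget for the lease

@dataclass
class Bid:
    provider: str
    price: float       # what the provider will charge to host the workload

def match_order(order: Order, bids: list[Bid]) -> Optional[Bid]:
    """Reverse auction: among bids at or under the tenant's budget,
    the lowest price wins. Returns None if no bid comes in under budget."""
    eligible = [b for b in bids if b.price <= order.max_price]
    return min(eligible, key=lambda b: b.price) if eligible else None

# A small containerized service and three competing providers.
order = Order(cpu_units=0.5, memory_mb=512, storage_mb=1024, max_price=10.0)
bids = [Bid("provider-a", 9.5), Bid("provider-b", 7.2), Bid("provider-c", 12.0)]
print(match_order(order, bids))  # Bid(provider='provider-b', price=7.2)
```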

Adding to the cost concern is simple economics: more supply than demand. Osuri says there were 8.4 million data centers in 2015, compared with the roughly 7.1 million that exist today. “A big reason for that is (investors and operators) need to justify the cost for existence.” Osuri added that linking idle capacity from data centers “lets you have your data center and recoup some of the investments made in your data center.”

The Hyper-Converged Approach

The network’s decentralized, permissionless nature is attractive to protocols that need hyper-convergence. Though Akash is geared toward cloud-native applications, where deployments are packaged as containers, the public cloud infrastructure extends backup optionality beyond disk-based solutions. The supercloud model also provides a homogeneous, composable layer over a heterogeneous resource set, abstracting critical functionality to deliver production-grade applications.
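
As a concrete illustration of that container-first model, below is a minimal deployment manifest in Akash’s SDL (Stack Definition Language), patterned on the project’s published examples; the image, resource sizes, and price ceiling here are placeholder values, and exact fields can vary by network version.

```yaml
---
version: "2.0"

services:
  web:
    image: nginx              # any public container image
    expose:
      - port: 80
        as: 80
        to:
          - global: true      # reachable from outside the provider

profiles:
  compute:
    web:
      resources:              # request only what the workload needs
        cpu:
          units: 0.5
        memory:
          size: 512Mi
        storage:
          size: 512Mi
  placement:
    anywhere:
      pricing:
        web:
          denom: uakt         # bids denominated in Akash's token
          amount: 1000        # the tenant's price ceiling for the reverse auction

deployment:
  web:
    anywhere:
      profile: web
      count: 1
```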

The way forward is a decentralized layer that gives virtual desktops and new application development environments networking and storage resources drawn from the physical hardware. Colos like Equinix Metal have been early supporters of Akash Network, alongside Polygon and Solana.

About the Author

Soni Brown

Contributor, Data Center Knowledge

Soni Brown is a freelance content writer for Data Center Knowledge. In her writing and reporting roles, she creates compelling content for stakeholders in the data center industry. She can be found at @neonscrawl.
