Software-Defined Storage: What Does It Really Mean?

Looking to control your storage infrastructure? Want to expand your cloud and data center platform? Find out how software-defined storage can help.

Bill Kleyman

March 5, 2014

5 Min Read

We’ve covered the concept of software-defined technologies (SDx) and have shown how this very real technology can help your data center truly expand. If you haven’t seen our recent SDx Guide, make sure to take a look because there are powerful solutions that directly impact how your organization controls and distributes data.

With that in mind, software-defined storage has begun to make an interesting impact in the data center and cloud world. Already, SDx helps bring data centers closer together, but what can it really do for the storage component?

Let’s take a look at the big picture for a second. The model of the traditional data center, dating back a few years, heavily revolved around the physical infrastructure. We didn’t have virtualization or the concept of the cloud as we know it today. With that, we began to have hardware sprawl issues around servers, racks, and other equipment. Virtualization helped sort that out.

Still, resource demands continued to grow, and that pressure extended to other pieces of the data center – specifically storage. Just how many more disks could you buy? How many more physical controllers would you really need to handle an influx of cloud, virtualization and users? At some point, a logical layer would have to be introduced to help the storage component operate better.

And so software-defined storage began to emerge. The idea here isn’t to take away from the storage controller; rather, it’s to direct data traffic much more efficiently at the virtual layer. The power really kicks in because software-defined storage creates a much more agnostic platform to work with. So what does the technology look like?

[Figure: Software-defined storage architecture]

Got the visual? Now let’s break it down.

Logical Storage Abstraction: Basically, you’re placing a powerful virtual layer between data requests and the physical storage components. This layer allows you to manipulate how and where data is distributed. The great part here is that you’re able to keep a heterogeneous storage infrastructure while still controlling the entire process from a virtual instance. You can present as many storage repositories as you like to the software-defined storage layer and allow that instance to control data flow.
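
To make the idea concrete, here is a minimal sketch of that abstraction in Python. The names (VirtualStorageLayer, StorageBackend, the tier labels) are hypothetical, not any vendor’s API; the point is simply that applications talk to one logical layer, which decides which repository actually holds the data.

```python
class StorageBackend:
    """One physical repository presented to the software-defined layer."""
    def __init__(self, name, tier):
        self.name = name      # e.g. "flash-array-01" or "sata-shelf-02"
        self.tier = tier      # "performance" or "capacity"
        self.objects = {}

    def write(self, key, data):
        self.objects[key] = data

    def read(self, key):
        return self.objects[key]


class VirtualStorageLayer:
    """Logical abstraction: applications never address a backend directly."""
    def __init__(self):
        self.backends = []
        self.placement = {}   # key -> backend holding that object

    def register(self, backend):
        # Present another repository to the software-defined layer.
        self.backends.append(backend)

    def write(self, key, data, tier="capacity"):
        # The placement decision happens here, at the virtual layer.
        target = next(b for b in self.backends if b.tier == tier)
        target.write(key, data)
        self.placement[key] = target

    def read(self, key):
        return self.placement[key].read(key)


layer = VirtualStorageLayer()
layer.register(StorageBackend("flash-array-01", "performance"))
layer.register(StorageBackend("sata-shelf-02", "capacity"))
layer.write("vm-disk-7", b"...", tier="performance")
print(layer.read("vm-disk-7"))
```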

Intelligent Storage Optimization: Just because you have a logical storage control layer doesn’t mean you can’t still utilize the efficiencies of your existing storage. The software-defined storage component helps you push information to a specific type of repository. You’re able to control performance and capacity pools and deliver information to the appropriate storage type. However, your actual controllers can still help with thin provisioning, deduplication, and more. The power is in the flexibility of this solution. You can present an entire array to the software-defined layer, or just a shelf or two.
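
As an illustration of what such a placement policy might look like, here is a hedged sketch: a hypothetical choose_pool function that steers a request toward a performance or capacity pool based on a few workload hints, while leaving array-side features such as deduplication alone. The thresholds and field names are made up for the example.

```python
def choose_pool(io_profile, pools):
    """Pick a storage pool for a request based on simple workload hints."""
    wants_performance = (
        io_profile.get("expected_iops", 0) > 5000
        or io_profile.get("latency_sensitive", False)
    )
    wanted_tier = "performance" if wants_performance else "capacity"

    candidates = [p for p in pools if p["tier"] == wanted_tier]
    # If the preferred tier has no pools, fall back to whatever is available.
    candidates = candidates or pools
    # Within the chosen tier, prefer the pool with the most free capacity.
    return max(candidates, key=lambda p: p["free_gb"])


pools = [
    {"name": "ssd-pool",  "tier": "performance", "free_gb": 800},
    {"name": "sata-pool", "tier": "capacity",    "free_gb": 12000},
]
print(choose_pool({"expected_iops": 20000}, pools)["name"])  # ssd-pool
print(choose_pool({"expected_iops": 200}, pools)["name"])    # sata-pool
```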

Creating a more powerful storage platform: This hybrid storage model allows you to leverage the power of your physical infrastructure as well as your virtual one. You’re able to create one logical control layer that helps you manage all of the physical storage points in your data center. This helps with storage diversification and helps prevent vendor lock-in. Logical storage abstraction also simplifies migrating data between storage arrays and between various underlying resources.
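
Building on the earlier VirtualStorageLayer sketch (again, hypothetical names rather than a real product API), a migration between arrays becomes a logical-layer operation: copy the object, repoint the placement map, and the application keeps addressing the same key.

```python
def migrate(layer, key, destination):
    """Move an object to another repository without the caller noticing."""
    source = layer.placement[key]
    data = source.read(key)
    destination.write(key, data)
    layer.placement[key] = destination   # cut the logical mapping over
    del source.objects[key]              # reclaim space on the old array
```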

Regain control of your storage infrastructure: Storage sprawl was actually becoming a bit of an issue. Just how many more shelves or disks will you need to help support your next-generation data center? The answer doesn’t always revolve around the physical. Software-defined storage allows organizations to better manage storage arrays, disks and repositories. This means understanding where IO-intensive data should reside and how to best optimize information delivery. Not only are you creating a more powerful data control infrastructure, you’re also able to save on purchasing additional disks and/or storage components.

Begin to expand your cloud (and storage): A big part of software-defined storage is the capability to expand. In that sense, expansion doesn’t just happen within the data center – these technologies can help you scale your storage into the cloud and beyond. Software-defined storage can create powerful links to other distributed data centers for replication, DR and even storage load-balancing. The really great piece here is that, ultimately, storage replication happens at the virtual layer between heterogeneous storage components. The translation happens completely at the logical layer.
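
Here is one more hedged sketch of that idea, reusing the layer object from the first example. The CloudTarget class and its put_object call are stand-ins for whatever remote repository you replicate to; the point is that the local array and the remote target never need to speak the same protocol, because the logical layer does the translation.

```python
class CloudTarget:
    """Stand-in for a remote object store used for DR or load balancing."""
    def __init__(self, region):
        self.region = region
        self.blobs = {}

    def put_object(self, bucket, key, body):
        self.blobs[(bucket, key)] = body


def replicated_write(layer, cloud, key, data):
    # The primary copy lands on premises through the virtual layer...
    layer.write(key, data, tier="performance")
    # ...and the same payload is translated into the remote target's
    # interface at the logical layer; no array-to-array compatibility needed.
    cloud.put_object("dr-bucket", key, data)
```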

At this point, we’re beginning to leverage the power of a storage controller and coupling it with the logical storage control layer. This means that organizations can look at what resources they have now, pool them, and distribute them on a truly global scale. With the amount of data being created every year, there had to be a better way to control the flow of this critical information. There is a lot more valuable data being passed through the cloud. Mobility, cloud computing and IT consumerization have all changed how we process and compute on a daily basis. Consider this: the latest Cisco Visual Networking Index report indicates:

  • Monthly global mobile data traffic will surpass 15 exabytes by 2018.

  • The number of mobile-connected devices will exceed the world’s population by 2014.

  • Driven by increased usage, smartphones will account for 66 percent of mobile data traffic by 2018.

  • Mobile tablet traffic will surpass 2.5 exabytes per month by 2018.

Instead of buying more disks or getting locked down by your storage infrastructure, why not abstract the whole thing and begin to utilize what you have much more efficiently? Just like any other physical component in your infrastructure, the virtual layer can really help optimize utilization and resource consumption. Server virtualization helped with server sprawl, application virtualization helped with new types of virtual application delivery methodologies, and now virtual storage technologies are helping control your heterogeneous storage infrastructure.

The future cloud and data center models clearly indicate growth in the amount of data that will pass through your infrastructure. With that in mind, take a look at how software-defined technologies can help you optimize your existing resources to better deliver next-generation content.

About the Author

Bill Kleyman

Bill Kleyman has more than 15 years of experience in enterprise technology. He also enjoys writing, blogging, and educating colleagues about tech. His published and referenced work can be found on Data Center Knowledge, AFCOM, ITPro Today, InformationWeek, NetworkComputing, TechTarget, DarkReading, Forbes, CBS Interactive, Slashdot, and more.

