Hyper-Converged and Software-Defined Storage: The End of the Focus on Hardware?

Friday, October 7, 2016 by Kathrin Brekle

Modern information technology in business is subject to constant change. Not only the growing flood of data, but also the increasing demand for fast access to information and instant communication, drives new technical innovations to cope with these large amounts of data. In this article we give you an overview of key trends such as Software-Defined Storage (SDS) and hyper-converged storage, as well as the challenges companies face in everyday storage management.

The end of the focus on hardware?

Not too long ago, data storage solutions seemed to focus solely on hardware. Arguments centered on performance and capacity, and often ended in the acquisition of faster, better - or simply more - hardware. The world’s volume of data is growing exponentially and is predicted to reach up to 50,000 exabytes (1 exabyte = 1 million terabytes) by 2020. This data is generated by billions of new, permanently connected devices, and all of it must be stored and managed properly. The hardware-focused approach led to strong vendor lock-in and large storage silos. In addition, storage management staff require special training on hardware-centric systems, and with the availability of virtualization and the cloud this approach is no longer the only suitable option.

What exactly is the concept of Software-Defined Storage?

Storing data on a medium is, in itself, a software function. The hardware used for data storage is actually secondary and depends mostly on the choice of media, servers, operating systems and manufacturers. The SDS approach of treating the software separately from the hardware functionality is therefore a natural, evolutionary development of conventional storage architecture.

The main advantage of SDS is that it simplifies the deployment and use of storage resources. Concerns about physical LUNs, ports or port addresses could become a thing of the past. In a virtual storage infrastructure, the complexity behind provisioning a storage volume of a specific size is hidden from the consumer.
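The contrast can be sketched in a few lines of code. The following Python snippet is purely illustrative and all class and backend names are invented for this example: the consumer asks a software control layer for a volume of a given size, while LUNs, port mappings and vendor specifics stay hidden inside the backend implementations.

```python
# Illustrative sketch only: a hypothetical SDS-style control layer that hides
# LUNs, ports and vendor specifics behind a single "give me N gigabytes" call.
from abc import ABC, abstractmethod
from dataclasses import dataclass
import uuid


@dataclass
class Volume:
    volume_id: str   # logical handle the consumer keeps
    size_gb: int
    backend: str     # which physical pool actually holds the data


class StorageBackend(ABC):
    """One physical pool (array, commodity servers, cloud) behind the abstraction."""

    @abstractmethod
    def allocate(self, size_gb: int) -> str:
        """Carve out capacity and return a backend-specific reference."""


class VendorArrayBackend(StorageBackend):
    def allocate(self, size_gb: int) -> str:
        # In reality this would create a LUN and map ports; the consumer never sees it.
        return f"lun-{uuid.uuid4().hex[:8]}"


class CommodityServerBackend(StorageBackend):
    def allocate(self, size_gb: int) -> str:
        return f"brick-{uuid.uuid4().hex[:8]}"


class SDSController:
    """Software control plane: picks a backend and hands back a logical volume."""

    def __init__(self, backends: dict[str, StorageBackend]):
        self.backends = backends

    def provision(self, size_gb: int) -> Volume:
        # A real controller would apply a placement policy (tier, capacity, cost);
        # this sketch simply takes the first registered backend.
        name, backend = next(iter(self.backends.items()))
        backend.allocate(size_gb)
        return Volume(volume_id=str(uuid.uuid4()), size_gb=size_gb, backend=name)


controller = SDSController({
    "array-a": VendorArrayBackend(),
    "commodity-pool": CommodityServerBackend(),
})
vol = controller.provision(size_gb=100)
print(vol.volume_id, vol.backend)   # the consumer only ever handles the logical ID
```

Because the backends are interchangeable behind the same interface, the sketch also hints at why a software-defined layer reduces dependence on any single hardware vendor.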

Companies nowadays have a "do-more-with-less" mentality, which is why SDS is erroneously seen as a solution that requires only a small amount of expert or administrator knowledge while increasing performance and saving money. Virtual server administrators often know less about hardware technologies, yet they are responsible for ensuring that the right storage resources are available to applications and data. SDS can thus give the impression that specialist skills for providing storage resources are no longer necessary.

Outsourcing?

However, this is a dangerous assumption! Outsourcing admin expertise increases dependence on hardware vendors, especially when special configurations are required, problems occur or components fail - and the company's own employees are not sufficiently trained. Responsibility for the physical infrastructure should not lie entirely in the hands of external providers. Companies often make the mistake of updating their infrastructure while at the same time limiting their own employees' ability to manage the storage they built themselves. SDS does not solve this fundamental problem; it simply offers a cleaner and friendlier user interface.

SDS Infrastructure: What's it all about?

An important argument for the use of SDS, however, is the improved agility of storage resources. When virtualized workloads and their data are moved from one server host to another, the links to the back-end storage are re-established automatically, so manual re-hosting is no longer required. The ultimate goal of Software-Defined Storage is thus the separation of the storage control level from the hardware level, so that resources can be provided easily to users and applications.
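A minimal sketch of that separation, again with invented names and assuming the logical volume handle from the previous example: the control plane only tracks which host a volume is attached to, so moving a workload just means re-pointing that mapping rather than re-hosting the storage by hand.

```python
# Illustrative sketch only (hypothetical names): the control plane re-links a
# logical volume to a new host; the data stays wherever the backend placed it.
class AttachmentManager:
    def __init__(self) -> None:
        self._attachments: dict[str, str] = {}   # volume_id -> current host

    def attach(self, volume_id: str, host: str) -> None:
        self._attachments[volume_id] = host

    def migrate_workload(self, volume_id: str, new_host: str) -> None:
        # Only the control-level mapping changes; no manual re-hosting of data.
        old_host = self._attachments.get(volume_id)
        self._attachments[volume_id] = new_host
        print(f"{volume_id}: {old_host} -> {new_host}")


mgr = AttachmentManager()
mgr.attach("vol-42", host="esx-host-01")
mgr.migrate_workload("vol-42", new_host="esx-host-02")
```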

To realize the full benefits of SDS, it is advisable to acquire technology that is truly independent of both the hardware and the hypervisors (Virtual Machine Managers, VMMs), in order to avoid costly vendor lock-in.

Whatever storage solution you or your company decide to implement, careful advance planning and skilled experts are essential. The consequences of neglecting this become particularly evident in the case of data loss caused by the failure of individual components or by faulty operation. In such cases, not only is the damage often made much worse by the wrong actions, but a later data recovery can also become far more complex – even for the data recovery experts at Ontrack.