Looking at the Hidden Costs of Increasing Data Storage
Large-scale IT environments have the resources to manage all aspects of a network expansion, including the initial analysis, equipment installation and wiring, and proper user access management. In smaller environments, the planning may not go beyond the immediate reaction to the users’ needs—that is, “we’re out of space!” While the size of the environment may determine how storage needs are addressed and managed, concerns such as proper equipment cooling, storage management software that allows for scalable growth (SRM), disaster recovery (including backup contingencies), and data recovery apply to IT environments of every size.
In one scenario, picture a small business with five desktop machines. Despite following careful data compression procedures and rigorous archiving of old files, their system is running out of space. They have a small file server sitting near the users’ desks. Can the business owner upgrade the file server with a bigger hard drive or should he add a separate rack of inexpensive drives? How much space will they need? Will a terabyte be enough? What if they need to upgrade in the future? How hard will it be? What other hidden costs are they going to run into?
In another scenario, a business that uses 30-40 desktop machines has a file server located in a separate room with adequate cooling, user access management, and a solid network infrastructure. But they too are running out of space. When they plan for an expansion, what hidden costs will they need to consider?
In addition to equipment investment, there are many hidden costs to consider when determining storage needs and their subsequent management. Following are some of the hidden costs most commonly encountered with storage expansion.
How can you get the most out of existing storage space and keep it from filling up so quickly? In conjunction, how do you prevent your storage space from running out before its full life expectancy is realized? This is where storage management software, such as SRM and ILM, enters the picture. Storage Resource Management (SRM) software provides storage administrators with the tools to manage space effectively. Information Lifecycle Management (ILM) software helps manage data throughout its lifecycle.
While viable solutions, SRM and ILM software may not cover all the needs of a business environment. SRM and ILM software are designed to manage files and storage effectively, and with a level of automation. Beyond this is where good old-fashioned space management is required. Remember the days when space was at a premium and there were all sorts of methods to make sure that inactive files were stored somewhere else—like on floppies? Remember when file compression utilities came out and we were squeezing every duplicate byte out of files? Those techniques are not outdated just because the cost per MB has dropped or tools exist to help us manage data storage. Prudent storage practices never go out of style.
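As a concrete illustration of that old-fashioned space management, the sketch below walks a directory tree and reports files untouched for a configurable number of days, candidates for archiving or compression. This is a minimal Python example; the function name, the 180-day threshold, and the use of modification time as the "inactivity" signal are all illustrative assumptions, not a prescribed policy.

```python
import os
import time

def find_inactive_files(root, days=180):
    """Return (path, size_in_bytes) pairs for files not modified in `days` days.

    Hypothetical helper for illustration: modification time stands in for
    "last activity," which may not match your archiving policy.
    """
    cutoff = time.time() - days * 86400  # days converted to seconds
    stale = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            if st.st_mtime < cutoff:
                stale.append((path, st.st_size))
    return stale
```

Summing the returned sizes gives a quick estimate of how much space an archiving pass could reclaim before buying more hardware.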
Manufacturers are working hard to optimize the performance of their machines, yet server power consumption remains on the increase. What will be the power requirement of your company’s new storage solution? Luiz André Barroso at Google reports that if performance per watt is to remain constant over the next few years, power costs could easily overtake hardware costs, possibly by a large margin.
Power consumption can be a hidden recurring cost that may not have been expected with the expansion of storage space. Especially when considering the fluctuating costs of energy, unanticipated increases in power usage can be an expensive budget buster affecting the entire enterprise.
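To see how power can become a budget item in its own right, here is a rough back-of-the-envelope calculation, sketched in Python. All of the figures (the $0.12/kWh rate, the 1.8 power usage effectiveness factor standing in for cooling overhead, and 24x7 operation) are hypothetical placeholders; substitute your own utility rates and duty cycle.

```python
def annual_power_cost(watts, rate_per_kwh=0.12, pue=1.8):
    """Estimate the yearly electricity cost of equipment drawing `watts` 24x7.

    `rate_per_kwh` and `pue` (power usage effectiveness, which folds in
    cooling and facility overhead) are illustrative assumptions only.
    """
    kwh_per_year = watts / 1000 * 24 * 365  # watts to kWh over a full year
    return kwh_per_year * pue * rate_per_kwh
```

For example, a storage array drawing a steady 500 W costs roughly $525 per year at $0.12/kWh before cooling overhead, and nearly double that once a facility overhead factor is applied, a figure that is easy to overlook when budgeting only for the hardware purchase.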
Closely related to power consumption is the need to keep the more powerful processors found in the latest machines cool. Both the performance and the life expectancy of the equipment are related to component temperature. Ever since the Pentium II processor in 1997, proper heat dissipation using heat sinks and cooling fans has been a standard for computer equipment. Today’s high-performance processors, main boards, video cards, and hard drives require reliable temperature management in order to work effectively and efficiently, day in and day out.
If your or a client’s storage requirements grow, proper ambient server-room temperature control is going to be required. Adding such a room or creating the necessary environment may add build-out costs, not to mention increase the power consumption and energy costs mentioned earlier.
With proper heat dissipation and cooling comes noise. All those extra fans and cooling compressors can generate a noticeable number of decibels. A large-scale IT environment has the luxury of keeping its noisy machines away from users. In a smaller business or home business, however, some have found the sound levels generated by their storage equipment to be intolerable or, at a minimum, concentration-breaking. Such noise makes surrounding areas non-conducive to work and productivity, hindering employees’ ability to simply think. When increasing your data storage, make sure the resulting noise is tolerable. Be sure, too, that noise suppression efforts don’t interfere with or defeat heat dissipation and cooling solutions.
The equipment investment for the expansion may be significant, but how does this increased storage relate to administrative needs? Should management hire a network consultant to assess user needs, then install, set up, and test the new equipment? Or can the company’s in-house network administrator do the work? A small company faces a risk here: it may not be able to afford a professional assessment and installation, but with an inexpensive solution it may learn the hard way the old adage that “you get what you pay for.”
A non-professional might misdiagnose storage usage needs, set up the equipment incorrectly, or buy equipment that isn’t a good fit for the environment. Such unintentional blunders are why certifications for network professionals exist. Storage management is not as simple as adding more space when needed; it is a complicated, multi-layered endeavor affecting every aspect and employee of a business.
Although using the skills of a professional greatly increases the chance that the storage expansion succeeds, it will raise the final cost. When considering the monetary expense, businesses must also remember to weigh how much of the other “costs” (overall risk, loss of data availability, system downtime if the implemented solution fails) they can afford.
How does your business currently manage backup cycles and the corresponding storage needs? Do you store your backups on-site, or do you have a safe alternate location at which to store this precious data? Natural disasters such as fires and floods, and extreme disasters like Hurricane Katrina, are wake-up calls to many who are resistant to the idea of offsite data storage. Offsite data storage may be as simple as storing backup tapes off-site or archiving data with a data farm for a monthly space rental fee, or as complex as maintaining a mirrored site housing a direct copy of all your data (effective but costly).
Whatever backup management and storage process is utilized, the backups created should be tested, as should the backup system with the expanded storage, to make sure everything is actually being backed up. There is nothing worse than relying on a backup that doesn’t work, was improperly created, or doesn’t contain the vital data your business needs.
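One simple way to test a disk-based backup, sketched below under stated assumptions, is to compare checksums of the source files against the copies in the backup location. The function names and the directory-mirror layout are illustrative; a tape- or snapshot-based system would need a different verification step, and this sketch checks only file contents, not permissions or other metadata.

```python
import hashlib
import os

def _digest(path):
    """SHA-256 of a file, read in chunks to keep memory use flat."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(source_dir, backup_dir):
    """Return relative paths of source files that are missing from the
    backup, or whose contents differ from the backed-up copy.

    Hypothetical helper: assumes the backup mirrors the source tree.
    """
    problems = []
    for dirpath, _, filenames in os.walk(source_dir):
        for name in filenames:
            src = os.path.join(dirpath, name)
            rel = os.path.relpath(src, source_dir)
            dst = os.path.join(backup_dir, rel)
            if not os.path.isfile(dst) or _digest(src) != _digest(dst):
                problems.append(rel)
    return problems
```

An empty result means every source file was found, byte for byte, in the backup; anything returned is a file you would have lost in a restore.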
The databases created as a result of daily business activities can be staggering in size (as referenced in the earlier example of one large retail corporation’s generation of a billion rows of sales data daily). One way to optimize database performance is to separate the database files and store them in three separate locations: data files in one location, transaction files or logs in a second, and backups in a completely different third. This not only makes data processing more efficient but also avoids an “all the eggs in one basket” scenario, which is beneficial when experiencing a process disruption such as equipment failure.
Undertaking this type of database optimization involves the aforementioned planning and equipment costs. But keep in mind how database information has reached into all areas of the business (customer information, billing information, and inventory management information) and how vital it is that this information be protected. Hidden costs associated with protecting database information can escalate quickly.
Installation and cabling
The old trend was a standalone unit in which the processor and storage were one system. Now the trend is to build a separate networked storage system that can be accessed by many users and servers. In general, there are two types of separate storage systems: the storage area network (SAN) and network-attached storage (NAS).
A separate storage system offers a number of advantages, including easier expansion. The consideration, however, is that you will need the network infrastructure to support it. In other words, if your storage system is in a separate building, you will need fast enough network connectivity to avoid a “bottleneck” in communication between the server and the storage device.
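A quick way to check for such a bottleneck is to estimate how long a full backup or migration would take over the available link, sketched in Python below. The 0.7 efficiency factor standing in for protocol overhead is a hypothetical figure; real throughput also depends on latency, contention, and the storage devices themselves, so treat the result as a rough lower bound on planning time.

```python
def transfer_hours(gigabytes, link_mbps, efficiency=0.7):
    """Rough time, in hours, to move `gigabytes` of data over a link rated
    at `link_mbps` megabits per second.

    `efficiency` is an illustrative assumption for protocol overhead.
    """
    bits = gigabytes * 8 * 1e9               # decimal gigabytes to bits
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 3600
```

For instance, moving a terabyte over gigabit Ethernet takes on the order of two to three hours even before overhead, which is why a storage system placed in a separate building may justify a faster interconnect.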
A disaster recovery plan encompasses everything that could happen if there is a system failure due to destruction, natural disaster, fire, theft, or equipment failure. Part of a good disaster recovery plan is a business continuation plan, that is, how to keep the business going and doing business despite the disaster. When planning for a data storage expansion, the disaster recovery plan should be reviewed to make sure the company’s data remains accessible in the event of a contingency, and it should be closely aligned with business continuity planning and efforts.
Data recovery can become a hidden cost if not planned for. Every business continuity plan and disaster plan should include professional data recovery services as part of their overall solution.
Ontrack has successfully recovered data for customers who have lost data due to failures encountered during storage space migration or expansion, mirroring failures, system shutdowns due to environmental abnormalities, natural disaster, backup inconsistencies, and software and database corruption.
As you can see, there is much more to scalable growth than just adding more storage space. Even when prudent planning has been done and every precaution taken in instituting an effective storage management solution, failures and unforeseen circumstances can and do occur. Simply put, despite the best preparation, disasters do happen. Ontrack Data Recovery is your partner for success when you or your users experience data loss and is here to assist with the recovery and restoration of the original data.
Stanford Linear Accelerator Center (SLAC) – BaBar Database Project http://www.slac.stanford.edu/BFROOT/www/Public/Computing/Databases/proceedings/CIDR05.pdf
Google engineer Luiz André Barroso’s article about the operating costs of computer equipment https://queue.acm.org/modules.php?name=Content&pa=showpage&pid=330
SAN Planning Strategies (chapter one of online book) http://www.informit.com/imprint/index.aspx?st=61085