Until a couple of years ago, many IT administrators and managers were concerned about the risk of losing valuable data to a sudden drive failure. That's why it took manufacturers a long time to convince the public that SSDs are safe to use, even when handling sensitive data.
An SSD based on NAND flash chips is a completely different storage medium from the traditional hard disk drive, which saves its data on magnetic platters. It consists of an electronic controller and several storage chips. A hybrid drive – also called an SSHD – combines both technologies: a conventional magnetic hard disk as well as flash storage chips.
The main benefit of electronic storage chips is that they are much faster than an HDD with a spindle inside, because a conventional HDD relies on many mechanical parts and rotating platters. Re-positioning the read/write head also takes far more time than simply pushing data through electronic interfaces. In addition, SSDs have very short access times, which makes them ideal for environments where real-time access and transfer are a necessity.
The downside of NAND-flash-based SSDs is that they have a limited life span by design. While normal HDDs can – in theory – last forever (in reality about 10 years at most), SSDs have a built-in “time of death.” To keep it simple: because of an electrical effect, each storage cell inside the chips can only be written between approximately 3,000 and 100,000 times during its lifetime. After that, the cells “forget” new data. Because of this – and to prevent certain cells from being used all the time while others sit idle – manufacturers have the controller apply wear-leveling algorithms that distribute writes evenly across all cells. As with HDDs, the user can check the current status of an SSD with a S.M.A.R.T. analysis tool, which shows the remaining life span of the drive.
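The idea behind wear-leveling can be illustrated with a toy model. This is a hypothetical sketch, not any manufacturer's actual controller algorithm; the cell count and endurance value are assumptions chosen for illustration:

```python
# Toy model of wear-leveling: purely illustrative, not a real
# controller algorithm. It shows how steering each write to the
# least-worn cell keeps erase cycles balanced across the chip.
NUM_CELLS = 8           # hypothetical number of flash cells
ENDURANCE = 3000        # assumed program/erase cycles per cell

wear = [0] * NUM_CELLS  # erase-cycle counter per cell

def write_block():
    """Direct the next write to the least-worn cell."""
    target = min(range(NUM_CELLS), key=lambda i: wear[i])
    wear[target] += 1
    return target

# Without leveling, hammering one cell would wear it out after
# ENDURANCE writes; with leveling, the drive as a whole can absorb
# roughly NUM_CELLS * ENDURANCE writes before any cell fails.
for _ in range(1000):
    write_block()

print(max(wear) - min(wear))  # wear stays balanced: prints 0
```

Real controllers are far more sophisticated (they also move static data and manage spare blocks), but the principle of evening out cell wear is the same.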
Manufacturers usually state this endurance as terabytes written (TBW) – especially for enterprise SSDs, but also for consumer models. Because wear-leveling distributes data evenly across all cells, this figure indicates how much data can really be written in total to all cells inside the storage chips over the drive's whole life span.
A typical TBW figure for a 250 GB SSD lies between 60 and 150 terabytes written. That means: to exceed a guaranteed TBW of 70, a user would have to write 190 (!) GB daily over a period of one year – in other words, fill two thirds of the SSD with new data every day. In a consumer environment this is highly unlikely.
Samsung states that its SSD 850 PRO SATA, available with a capacity of 128 GB, 256 GB, 512 GB or 1 TB, is “built to handle 150 terabytes written (TBW), which equates to a 40 GB daily read/write workload over a ten-year period.” Samsung even promises that the product is capable of “withstanding up to 600 terabytes written (TBW).”
A normal office user writes approximately 10 to 35 GB on a typical day. Even if one raises this amount to 40 GB, it means they could write (and only write) for almost five years before reaching the 70 TBW limit.
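The arithmetic behind these figures is simple enough to sketch; the following uses the TBW and daily-write values quoted above (and the common 1 TB = 1,000 GB convention):

```python
# How long does a TBW endurance budget last at a given daily write volume?
def days_until_tbw(tbw_terabytes: float, daily_gb: float) -> float:
    """Days of writing until the endurance budget is exhausted."""
    return tbw_terabytes * 1000 / daily_gb  # 1 TB = 1000 GB

# 70 TBW at an office-like 40 GB/day lasts almost five years:
days = days_until_tbw(70, 40)    # 1750 days
print(round(days / 365, 1))      # prints 4.8 (years)

# Conversely, burning through 70 TBW within a single year would
# require writing about 190 GB every single day:
print(round(70 * 1000 / 365))    # prints 192 (GB per day)
```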
SSDs last even longer than promised
The good news: these manufacturer figures are even lower than the real TBW values measured in a long-term test conducted by Germany's most respected IT and computer magazine, c't, and the Heise publishing house. For the test, the magazine bought two examples each of six popular SSDs available in 2016 – twelve drives in total – and tested them for one year until the end of June 2017. The models tested were the OCZ TR150, Crucial BX200, Samsung 750 Evo, Samsung 850 Pro, SanDisk Extreme Pro, and SanDisk Ultra II.
The magazine's experts continuously wrote data to the SSDs using a special tool programmed by one of their staff, which both measured performance and constantly filled the drives with data.
The outcome of the tests was astonishing: all of the drives were able to write far more data than the manufacturer promised. Even the cheaper drives exceeded their ratings: the two Crucial BX200 drives wrote 187 TB and 280 TB respectively – more than 2.5 times the promised figure.
One of the Samsung SSD 850 PRO drives achieved a figure of 9.1 petabytes written – about 60 times the TBW figure Samsung promises on its data sheet. The other Samsung product, the SSD 750 Evo, was able to write 1.2 petabytes of data, which in theory equates to more than 80 years of constant writing at 40 GB per day. The pro models also showed why their price is higher: none of them wrote less than 2.2 petabytes of data.
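These headline numbers are easy to verify with back-of-the-envelope arithmetic, using the figures quoted from the c't test and Samsung's data sheet (1 PB = 1,000,000 GB, 1 TB = 1,000 GB):

```python
# Sanity-check the endurance figures quoted above.
ratio = 9100 / 150            # 9.1 PB actually written vs. 150 TBW promised
print(round(ratio))           # roughly 60 times the promised endurance

years = 1_200_000 / 40 / 365  # 1.2 PB at a 40 GB/day workload
print(round(years, 1))        # just over 82 years of constant writing
```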
The test clearly shows that the fear of a limited life span is, in most respects, highly exaggerated. But there are other threats.
If they last that long, where are the dangers?
Even though these tests clearly show that SSDs last longer than expected, using this storage medium still poses a serious risk: recovering data from failed SSDs remains more challenging for data recovery service providers than recovering from HDDs, because getting access to the device is often difficult. When the SSD controller chip is broken, access to the device and its storage chips is impossible. The solution is to find a functioning controller chip identical to the defective one and swap it in to regain access. What sounds quite simple is a difficult task in reality, and the same applies to accessing data from faulty storage chips. In many cases, data recovery experts like those at Ontrack are nevertheless able to recover the data: over the last few years, Ontrack has developed many special tools and processes to master these challenges and has successfully recovered lost data.
Remember: in case of data loss from an SSD, the best course is to contact a professional data recovery service provider. With a physical fault, there is no way for users to recover or rescue the data themselves. And when the controller or a storage chip is malfunctioning, attempting recovery with a data recovery software tool is even more dangerous: it can lead to permanent data loss with no chance of ever recovering the data.
Picture copyright: Kroll Ontrack GmbH, Böblingen, Germany
Michael Nuncic has been Marketing Communications Manager at the German Ontrack Data Recovery office in Böblingen for more than five years. Highly experienced in computer, networking and software topics, he has been a professional editor of blog and technical articles for almost 20 years.