Cloud computing has become more than just a technology trend. According to Forrester Research, the global public cloud market will rise from $146 billion this year to $236 billion in 2020. Cloud computing offers companies many benefits: the need for hardware and software drops drastically, and costs for energy, staff and software licenses can be scaled down almost immediately after data or applications have been moved into the cloud.
Types of cloud
Most cloud providers offer all three kinds of service – Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS). In addition, there are three cloud variants to choose from: public, private and hybrid. While the public cloud means moving data outside the company onto a cloud service provider's platform, a private cloud is managed entirely by the company using cloud technology. To get the best of both strategies, many companies take the hybrid approach: some files are stored (or some services used) over the internet with a public cloud service, while more business-critical data, applications and services run inside the company on the private cloud. The challenge when running a hybrid cloud is cleanly separating business-critical from non-critical workflows. That is only possible when every file that is stored and processed is classified consistently as business-critical or not.
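As a minimal sketch of what "classified consistently" can mean in practice, a single shared rule set could drive the placement decision for every file. The path patterns below are hypothetical examples; a real policy would come from the company's own data-classification guidelines:

```python
# Sketch of a rule-based classifier deciding whether a file is
# business-critical (stays in the private cloud) or not (may go to
# the public cloud). The patterns are illustrative assumptions.
import fnmatch

BUSINESS_CRITICAL_PATTERNS = [
    "finance/*",       # accounting data stays on-premise
    "hr/*",            # personnel records stay on-premise
    "*.contract.pdf",  # signed contracts stay on-premise
]

def is_business_critical(path: str) -> bool:
    """Return True if the file must stay in the private cloud."""
    return any(fnmatch.fnmatch(path, p) for p in BUSINESS_CRITICAL_PATTERNS)

def placement(path: str) -> str:
    """Map a file to its target cloud: 'private' or 'public'."""
    return "private" if is_business_critical(path) else "public"

print(placement("finance/q3-report.xlsx"))  # private
print(placement("marketing/banner.png"))    # public
```

The point of keeping the rules in one place is consistency: every tool that moves files – backup, sync, archiving – consults the same policy, so no business-critical file silently ends up in the public cloud.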
Another aspect that private users as well as companies should be aware of is that cloud service providers mostly use different technologies. Even today, almost every cloud service provider uses its own technology to offer clients data storage and access over the internet. Amazon Web Services, for example, combines several technologies into its cloud offering. EC2 (Amazon's Elastic Compute Cloud) is a service for running applications in the cloud on virtual servers based on either a Linux or a Windows Server distribution, while S3 is Amazon's own file hosting service. Amazon does not publish details of its design or structure, but it is clear that data is managed in an object storage architecture. For its storage services Amazon offers several APIs – Application Programming Interfaces – to link, for example, company backup products such as Commvault or Veritas NetBackup to S3. Another big player on the scene is Microsoft Azure. It uses several Windows Server and .NET technologies that can make it difficult to run non-Windows applications. Which brings us to the next challenge…
There is one factor that should be seriously considered before choosing a cloud service provider and before moving data, services or applications into the cloud: the problem of the so-called 'vendor lock-in'.
Vendor lock-in means that a customer of a cloud service often has to stick with that vendor because of the massive challenges that arise if a migration of data, services or applications to a new cloud provider is attempted. This lock-in exists because, as described above, cloud service providers use so many different cloud platforms.
If you only use the cloud as a 'live' secondary backup storage, you just need to point your backup software at the new cloud provider and store new backups on its cloud space. In addition, the older on-premise backups should be copied to the new provider. If you only keep the backups or data in the cloud (with the old cloud service provider – which is not a good idea anyway), there are several tools on the market to migrate and transfer the data. But beware: most of these tools are only practical when the amount of data to be migrated is not very large; otherwise you end up transferring data for days, months or years!
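To get a feel for the "days, months or years" warning, a back-of-the-envelope estimate of the transfer time is easy to sketch. The data size and line speed below are illustrative assumptions, and the estimate is optimistic because it ignores API request overhead and retries:

```python
# Rough estimate of how long a bulk cloud-to-cloud transfer takes,
# assuming the link is fully saturated the whole time.

def transfer_days(data_tb: float, bandwidth_mbit_s: float) -> float:
    """Days needed to move `data_tb` terabytes over a link of
    `bandwidth_mbit_s` megabits per second."""
    bits = data_tb * 1e12 * 8                  # terabytes -> bits
    seconds = bits / (bandwidth_mbit_s * 1e6)  # bits / (bits per second)
    return seconds / 86_400                    # seconds -> days

# 100 TB over a dedicated 1 Gbit/s line: roughly nine days, best case.
print(round(transfer_days(100, 1000), 1))  # 9.3
```

Scale the same 100 TB down to a shared 100 Mbit/s office line and the estimate grows to about three months, which is why large migrations often rely on provider-side copy features or physical shipment instead of the network.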
Wait before deleting
When the migration is finally completed and everything works fine, the old backup files on the cloud space of the former service provider can be deleted. But again, beware: deletion should only be started when there are enough on-site copies of your backups to meet your retention policy. Beyond that, you are all set and ready to go.
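A minimal sketch of such a safety check could look like the following. The retention count and the inventory structure are assumptions for illustration, not a real backup product's API:

```python
# Safety check before purging backups at the old cloud provider:
# only allow deletion once every backup set has enough on-site
# copies to satisfy the retention policy. The inventory dict is a
# hypothetical stand-in for whatever your backup software reports.

RETENTION_COPIES = 3  # assumed policy: keep at least 3 on-site copies

def safe_to_delete_cloud_copies(onsite_copies: dict) -> bool:
    """True only if every backup set meets the retention policy."""
    return all(n >= RETENTION_COPIES for n in onsite_copies.values())

inventory = {"fileserver-full": 4, "db-full": 3, "mail-full": 2}
print(safe_to_delete_cloud_copies(inventory))  # False: mail-full has only 2
```

Running the check per backup set rather than in aggregate matters: one under-protected set should block the purge even if every other set is well covered.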
While migrating data from one cloud service provider to another might not be a technical problem, it will most likely (with a large amount of data) be a time-consuming and costly effort. The same can be said about moving network services from one provider to another. The real challenge still remains: the migration of applications from one cloud service to another.
In many cases the technologies cloud providers use differ so much that it is almost impossible to seamlessly migrate an application customised for one cloud provider's platform to another cloud service. In most cases the application has to be reprogrammed and customised again to fit the technology used by the new provider. Since open source cloud platform technologies like OpenStack are not yet widely used among cloud service providers, companies are forced to invest a lot of money in application development when they want to run their custom-built, business-critical applications on a new cloud platform.
With most cloud providers, importing data and applications is easy, but moving applications to a new cloud platform is far more complicated and costly. Enterprises evaluating cloud solutions should give more thought to a back-out strategy when they evaluate cloud vendors. Companies should be aware of how complicated it could be to get out of the deal for any reason – dissatisfaction, high costs, the cloud provider going out of business or changing strategy, poor performance and more. All of these factors are best considered as early as the beginning of a company's cloud endeavor.
Does your organisation use cloud storage? Have you ever had difficulties when migrating data? Let us know by commenting below, or tweet @DrDataRecovery