Effective Data Management #5: Accelerating the route to the Cloud

ecs-admin 16th October 2019

Modern businesses demand agile IT solutions. For many IT departments, this translates into pressure to migrate workloads to the cloud to realise considerable productivity gains.

But embarking on a move to the cloud throws up many potential pitfalls – for example:

  • Inadequate testing, including cutover procedures, which can cause the cloud migration to fail;
  • Over-long migration windows, which can result in lost revenues due to critical service outages;
  • Poor access to current data for testing, which can lead to unexpected application failures;
  • Weak control of access to sensitive data, which could lead to a serious data breach;
  • Business-critical legacy applications that are too costly and complex to migrate;
  • Higher-than-planned cloud storage charges incurred by creating and maintaining multiple copies of large files and databases.

You might be wondering how using Delphix is going to help here.

In earlier blogs I described how Delphix lets you provision thin but full-sized virtual copies of heterogeneous data sources, and how easy it is to mask sensitive data consistently across the data estate, effectively eliminating the risk of a data breach. Together, these features enable the rapid provisioning of any number of thin, virtual copies of masked data, which improves the quality of testing and ensures that no sensitive data escapes from the live data source, even in the cloud.

Delphix has the out-of-the-box capability to replicate data objects between different Delphix Engines running on the same site or at different locations. Replication can be run on an ad hoc basis, or according to a predefined schedule.
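Conceptually, a replication job boils down to a specification naming the objects to copy, the target engine, and an optional schedule. The sketch below is purely illustrative: the function and field names are my assumptions for this blog, not the actual Delphix API.

```python
# Illustrative only: the field names here are hypothetical, not the real
# Delphix API. They model the concepts described above.
def build_replication_spec(source_objects, target_host, cron_schedule=None):
    """Assemble a replication specification for a target Delphix Engine.

    cron_schedule=None models an ad hoc (run-now) replication; a cron
    expression models the predefined schedule mentioned above.
    """
    spec = {
        "objects": list(source_objects),  # dSources / VDBs to replicate
        "targetHost": target_host,        # engine on-site or in the cloud
        "encrypted": True,                # data is encrypted in transit
    }
    if cron_schedule is not None:
        spec["schedule"] = cron_schedule  # e.g. a nightly off-peak run
    return spec

# Ad hoc run to an engine hosted in AWS:
adhoc = build_replication_spec(["masked-sales-db"], "delphix-aws.example.com")

# Nightly scheduled run at 01:30:
nightly = build_replication_spec(
    ["masked-sales-db"], "delphix-aws.example.com", cron_schedule="30 1 * * *"
)
```

The same specification shape covers both modes: omitting the schedule gives the ad hoc case, supplying one gives the recurring case.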

The Delphix Engine can currently be hosted on the AWS or Azure cloud platforms, and will also be supported on GCP in the next major release, expected later this year. It is very easy to configure replication of any data source from a Delphix Engine running on premises to one running in the cloud.

Replication is useful when you want to fully segregate your non-production and production data environments. A replicated copy can be used to provision any number of virtual copies for development and testing, as well as a full-sized physical copy using the Delphix Virtual to Physical (V2P) option. In this scenario, because only masked data exists at the replica site, there is zero possibility of a sensitive data breach from it.
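A back-of-the-envelope calculation shows why "any number" of virtual copies does not multiply your storage bill, which is the concern raised in the last bullet above. The change rate and sizes below are assumed figures for illustration only.

```python
def physical_storage_gb(source_gb, n_copies):
    """Storage for n full physical copies: every copy duplicates the source."""
    return source_gb * n_copies

def virtual_storage_gb(source_gb, n_copies, changed_fraction=0.05):
    """Thin virtual copies share the replicated source blocks; each copy
    stores only its own changed blocks (the 5% change rate is an assumption,
    not a Delphix figure)."""
    return source_gb + source_gb * changed_fraction * n_copies

# Ten test copies of a 2 TB (2000 GB) database:
full = physical_storage_gb(2000, 10)   # ten full duplicates
thin = virtual_storage_gb(2000, 10)    # one shared base plus per-copy deltas
```

Under these assumptions, ten physical copies would consume 20 TB of cloud storage, while ten thin virtual copies consume about 3 TB in total.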

For cloud migrations, this allows you to perform comprehensive testing of the application, and of the migration and cutover procedures, in the actual target environment. With the ability to quickly stand up virtual databases, teams of developers and testers can work in parallel and can quickly rewind data to re-run tests, all of which accelerates delivery of the migration project while avoiding excessive storage charges and posing no unnecessary risk to your sensitive data. It also makes refactoring and testing of complex legacy applications much more feasible.
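The snapshot-and-rewind workflow can be pictured with a minimal model. The class and method names below are hypothetical stand-ins for the engine's own snapshot and rewind operations, chosen just to show the test loop.

```python
# Hypothetical helper names; the real engine exposes snapshot and rewind
# through its own interface. This only models the workflow.
class VirtualDatabase:
    """Minimal model of a VDB that can bookmark its state and rewind to it."""

    def __init__(self, rows):
        self.rows = list(rows)
        self._snapshot = None

    def snapshot(self):
        self._snapshot = list(self.rows)  # bookmark the known-good state

    def rewind(self):
        self.rows = list(self._snapshot)  # discard test changes in place

vdb = VirtualDatabase(["alice", "bob"])
vdb.snapshot()
vdb.rows.append("test-artifact")  # a destructive test mutates the data
vdb.rewind()                      # back to the bookmark, ready for a re-run
```

Because the rewind is near-instant on a thin copy, each tester can repeat this loop as often as needed without waiting for a fresh restore.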

But it is not only masked data that you can replicate. Data is encrypted while in transit, which provides a secure mechanism for transferring the source data into the cloud as a dSource, a one-off process. Going forward, only the change logs are ingested to maintain data synchronisation, and replication follows the same incremental principle. This means that you can schedule the transfer of data to the cloud to best suit operational and project needs, and that you can pre-load the cloud with the bulk of your data ahead of the actual cutover date.

It is unlikely that you will want to run your production workload in the cloud from a virtual database, so you will still need to copy the virtual database to a physical database in the cloud. Because the bulk of the data has already been transferred in advance, completing the final data transfer to the cloud's physical storage requires a much shorter outage.
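To see why pre-loading shrinks the outage window, consider a rough calculation. All figures below are illustrative assumptions, not customer data.

```python
def transfer_hours(gb_to_move, throughput_gb_per_hour):
    """Hours of outage needed to move the given data at the given rate."""
    return gb_to_move / throughput_gb_per_hour

SOURCE_GB = 5000        # 5 TB production database (assumed)
DAILY_CHANGE_GB = 50    # ~1% daily change rate (assumed)
RATE_GB_PER_HOUR = 100  # sustained transfer rate to the cloud (assumed)

# Without pre-loading: the whole database moves inside the outage window.
cold_cutover = transfer_hours(SOURCE_GB, RATE_GB_PER_HOUR)

# With replication pre-loading: only the last day's changes remain to move.
warm_cutover = transfer_hours(DAILY_CHANGE_GB, RATE_GB_PER_HOUR)
```

Under these assumptions, a cold cutover needs a 50-hour outage, while the pre-loaded cutover needs only half an hour to catch up the final changes before switching over.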

And the proof is in the pudding. Delphix customers have typically shrunk their migration timetables by more than 50%, and eliminated cutover downtime and huge amounts of data risk. Virtual data technologies are incredibly powerful and are increasingly important for any company looking to increase its business agility by embracing the cloud.

By Chris Glynn, Principal Consultant 
