As I start to write my first blog piece about the capabilities of some rather smart data virtualisation technology brought to us by Delphix, it strikes me that it is probably unwise to assume all readers will know exactly what Delphix is or what it does. So perhaps a quick overview of Delphix will help to get everyone on the same page.
I’d like for readers to understand the concepts behind Delphix without having to be too technically minded. I am very excited about the potential Delphix has to transform the way we have traditionally managed access to copies of large volumes of data, including personal and other sensitive data. And I hope that through this series of blog pieces you might also begin to share my enthusiasm!
Delphix is a clever product that allows you to:
- provision lightweight (virtual) copies of production data in minutes, while keeping them in sync;
- secure sensitive data in line with security policies and ensure regulatory compliance; and
- move and manage data across any environment – on premises, cloud or hybrid.
Provision lightweight (virtual) copies
Data from multiple different data sources (including many relational database systems as well as individual files) can be pulled into and stored in the Delphix Virtualisation Engine. The engine applies intelligent proprietary compression, so the data it holds typically occupies only about a third of the space of the original source data; better still, it can be configured to keep itself in sync with the original data source.
Now for the clever bit: a virtual mapping of the data in the Delphix Engine can be presented to a target server where it looks and operates exactly like a copy of the full-sized original source data. You can provision as many of these virtual copies onto as many target servers as you need to and all of them map back to that single copy of the data in the Delphix engine.
What is really clever is that Delphix ensures any activity or changes to the data in one of the virtual copies are only ever visible in that copy. In other words, all the copies can run simultaneously, each with a totally different workload, and none of them will interfere with the others.
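Conceptually this works like copy-on-write: each virtual copy records only its own changed blocks and reads everything else from the shared source. The following is a toy sketch of that idea in Python (illustrative only – it is not how Delphix is actually implemented, and the class and names are invented for this example):

```python
# Toy copy-on-write sketch: each "virtual copy" stores only the blocks
# it has changed and falls back to the shared source for everything else.
class VirtualCopy:
    def __init__(self, source: dict):
        self._source = source   # shared, read-only base data
        self._changes = {}      # blocks changed by this copy only

    def read(self, key):
        # Prefer this copy's own change; otherwise read the shared source.
        return self._changes.get(key, self._source.get(key))

    def write(self, key, value):
        # Writes never touch the shared source, so other copies are unaffected.
        self._changes[key] = value


base = {"row1": "original"}
a, b = VirtualCopy(base), VirtualCopy(base)
a.write("row1", "changed-in-a")

assert a.read("row1") == "changed-in-a"
assert b.read("row1") == "original"   # b never sees a's change
```

Because each copy only ever stores its deltas, dozens of "full-sized" copies can share one physical dataset without interfering with one another.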
You can create a full-sized copy of multi-terabyte databases quite literally in minutes. You also get the ability to refresh, restore and rewind changes to your individual virtual copy in minutes.
It is both a great time saver and a great space saver, both of which make it an enormous cost saver.
Secure sensitive data
So far so good, but it gets even better. There’s also an optional data masking component that is fully integrated with Delphix. Delphix Masking allows you to mask data in multiple different ways to suit most requirements. There are over 20 default masking algorithms out of the box, as well as the ability to create as many case-specific masking algorithms and rules as you need.
Delphix is sensible enough to ensure that when you mask a value in one place and then apply the same rule to that value stored somewhere else, it is always masked in exactly the same way. So even if you store, say, a customer name in several different databases, the name will always be masked identically, keeping the referential integrity of all your data intact.
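One common way to achieve that consistency is deterministic masking: derive the masked value from a secret key plus the input, so the same input always produces the same output while the mapping stays one-way. Here is a minimal sketch of the idea (purely illustrative – this is not Delphix's actual algorithm, and the key and name list are made up):

```python
import hashlib

# Assumed per-deployment secret; without it, the mapping cannot be guessed.
SECRET_KEY = b"example-masking-key"

# A small substitution pool for the sketch; real tools use far larger ones.
FIRST_NAMES = ["Alex", "Sam", "Jordan", "Casey", "Morgan", "Riley"]

def mask_name(name: str) -> str:
    # Hash key + value, then use the digest to pick a replacement name.
    digest = hashlib.sha256(SECRET_KEY + name.encode("utf-8")).digest()
    index = int.from_bytes(digest[:4], "big") % len(FIRST_NAMES)
    return FIRST_NAMES[index]

# The same source value masks identically wherever it appears,
# so joins across databases still line up after masking.
assert mask_name("Charlotte") == mask_name("Charlotte")
```

Because the output depends only on the key and the input, two databases masked with the same rule stay joinable on the masked column.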
The great thing is that whatever the masked value, Delphix ensures the masking is one-way, so no one is able to trace the masked value back to the original value. (Actually there is one specific type of masking algorithm that will allow you to tokenise values and to reverse that masking if you have the token value, but that is to cater for very specific use cases.)
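Tokenisation differs from one-way masking in that the original value is recoverable, but only by whoever holds the token. A toy sketch of the concept (again illustrative, not Delphix's implementation – the vault and names here are invented):

```python
import secrets

# Toy reversible tokenisation: replace a value with a random token and
# keep the token-to-value mapping in a secure store, so only a holder
# of the token (and access to the store) can reverse the substitution.
class Tokeniser:
    def __init__(self):
        self._vault = {}  # token -> original value; must be kept secure

    def tokenise(self, value: str) -> str:
        token = secrets.token_hex(8)  # random, reveals nothing about value
        self._vault[token] = value
        return token

    def detokenise(self, token: str) -> str:
        return self._vault[token]


vault = Tokeniser()
token = vault.tokenise("Charlotte")
assert vault.detokenise(token) == "Charlotte"
```

Unlike the hash-based masking above, the substitution here carries no information about the original value at all; reversal depends entirely on access to the stored mapping.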
If you combine the rapid creation of virtual copies of data with masking, just think how powerful a tool that is. You can now de-personalise/de-sensitise copies of your live data and provision any number of safe copies for non-production use in a matter of minutes.
Move and manage data
Delphix can be deployed on any x86 platform, including in the cloud. Using Delphix you can replicate either masked or unmasked data between different Delphix Engines. Data is encrypted during replication, so if you want to migrate data to the cloud, for example, Delphix can provide the conduit. It is even possible to provision a new physical copy of the data from any virtual data instance you have.
As for managing data, there is a user-friendly GUI that allows anyone using a virtual copy – testers, developers, analysts etc. – to perform a whole host of data management tasks for themselves. This includes: refreshing the virtual copy from the master source copy; rewinding the virtual copy to any point in time; setting bookmarks; and creating branched timelines for the data.
I’m sure this will have given you food for thought on the many different possible use cases for this powerful tool. In my next post I’ll be discussing how adopting a DataOps approach using Delphix can unblock the route to live wherever DevOps processes have been adopted.