Effective Data Management #2: Delphix, the DevOps enabler

I talked a little last time about what Delphix does. As with any tool, it’s one thing to be able to do something; it’s quite another to have a good reason to do it. With Delphix there are several reasons you might want to make use of its capability. One of them is to clear the logjam of data in your otherwise beautifully agile, DevOps-inspired development and testing process.

I’m sure we all know what we mean when we talk about DevOps, but just to keep us all on the same page, here is AWS’s description:

“DevOps is the combination of cultural philosophies, practices, and tools that increases an organisation’s ability to deliver applications and services at high velocity: evolving and improving products at a faster pace than organisations using traditional software development and infrastructure management processes. This speed enables organisations to better serve their customers and compete more effectively in the market.”

Sounds good to me. So good, in fact, that there is a wealth of tooling out there to support organisations adopting this approach: Nagios, Monit, the ELK stack, Consul.io, Jenkins, Docker, Ansible, collectd/collectl, GitHub and many more.

Unfortunately, even with the best will in the world and the best possible implementation of DevOps tooling, there is always one major hurdle seriously compromising that rapid route to live: data. No matter how quickly you write code, you need data to test it at the unit, system, integration, regression and pre-production levels, and that data needs to be truly representative of the real-world environment your newly developed solution will operate in.

And there’s the rub. Managing your code through the route to live can be highly streamlined with the right DevOps toolset. Provisioning and managing the right data along that route, however, is often anything but streamlined, wasting valuable developer and tester time.

No doubt you will recognise this scenario: multiple teams needing access to the same test database to run their different tests. Each test might only last a short time, but rewinding or refreshing the test data takes a long time and often means waiting for the right technical SME to be available to do it for you. So it doesn’t matter how agile your development and testing are; you still have to wait for the data. This is where DataOps comes in.

DataOps is often interpreted narrowly, in an analytical sense. Gartner, for example, describes it as:

“… the hub for collecting and distributing data, with a mandate to provide controlled access to systems of record for customer and marketing performance data, while protecting privacy, usage restrictions and data integrity.”

We can distil the essence of that to define DataOps as:

…the alignment of people, process, and technology to enable the rapid, automated, and secure management of data.

And this is where Delphix comes in.

If you recall from my last blog, one of the things Delphix enables you to do is to provision virtual copies of production data very quickly. If needed, Delphix will mask sensitive data in the virtual copy. More than that, every developer and tester can have access to a GUI to refresh or rewind their own virtual copy of the data in line with their own testing activity – and completely independent of everyone else’s. This means they can run and rerun tests without having to wait for data to be reset centrally. In fact, specific sets of data can be aligned with specific code drops, so the right data can even accompany a code release all along its route to live.
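
To make that concrete, here is a minimal sketch of what such a self-service refresh could look like when driven from a CI pipeline rather than the GUI. Delphix engines can be driven programmatically as well as interactively, but the engine URL, endpoint paths, payloads and names below (delphix-engine.example.com, vdb-dev-alice, ci-bot) are hypothetical stand-ins for illustration, not the documented Delphix interface:

```python
# Illustrative sketch only: refreshing one tester's virtual copy from a
# CI pipeline instead of the GUI. The engine URL, endpoint paths and
# payloads are hypothetical stand-ins, not the documented Delphix API.
import os

import requests

ENGINE = "https://delphix-engine.example.com"  # hypothetical engine address
VDB = "vdb-dev-alice"                          # hypothetical virtual copy name


def refresh_virtual_copy(session: requests.Session, vdb: str) -> None:
    """Refresh a single virtual copy from the latest production snapshot.

    Only this copy is touched; every other developer's and tester's
    copy carries on unaffected.
    """
    resp = session.post(f"{ENGINE}/api/databases/{vdb}/refresh")
    resp.raise_for_status()


def main() -> None:
    with requests.Session() as session:
        # Authenticate once per pipeline run; credentials come from CI secrets.
        session.post(
            f"{ENGINE}/api/login",
            json={"user": "ci-bot", "password": os.environ["DELPHIX_PASSWORD"]},
        ).raise_for_status()
        refresh_virtual_copy(session, VDB)
        print(f"{VDB} refreshed: tests can now run against known-good data.")


if __name__ == "__main__":
    main()
```

Because each virtual copy is independent, a step like this can run at the start of every test job without coordinating with anyone else, which is exactly how a specific data set ends up travelling alongside a specific code drop on its route to live.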

All of a sudden, using Delphix, you can begin to deliver on the otherwise shaky promise of DevOps to accelerate the route to live. We have worked with clients to realise this time saving, in one case doubling the productivity of their developers and testers.

Next time

Next time I’ll go into a little more detail about how Delphix can reduce the overall amount of testing you undertake by enhancing the quality of the testing you do.

By Chris Glynn, Principal Consultant

 
