
Data Security (Scale and Bees)

25 AUGUST, 2019

by Jane Temov

Data security: the problem is scale & a lack of bees

One of the biggest challenges in securing enterprise data is its sheer volume.

Think about it. Hundreds (perhaps thousands) of applications, thousands (perhaps tens of thousands) of instances across Development and Test, and within each, millions of data points, many of which contain PII (Personally Identifiable Information).

Sounds scary, huh?

And then, even if you know what to secure (a rather big "if"), and independent of which expensive masking tools you have (IBM Optim, Informatica, Compuware, CA, etc.), there is still the task of building the remediation scripts, which typically takes months (8-12 weeks) per platform and is often prone to errors and omissions, and finally the task of executing them.

A set of tasks usually done by a centralized team of data "experts", with a single TDM tool, and delivered sequentially.

Do the Maths!

The "Small Bank of Narnia", with 100 key platforms, would take roughly 16* years to become compliant.

*100 Platforms x 2 months / 12 (months in a year)
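The back-of-the-envelope calculation above can be checked in a few lines (assuming, as above, 100 platforms and roughly 2 months of scripting effort per platform, done one platform at a time):

```python
# Serial remediation estimate: one centralized team, one platform at a time.
platforms = 100          # key platforms at the "Small Bank of Narnia"
months_per_platform = 2  # ~8-12 weeks of scripting effort each

total_months = platforms * months_per_platform
total_years = total_months / 12

print(f"{total_years:.1f} years")  # roughly 16.7 years
```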

Or, more likely, simply due to "do-ability" (or the lack thereof), the organization will just secure half a dozen important platforms and hope audit, compliance, and/or the regulators don't notice.

Centralization is Bad

However, the problem here is not just scale.

The biggest issue is the inability to parallelize (federate) the effort.

Imagine each of the 100 platform teams/tribes could do the masking themselves, each having:

  • The skills & method to Understand Data
  • The skills & method to accurately remediate the Data
  • The technology to execute these exercises in Parallel

Well then, one might optimistically say that the task would shrink from 16 years to, say, 6 months.
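That optimism is easy to sanity-check: once the work is federated, wall-clock time is governed by the busiest team, not the sum of all teams. A hypothetical sketch (the 3x overhead factor for ramp-up and coordination is an assumption, not a measurement):

```python
import math

platforms = 100
teams = 100              # one tribe per platform, working in parallel
months_per_platform = 2

# Wall-clock time is driven by rounds of concurrent work, not total effort.
rounds = math.ceil(platforms / teams)          # 1 round when teams == platforms
ideal_months = rounds * months_per_platform    # 2 months of elapsed time
overhead_factor = 3                            # assumed ramp-up/coordination cost
realistic_months = ideal_months * overhead_factor

print(realistic_months)  # 6 months, versus ~200 months done serially
```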

Our Eureka Moment

These somewhat "obvious" observations led to our Eureka moment and the design of our Data Compliance Suite (DCS). DCS was designed & built to go "against the grain" of traditional TDM tools and methods and deliver four key things:

  1. Simplicity of Use

No need for experts. Promoting the opportunity for tribes/teams to "do it themselves".

  2. Hands-Off

Encouraging automation of historically manual (or semi-manual) data security tasks.

  3. Parallel Data Ops

Promoting the ability to do Profiling, Masking, and Validation in parallel.

  4. Enterprise Visibility

Providing an enterprise view of coverage & compliance (as opposed to traditional blind spots).

Our Architecture

Enov8 DCS is a new-generation Test Data Management / Data Compliance solution, built from the ground up to address the needs of technical (engineering) and non-technical (audit & compliance) staff alike.

Designed with a pleasant front-end and "guard-rail-like" navigation, DCS takes users through a best-practice data securitization journey.

This journey includes:

  • Use of "automated intelligence" to understand your data & identify risks.
  • Automatic (on-the-fly) building of masking or encryption scripts.

Yes: no more centralized team taking eight weeks to engineer often error-prone solutions.

  • Ease of execution, both just-in-time & scheduled.
  • Automatic validation (testing) that data is compliant and void of PII.
  • Delivery of compliance dashboards & reporting showing coverage and status.

Giving Compliance, Security & Audit comfort that IT is moving in the right direction.
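The profiling and on-the-fly script-building steps above can be illustrated in miniature. This is a hypothetical sketch using simple regular-expression detectors and an 80% match threshold; Enov8's actual classification engine is not shown here, and all names below are illustrative:

```python
import re

# Hypothetical PII detectors: pattern-based profiling of sampled column values.
PII_PATTERNS = {
    "email": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
    "ssn":   re.compile(r"^\d{3}-\d{2}-\d{4}$"),
}

def profile_column(samples):
    """Return the PII type a column's sampled values most resemble, or None."""
    for pii_type, pattern in PII_PATTERNS.items():
        hits = sum(1 for v in samples if pattern.match(str(v)))
        if samples and hits / len(samples) >= 0.8:   # 80% match threshold
            return pii_type
    return None

def build_masking_sql(table, samples_by_col):
    """Generate an UPDATE statement masking every column flagged as PII."""
    masks = {"email": "CONCAT('user', id, '@example.com')",
             "ssn":   "'XXX-XX-XXXX'"}
    flagged = {col: profile_column(vals) for col, vals in samples_by_col.items()}
    sets = [f"{col} = {masks[t]}" for col, t in flagged.items() if t]
    return f"UPDATE {table} SET {', '.join(sets)};" if sets else None

sql = build_masking_sql(
    "customers",
    {"email": ["a@b.com", "c@d.org"], "notes": ["hello", "world"]},
)
print(sql)  # UPDATE customers SET email = CONCAT('user', id, '@example.com');
```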

And for the more technically minded:

  • Use of “Worker Bees” to spread DataOps load across the network

No need to wait for one application to finish masking before you start the next. The Enov8 Worker Bees (Battle Bees) can execute hundreds of Data Operations in parallel. Worker Bees can be placed anywhere on your Network (e.g. across platforms, subnets and clouds) to leverage parallel processing and reduce latency and storage transfer costs.
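The pattern the Worker Bees exploit is plain fan-out: independent masking jobs have no ordering dependency, so they can run concurrently. A minimal sketch of that idea, using Python threads as stand-in workers (the actual Bee agents and their scheduling are Enov8 internals not shown here):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def mask_platform(platform):
    """Stand-in for one data-masking job on one platform."""
    time.sleep(0.1)            # pretend this is hours of masking work
    return f"{platform}: masked"

platforms = [f"app-{i}" for i in range(8)]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:   # 8 "worker bees"
    results = list(pool.map(mask_platform, platforms))
elapsed = time.perf_counter() - start

# 8 jobs of 0.1s each finish in roughly 0.1s of wall-clock time, not 0.8s.
print(results[0], f"({elapsed:.2f}s)")
```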

  • Provision of a REST API & webhooks so compliance can be added to your delivery toolchain.
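For example, a delivery pipeline could call such an API after each environment refresh, to confirm no PII remains before testing starts. The endpoint path, payload fields, and token below are purely hypothetical placeholders, not Enov8's documented API:

```python
import json
import urllib.request

def build_compliance_request(base_url, platform, token):
    """Build a (hypothetical) 'validate this platform is PII-free' API call."""
    payload = {"platform": platform, "checks": ["pii_scan", "mask_verified"]}
    return urllib.request.Request(
        url=f"{base_url}/api/compliance/validate",   # hypothetical endpoint
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# In a CI stage you would then send it, e.g.:
#   with urllib.request.urlopen(build_compliance_request(...)) as resp:
#       assert json.load(resp)["status"] == "compliant"
req = build_compliance_request("https://dcs.example.com", "app-42", "s3cret")
print(req.full_url)  # https://dcs.example.com/api/compliance/validate
```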

To Summarize

In the "good old days" we all had a single team of "subject matter experts" to mask data. In a company with a handful of platforms, that would probably still work. However, organizations' IT & test environments are complicated nowadays: even medium-sized organizations can have hundreds of data platforms holding gigabytes or terabytes of data. If your organization wants to be "truly" compliant, it needs to move away from traditionally centralized and serial methods. It is time to automate, federate, and parallelize your Data Ops.

Learn more about DCS.

Jane Temov

Jane is an experienced IT Environments Management & Data Evangelist. Areas of specialism include IT & Test Environment Management, Data Securitization, Release Management, Service Resilience, Configuration Management, DevOps & Infra/Cloud Migration. 

Relevant Articles

  • Data Compliance: A Detailed Guide for IT Leaders (31 March 2021, by Ukpai Ugochi)
  • What Is IT Operational Intelligence (24 March 2021, by Taurai Mutimutema)
  • What Is Data Fabrication in TDM (15 March 2021, by Carlos Schults)
  • Top TDM Metrics (19 February 2021, by Carlos Schults)
  • Structured Versus Unstructured Data (08 February 2021, by Zulaikha Greer)
  • Enterprise Environments: Understanding Deployment at Scale (04 January 2021, by Ukpai Ugochi)