The DevOps Void & Value Stream Mapping
by Niall Crawford
After all, the company just invested “zillions” in a whole bunch of great tools and a cloud framework: tools that let you automatically provision your infrastructure, applications, and data, and ensure that all your security obligations are met.
Hey! With this new “DevOps toolchain”, we should be moving our releases from request to delivery in a matter of minutes. You know … full push-button automation! … environments on demand! … and all that stuff!
Yeah! Right! But no! That’s not how it typically plays out.
In fact, a more realistic storyline goes something like this:
- The project manager raises a request for a SIT environment.
- The request sits in an IT Service Management (ITSM) queue for a few days.
- It gets approved and assigned by the test environment manager and distributed to the engineering teams.
- It sits in the team ITSM queues for another few days.
- The apps team builds the package in 5 minutes but can’t deploy because the infrastructure isn’t ready.
- The infra team provisions.
- The test team can’t start testing because the data isn’t ready.
- The data team provisions.
- Sorry, the test team is now tied up with another test cycle.
- The test team spots a defect in the build.
- A higher-priority project comes along and acquires the environment.
- Go back to go.
I think one gets the point: the issue with DevOps efficiency is rarely the atomic task.
In fact (as illustrated in the diagram below), if you were to take a step back, you would probably realise the inefficiency is not in the operations themselves (like a build task) but in the “void” (the wastage) in between.
In the above multi-process diagram, we can see the Data team takes:
- 180 minutes of operations (real value),
- 5 days of waiting (wastage),
- Or roughly 2.5% efficiency, where Efficiency = Value Operation / (Value Operation + Wastage), i.e. 180 / (180 + 7,200) minutes.
Not exactly something you want to write home about. However, it is not an atypical inefficiency, and it is a serious opportunity for improvement.
Imagine if each team could move from 2.5% to just 25%. The benefits would be enormous; over the lifetime of a project we could save weeks, maybe months, of time, which translates to earlier time to market and significant IT project cost savings.
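Assuming the 5-day wait means 5 × 24 hours (which reproduces the ~2.5% figure above), the efficiency calculation, and what a jump to 25% would mean for elapsed time, can be sketched in a few lines of Python:

```python
# Process-cycle efficiency: value-add time divided by total elapsed time.
value_minutes = 180          # actual operations (e.g. the Data team's work)
wait_minutes = 5 * 24 * 60   # 5 days of queue / wait time, in minutes

efficiency = value_minutes / (value_minutes + wait_minutes)
print(f"Efficiency: {efficiency:.1%}")   # ~2.4%, i.e. roughly 2.5%

# At a target efficiency of 25%, the same 180 minutes of work
# would need far less total elapsed time:
target = 0.25
elapsed_now = (value_minutes + wait_minutes) / 60     # hours
elapsed_target = (value_minutes / target) / 60        # hours
print(f"Elapsed now:    {elapsed_now:.0f} hours")     # 123 hours
print(f"Elapsed at 25%: {elapsed_target:.0f} hours")  # 12 hours
```

The same sum, repeated across every team in the storyline above, is where the weeks and months of savings come from.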
Enter Value Stream Mapping
Originally employed in car manufacturing, Value Stream Mapping (VSM) is a lean method that helps you define a sequence of activities, identify wastage, and ultimately improve your end-to-end processes. It is a set of methods that can be applied to any type of operation, including, of course, IT Environments & DevOps.
Wikipedia Definition: Value-stream mapping is a lean-management method for analyzing the current state and designing a future state for the series of events that take a product or service from its beginning through to the customer.
How do I go about implementing a simple Value Stream Mapping exercise for DevOps?
1. Select the Product, e.g. a CRM application.
2. Select the Delivery Process of Interest, e.g. building a test environment.
3. Gather the SMEs, as VSM is a team event.
4. Visually Map the Current State (material flow / operational steps).
5. Identify Non-Value between steps.
6. Add a timeline for both Operations (green line above) and Non-Value (red line above).
7. Review the Value Stream.
8. Design the Future State (Optimize).
9. Return to step 3.
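As a starting point for the mapping and timeline steps, the current-state stream can be modelled as plain data. The step names and timings below are hypothetical, purely to show the shape of the exercise:

```python
# A current-state value stream: (step, operation_minutes, wait_minutes_before).
# All names and durations are invented for illustration.
stream = [
    ("Raise environment request",  10,    0),
    ("ITSM approval",              15, 2880),  # sits in a queue ~2 days
    ("Build application package",   5, 1440),
    ("Provision infrastructure",   60, 1440),
    ("Load test data",            180, 2880),
]

ops = sum(op for _, op, _ in stream)        # green line: value operations
waste = sum(wait for _, _, wait in stream)  # red line: non-value waiting
print(f"Operations (value): {ops} min")
print(f"Waiting (waste):    {waste} min")
print(f"Efficiency:         {ops / (ops + waste):.1%}")
```

Even a rough model like this makes the review step concrete: the future-state design then becomes a matter of attacking the largest wait entries first.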
Tip: When getting started, steps 4 to 8 may initially be completed on a whiteboard, using guesswork in place of real data. However, for ongoing improvements, consider using tools that allow you to model your DevOps processes, track the operations, and report on stream actuals. As an example, you could use the “Visual Runsheets Manager” functionality inside the Enov8 platform.
Benefits of DevOps Value Stream Mapping
- Baseline existing Operations
- Standardise Operations
- Identify Wastage
- Highlight Operational Bottlenecks (e.g. non-automated steps)
- Lift Efficiency (through continuous improvement)
If you want to know more about how to leverage VSM in your IT Environment or DevOps world, then feel free to contact team Enov8. Enov8 is a complete solution for IT Environment & Release Operations and embraces VSM as a foundational capability in its overarching environments & operations platform, a capability that ultimately drives being “agile with discipline”, or “continuous delivery at scale”.
Innovate with Enov8
If you are interested in learning more about IT & Test Environment Management and IT Release Management, contact us about EcoSystem.
EcoSystem is a fully configurable and easily integrated solution that comes with out-of-the-box “enterprise management” functions supporting IT & Test Environment Management, Release Management, Data Management, IT Operations Management, Configuration Management & Service Management.
Niall is the Co-Founder and CIO of Enov8. He has 25 years of experience working across the IT industry from Software Engineering, Architecture, IT & Test Environment Management and Executive Leadership. Niall has worked with, and advised, many global organisations covering verticals like Banking, Defence, Telecom and Information Technology Services.