
Does DevOps need a data model?

As a product owner responsible for a number of development/CI-CD tools, I have long had the feeling that we do not leverage the data we have in all our tools, and that this data does not always align nicely with the digital products my internal customers produce.

The data should somehow be linked so that traceability improves, cross-tool engineering rules can be established, and better insight into the state of affairs can be gained. For a long time I felt alone in this, but over the last few years several organizations in the software industry seem to have reached the same conclusion. Of course, the fact that more people share a similar thought does not necessarily make it valuable. So let’s investigate.

Industry trend?

The IT4IT consortium was founded some years ago. If I am well informed, it originated from a cooperation between the Shell IT department and HPE. The Open Group later adopted the initiative as a new standard, many parties joined the effort, and version 2.1 is the latest and greatest (3.0 is expected this year). The core is a common data model that spans the whole of IT. Within that model, the “Service Model Backbone” represents the different abstractions/stages of a service and links the whole model together. How this model should be implemented, the reference architecture does not describe. It assumes that vendor-specific system(s) of record will do that for you, or that you build your own, I guess.
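To make the backbone idea concrete, here is a minimal sketch of what linked service stages could look like in code. Note that the class and attribute names below are my own simplified assumptions, not the actual IT4IT level 3 data objects.

```python
from dataclasses import dataclass

# Illustrative only: the stage names below are simplified assumptions,
# not the actual IT4IT level 3 data objects.

@dataclass
class ConceptualService:        # the service as planned in the portfolio
    service_id: str
    name: str

@dataclass
class LogicalService:           # the service as designed and built
    logical_id: str
    concept: ConceptualService  # link back to the plan stage

@dataclass
class ServiceInstance:          # the service as deployed and running
    instance_id: str
    logical: LogicalService     # link back to the design stage
    environment: str = "production"

# Because every stage references the previous one, a record in the
# "run" stage can be traced back to the original portfolio entry:
concept = ConceptualService("SVC-001", "Payment API")
logical = LogicalService("LOG-001", concept)
instance = ServiceInstance("INST-001", logical)
print(instance.logical.concept.name)  # -> Payment API
```

The point of such a backbone is exactly this chain of references: each stage stays a separate object owned by a separate part of the process, yet the whole remains navigable end to end.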

In December of last year (10–12–2020), CloudBees’ Shawn Ahmed announced two new modules for their Software Delivery Management product and made several references to a common data model as the core of the product.

“… A shared vision of a system, that does link your entire organization. That means all teams, it means all tools, all processes to a common data model and do it right out of the box. Do so with pre-built integrations to the DevOps tools you love to use every day. And at the heart of the software delivery management structure we build a system of record. Like a common data model which unifies and correlates data across your DevOps tool chain.

Now this common data model, this is the real backbone of the CloudBees Software Delivery Management, and every new module we build on this platform and roll out builds on that structure. Now once the data from your tool chain is normalized in this common data model, it gives us the ability to provide the actionable insights we need to run an efficient engineering operation and do so at any scale, especially enterprise scale.”

I do not have access to the product, and the available public documentation does not reveal what the common data model looks like, but clearly CloudBees shares the vision of the IT4IT Reference Architecture.

Mik Kersten wrote the book Project to Product, which I much enjoyed reading. Mik introduces his Flow Framework, which touches on the same subject (and many others). The framework identifies, among other things, that you require a model of artifacts (the artifact network), a landscape of tools and connections (the tool network), and an integration model between them.
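A rough way to picture these two networks and the mapping between them is sketched below. All tool, node, and field names are hypothetical examples of mine, not taken from the Flow Framework itself.

```python
# Tool network: which tools are connected to which (hypothetical landscape).
tool_network = {
    "jira": ["github"],
    "github": ["jenkins"],
    "jenkins": ["servicenow"],
}

# Artifact network: artifacts and how they relate across tools.
artifact_network = {
    ("jira", "STORY-42"): [("github", "PR-137")],       # story implemented by PR
    ("github", "PR-137"): [("jenkins", "BUILD-9")],     # PR built by pipeline
    ("jenkins", "BUILD-9"): [("servicenow", "CHG-7")],  # build deployed via change
}

# Integration model: for each connected tool pair, how fields map onto
# each other (a single invented mapping, for illustration only).
field_mapping = {
    ("jira", "github"): {"issue_key": "branch_prefix"},
}

# Following the artifact links one hop at a time reconstructs the flow:
print(artifact_network[("jira", "STORY-42")])  # -> [("github", "PR-137")]
```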

ServiceNow has an elaborate platform on which to build your own IT4IT implementation, originating from an ITIL/Service Management background. ServiceNow is a major contributor to the IT4IT Reference Architecture and is working on its own reference model for the platform, called the Common Service Data Model (CSDM). The intention is to limit the need for customer customizations. Version 3 was launched recently.

Where the IT4IT Reference Architecture is very specific and detailed about which objects there are, the attributes they have, and the relationships that should at least exist, CloudBees does not reveal much; given the company’s background, I assume its strength lies on the development side. Mik’s book discusses general concepts (a framework) and does not make concrete choices. ServiceNow’s strength lies on the service management side, and it is slowly moving in the development direction.

Why a data model?

The common denominator of the initiatives mentioned is to streamline the delivery of value downstream and the flow of feedback upstream. Value means business value: for example, increased revenue, increased customer satisfaction, decreased costs, and/or reduced risk. In this day and age of digital transformation, the gap between business value and IT value becomes smaller and smaller, and IT value therefore becomes increasingly important.

While a single DevOps team may be well aware of what it delivers and what value it brings, from its own perspective, at enterprise scale with many products and many teams it is a different ball game. In the whole process of (IT) value delivery many tools may be involved, each holding essential data. Many tools are functionally oriented, single-team oriented, project oriented, or even a mix of these dimensions. Combining products, teams, and tools leads to a multi-dimensional space of IT4IT data.

If you are in a highly regulated industry like finance, pharmaceuticals, or defense, there will be other stakeholders to satisfy than customers and employees. You may have to be able to demonstrate full traceability, from stakeholder requirement to production and from incident report back to a product backlog.
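As a sketch of what such traceability could look like once records from different tools are linked in a common model, a simple graph walk answers both questions. All identifiers and links below are made up for illustration:

```python
# Hypothetical linked records; a real implementation would load these
# from the integrated tool chain instead of hard-coding them.
links = {
    "REQ-1":    ["STORY-42"],   # stakeholder requirement -> backlog item
    "STORY-42": ["PR-137"],     # backlog item -> code change
    "PR-137":   ["BUILD-9"],    # code change -> build
    "BUILD-9":  ["DEPLOY-3"],   # build -> production deployment
    "INC-55":   ["STORY-42"],   # incident routed back to the backlog
}

def trace(start: str) -> list:
    """Follow links depth-first and return everything reachable."""
    seen, stack = [], [start]
    while stack:
        node = stack.pop()
        if node not in seen:
            seen.append(node)
            stack.extend(links.get(node, []))
    return seen

print(trace("REQ-1"))   # requirement forward to the production deployment
print(trace("INC-55"))  # incident back to the backlog item and onwards
```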

So in a situation where an enterprise-wide overview of value delivery is wanted, a data model that connects the dots may actually be a good idea.

An approach: be opinionated

If you think this is a good idea, what would your journey look like? Getting an enterprise overview of even one tool can be nearly impossible if you let every team configure the tool itself. If team autonomy with respect to tool choices and tool configuration is a stronger force in your company than enterprise or regulatory requirements, you have to let go of this idea.

Having an effective data model means being opinionated. A central governance body is required that strikes a balance between enterprise requirements and team requirements. Being opinionated means making clear choices. Clear choices need to be laid down in standards and guidelines, and standards and guidelines are to be implemented in both the organization and the tools. In other words, you have to develop your own IT4IT model. Only clear choices make it possible to map your central data model to the data in each specialized tool.
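In practice, that mapping often takes the shape of a small adapter per tool that translates tool-specific fields into the central model’s vocabulary. A minimal sketch follows; the WorkItem shape, the field names, and the raw payload format are assumptions of mine, not the real Jira REST format:

```python
from dataclasses import dataclass

@dataclass
class WorkItem:
    """One opinionated, central shape for 'a unit of work' across tools."""
    item_id: str
    product: str
    team: str
    state: str  # centrally agreed states only: "todo", "doing", "done"

# Each adapter encodes the standards and guidelines for one tool:
# which fields map where, and how local values translate to central ones.
def work_item_from_jira(raw: dict) -> WorkItem:
    states = {"To Do": "todo", "In Progress": "doing", "Done": "done"}
    return WorkItem(
        item_id=raw["key"],
        product=raw["product"],  # assumes a convention agreed per tool
        team=raw["team"],
        state=states[raw["status"]],
    )

raw = {"key": "STORY-42", "product": "payments",
       "team": "checkout", "status": "Done"}
print(work_item_from_jira(raw))
```

Note that the adapter only works because the central states and the product/team conventions were decided first; without those opinionated choices there is nothing to map onto.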

And all of the above still does not give you an actual integration between your tools; it is just a prerequisite. Some major technical choices still have to be made. Although the technical implementation can be a lot of work in itself, aligning the organization is the most challenging and biggest task. IMHO the model must be opinionated to yield the best insights, and aligning on strong opinions is hard work.

Integration

Once you are developing an opinion, your company’s IT4IT model, and are laying a basis for standards and guidelines, you can start thinking about how to integrate your tool landscape. The IT4IT Reference Architecture will not supply you with a solution. The reference architecture is described in levels of abstraction, and the detailed data model is presented at level 3. The reference architecture suggests that level 4 is to be implemented by a vendor.

“Abstraction Level 4 is where the architecture becomes more product design and implementation oriented. — IT4IT Reference Architecture”

However, most tools were developed without any common domain model in mind, and many even existed long before the inception of this reference architecture. You probably have to live with the fact that many tools are already in place in your company, each with its own conceptual model.

There are a number of approaches. You can accept your current tool landscape and build your own integration. You can accept your current tool landscape and add a tool specialized in integrating tools. Or you can search for a product or product suite whose model can form the linking pin, the backbone, and that offers its own opinionated model which you find agreeable. You will probably end up with a mix. Part of the landscape, and of the solution, should of course be a facility for gathering measurements, deriving metrics, and doing analytics, as sketched below.
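Whichever mix you choose, the measurement part can start small. Here is a sketch under the assumption that work items have already been normalized into a central model; the records and field names are invented:

```python
from datetime import date

# Hypothetical records, already normalized into the central model;
# a real implementation would read these from the integrated landscape.
work_items = [
    {"product": "payments",   "opened": date(2021, 1, 4), "closed": date(2021, 1, 12)},
    {"product": "payments",   "opened": date(2021, 1, 6), "closed": date(2021, 1, 20)},
    {"product": "onboarding", "opened": date(2021, 1, 2), "closed": date(2021, 1, 5)},
]

def mean_cycle_time(items: list, product: str) -> float:
    """Average number of days from opened to closed for one product."""
    durations = [(i["closed"] - i["opened"]).days
                 for i in items
                 if i["product"] == product]
    return sum(durations) / len(durations)

print(mean_cycle_time(work_items, "payments"))  # -> 11.0
```

A metric like this is only trustworthy because the opened/closed semantics were fixed centrally; the same calculation over un-normalized tool data would compare apples and oranges.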

Conclusion

Performing DevOps at scale, effectively maintaining your digital product portfolio, and optimizing your value delivery require insights that can only be achieved by gathering consistent, high-quality data. In a highly regulated industry, external stakeholders may demand certain insights as well. Given the average tool landscape required for value delivery, this can only be achieved by implementing a form of integration based on a common conceptual data model. You can use the IT4IT Reference Architecture as a starting point, buy a product with an implicit model, buy a tool that integrates others, or develop your own. Whatever you do, it will be hard work and it will require perseverance. When implementing, “think big, act small” is probably good advice.

This article was previously published on Medium.
