Perspective
I was fortunate to be a member of a terrific team of hard-working people who, while performing the jobs for which they were hired, also established a master data management program. The effort originated within our lead-to-cash (LTC) program; it took over a year to achieve and another year to adopt organizationally before it became an ongoing operation. We leveraged Business Architecture to help us assess the full breadth of the investment required (its alignment with strategy, its resourcing constraints, and its place among existing priorities) and to gain stakeholder support for the process.
We employed Business Architecture as a framework for comparing competing priorities for Master Data Management (MDM)
We began by updating a capability map (what an organization must be able to do to achieve its strategic objectives) for our industry, one that depicted the structure of the business we were in. We applied a maturity score to each capability for our firm, and it became clear that low maturity correlated with operational under-performance. We then identified the LTC business objects that would need to be managed in order to compete and scale effectively in this industry.
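To make that assessment concrete, here is a minimal sketch of how capability maturity scores can be tabulated alongside an operational performance indicator. The capability names, scales and numbers are illustrative assumptions, not our actual map.

```python
from dataclasses import dataclass

@dataclass
class Capability:
    name: str          # what the business must be able to do
    maturity: int      # illustrative 1 (ad hoc) .. 5 (optimized) score
    performance: float # illustrative operational performance indicator, 0..1

# Illustrative entries; a real map covers the full lead-to-cash value stream.
capabilities = [
    Capability("Opportunity Management", maturity=4, performance=0.85),
    Capability("Order Fulfillment",      maturity=2, performance=0.55),
    Capability("Entitlement Management", maturity=1, performance=0.40),
    Capability("Billing",                maturity=2, performance=0.50),
]

# Sorting by maturity makes the correlation with under-performance easy to see
# and surfaces the low-maturity capabilities as investment candidates.
for cap in sorted(capabilities, key=lambda c: c.maturity):
    print(f"{cap.name:<25} maturity={cap.maturity} performance={cap.performance:.0%}")
```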
We determined that among the LTC business objects, opportunities, deals and sales orders were already under management. That is, they were supported by institutionally adopted, dependable tools and a set of repeatable operational practices, each in some stage of emergence and adoption. There was a pervasive belief, however, that these business systems were inadequate and worth replacing. The business architecture suggested a different, less obvious, opportunity for improvement later in the LTC cycle. The operational practices, tooling and data quality in the post-deal stages of the LTC value stream (fulfillment, operations and customer support, usage monitoring, and billing) were the higher priority. These were the functions where imperfections in our processes were visible to the customer, and where investment had fallen behind the growing need for it.
Entitlements (the assignment of rights of use to customers), customers and SKUs underpin most of the operations of the company, and yet they were underserved institutionally. Management of these data elements was not systemic, even though it was critical to successfully delivering, supporting and billing customers for these services. Immaturity in these capabilities (meaning that processes, procedures and policies were not yet repeatable) compromised the integrity of the data they all shared with one another, which was a source of drag, unwanted cost and customer dissatisfaction. Each was foundational to operating and growing our business effectively, so they were early candidates for master data management.
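For readers less familiar with these business objects, the sketch below illustrates how an entitlement joins a customer to a SKU, which is why weak management of any one of the three degrades the data the others depend on. The names and fields are hypothetical, not our production schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Customer:
    customer_id: str
    name: str

@dataclass(frozen=True)
class SKU:
    sku_id: str
    description: str

@dataclass(frozen=True)
class Entitlement:
    # An entitlement grants a customer the right to use a SKU for a period.
    # Because it references both, errors in either record surface here,
    # in fulfillment, support, usage monitoring and billing alike.
    entitlement_id: str
    customer_id: str
    sku_id: str
    starts: date
    ends: date

# Hypothetical example: customer C-100 is entitled to SKU ANALYTICS-PRO for 2024.
grant = Entitlement("E-001", "C-100", "ANALYTICS-PRO", date(2024, 1, 1), date(2024, 12, 31))
```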
By assessing the demands of the enterprise in terms of capability maturity, we were able to have conversations across the firm that distinguished the urgent from the important in a non-subjective manner. This allowed us to prioritize our investments in a larger context than immediate pain relief: we could relieve pain and invest in the future in equal measure within a strategic framework. Most importantly, we were able to do so with buy-in from our stakeholders.
We executed by following a Master Data Management playbook and were successful
A legacy of fast-growing organizations is often a constellation of systems, each the purview of a distinct organization serving a different function. As is often the case, each group believed that the data it housed was ‘created here’, and claimed ‘their’ system to be the ‘right’ system, ‘the source’. These assertions were based on trust in their own systems as opposed to ‘the other’. The natural trials and tribulations of rapid growth in an innovation-fueled environment had produced a culture that had become (for lack of a better word) tribal. Thus, negotiating which of these systems would be anointed to serve as the ‘source of truth’ was inherently political.
Once a ‘system of record’ was designated, through a transparent consensus-building (and by no means linear) process, we moved on to reviewing the data from all sources, including historic systems. We would methodically gauge quality and complexity of resolution against urgency, and then perform data forensics as a prerequisite to cleansing the data. This constituted the bulk of the work.
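As an illustration of the kind of forensics involved, the sketch below profiles hypothetical extracts from several source systems for missing values and cross-source disagreement, producing the kind of findings that precede any cleansing. The system names, keys and fields are assumptions made for the example.

```python
from collections import defaultdict

# Hypothetical extracts from three source systems, keyed by a shared natural key.
sources = {
    "crm":     {"ACME-001": {"name": "Acme Corp",  "country": "US"}},
    "billing": {"ACME-001": {"name": "ACME Corp.", "country": "US"}},
    "support": {"ACME-001": {"name": "Acme Corp",  "country": None}},
}

# Collect every value observed for each (key, field) pair across all sources.
observed = defaultdict(lambda: defaultdict(set))
for system, records in sources.items():
    for key, fields in records.items():
        for field, value in fields.items():
            observed[key][field].add(value)

# Flag fields that are missing somewhere or disagree across systems; these
# findings become the forensics work list that precedes cleansing.
for key, fields in observed.items():
    for field, values in fields.items():
        if None in values or len(values - {None}) > 1:
            print(f"{key}.{field}: needs resolution, observed {sorted(values, key=str)}")
```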
One person assumed the mantle of ‘steward’ (nothing short of a sacred trust) throughout the program. Our Steward was a data engineer who quickly learned how each of the data sets was used by the business, by helping the business stakeholders who contributed, curated and consumed the data. Our Steward was tasked with coordinating the consolidation and remediation of the data on their behalf. We published our methodology up front to set expectations and then worked our way through it: preparing work lists, reporting progress metrics, and routing suspected duplicates and errors to the appropriate data producers for their help in resolution.
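A simple illustration of how suspected duplicates might be surfaced for those work lists appears below. It assumes a crude normalized-name match (a real program would use richer matching rules), and all record values are hypothetical.

```python
import re
from collections import defaultdict

def normalize(name: str) -> str:
    """Crude normalization: lowercase, strip punctuation and common legal suffixes."""
    name = re.sub(r"[^\w\s]", "", name.lower())
    return re.sub(r"\b(inc|corp|llc|ltd)\b", "", name).strip()

# Hypothetical customer records pulled from the consolidated sources.
records = [
    {"id": "C-100", "name": "Acme Corp"},
    {"id": "C-215", "name": "ACME Corp."},
    {"id": "C-330", "name": "Bravo Industries"},
]

# Group records whose normalized names collide; any group with more than one
# member becomes an item on the steward's duplicate-resolution list.
groups = defaultdict(list)
for rec in records:
    groups[normalize(rec["name"])].append(rec["id"])

for name_key, ids in groups.items():
    if len(ids) > 1:
        print(f"Suspected duplicates for '{name_key}': {ids}")
```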
Once the ‘correct’ data was identified through this process, we did two things. First, we made changes in the source systems to true up the data with the new master record; each change was logged for traceability should there be any unforeseen issues later. Second, we implemented a master data hub that would serve the master data to the downstream systems that needed it, both by exposing endpoints for credentialed ad hoc use and by syndication to automate delivery to operational systems. Weekly meetings established a cadence that maintained organizational focus and kept the primary business data users engaged for the duration of the program.
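The sketch below illustrates, with stand-in names, the two consumption paths just described: an ad hoc lookup (in practice a credentialed endpoint) and syndication to subscribed downstream systems. It is a deliberately simplified assumption of how such a hub can be wired, not the actual design we built.

```python
from typing import Callable, Dict, List

class MasterDataHub:
    """Toy master data hub holding golden records and serving two consumption paths."""

    def __init__(self) -> None:
        self._records: Dict[str, dict] = {}                        # golden records by master key
        self._subscribers: List[Callable[[str, dict], None]] = []  # downstream delivery hooks

    def publish(self, key: str, record: dict) -> None:
        # Store (or update) the golden record, then syndicate it to subscribers.
        self._records[key] = record
        for deliver in self._subscribers:
            deliver(key, record)

    def get(self, key: str) -> dict:
        # Ad hoc lookup path (in practice, a credentialed service endpoint).
        return self._records[key]

    def subscribe(self, deliver: Callable[[str, dict], None]) -> None:
        # Syndication path: downstream operational systems register a delivery hook.
        self._subscribers.append(deliver)

# Hypothetical usage: a billing system subscribes and receives master records as they change.
hub = MasterDataHub()
hub.subscribe(lambda key, rec: print(f"billing received {key}: {rec}"))
hub.publish("ACME-001", {"name": "Acme Corp", "country": "US"})
print(hub.get("ACME-001"))
```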
The benefits to the company revealed themselves in time: master data that was managed could be consumed more readily, with less time spent on quality control and rework. Master data management rendered source data clean, dependable and easily joinable with other sources of operational and transactional data, and the process itself was verifiable and constantly improving. As adoption grew, this master data was used to baseline a growing number of operational improvement efforts, as well as to enable new ones. The transparent and repeatable process proved trustworthy, increasing the value of the data by reducing the cost and risk of using it.
Bringing it all together
We employed ‘by the book’ master data management techniques throughout this process and were successful. By using Business Architecture to help us decide where to focus our efforts, we arrived at a point of view that balanced the strategic and tactical needs of the enterprise. It gave us an answer to every question of ‘why this’ among all the alternatives, and ‘why now.’ Once that rationale was clarified and codified in a charter, we incrementally enlisted business stakeholders to sponsor and contribute resources to mastering these business objects’ data and to applying institutional controls to protect these investments so they would scale with the company. Our approach gave us insights early on, helping our stakeholders to reach and advocate a defensible point of view that endured from execution through to adoption.