Knowledge Management for Master Data Policy, Standards and Controls

September 9, 2014

Policies, Standards and Controls: 

 

A hierarchy of information for Master Data Knowledge Management

 

While your company may use different words to describe each of these layers, in one way or another you are probably thinking about your data accuracy management in a hierarchical way. Approaching data management in this manner will help to structure your documentation and approval accordingly.

 

I have come to this conclusion after working with several large and complex multi-national companies struggling to convey a thorough understanding of what data quality means to different groups throughout an organization. Each group considers data quality within a different context, at perhaps a singular layer of the hierarchy, and in doing so, can miss the bigger picture.

 

The Base:

So, starting from the bottom: the controls, often called business rules by IT personnel, are very granular logic tests that one can use to ensure correct data entry or to monitor the entire database over time. These are required to produce the quality reports in whatever DQ tool you are using. One example could be: "for raw materials, the base unit of measure for materials in material group "X" must be grams if the goods issue unit is grams."
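To make the granularity concrete, here is a minimal sketch of that control as an executable QC test. This is purely illustrative: the record layout is a plain dict, and the field names and codes are assumptions, not taken from any particular DQ tool or ERP schema.

```python
# Hypothetical sketch of the control quoted above. Field names
# ("material_type", "material_group", etc.) are illustrative assumptions.

def check_base_uom(material: dict) -> bool:
    """Control: for raw materials in material group "X", the base unit
    of measure must be grams when the goods issue unit is grams."""
    if (material.get("material_type") == "raw"
            and material.get("material_group") == "X"
            and material.get("goods_issue_unit") == "G"):
        return material.get("base_unit") == "G"
    return True  # the control does not apply; the record passes by default

# A monitoring run simply applies the control over a database extract.
records = [
    {"material_type": "raw", "material_group": "X",
     "goods_issue_unit": "G", "base_unit": "G"},
    {"material_type": "raw", "material_group": "X",
     "goods_issue_unit": "G", "base_unit": "KG"},
]
failures = [r for r in records if not check_base_uom(r)]
print(len(failures))  # 1
```

Note that the test answers only "pass or fail" for one narrow condition; nothing in it explains *why* grams is required, which is exactly the knowledge the higher layers must hold.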

 

 

This control is absolutely testable but very granular, and it sheds no light on the "whys and what ifs". It is purely a QC test, and for any given element you may have many of these. The documentation for this test will still contain the essentials of QC test methods, ownership, and so on. However, in complex organizations with many controls/rules/QC tests, managing the overall strategy can be very difficult.

 

Trying to manage the knowledge only from this level is one reason companies have trouble implementing DQ monitoring tools. If you try to collect all of the rules/controls without first having a standard, the tools' outputs will be full of false positives and false negatives.

 

The Middle:

 

The Standard contains the guiding principles for the use of this field. 

 

The Standard document is "everything about the element": the overall strategy for how that element is used, when it is required (and when not), standard ownership and data accuracy ownership (often different functions), and so forth. For the above example, the Standard for Unit of Measure will define not just the allowed values for every variation (raw materials, intermediates, packaging, finished goods and indirect materials) but also the guiding principles used to determine the correct values.

 

There may be exceptions to the “Metric UoM Only” rule for raw materials for one plant in one country and for very good reasons. If those reasons are not understood, then the material using the English Base Unit looks like a duplicate (it is, but may be a justified one). I am not promoting exceptions, only acknowledging that they exist and that there must be knowledge to support the reasoning for the exception. This knowledge must be documented centrally so that when conditions change, the data (and the controls) can be changed. If the overall use and strategy for an element is documented and transparent for all to see, then change management is far easier to implement. 
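One way to keep that knowledge central and versionable is to capture the Standard as structured data. The sketch below is illustrative only; every key and value is an assumption drawn from the elements listed above, not a formal schema.

```python
# Illustrative sketch: a Standard captured as structured data so it can be
# stored, reviewed, and changed centrally. All names here are assumptions.
uom_standard = {
    "element": "Base Unit of Measure",
    "strategy": "Metric UoM only for raw materials; exceptions documented below",
    "standard_owner": "Global Data Governance",
    "accuracy_owner": "Plant Master Data Stewards",
    "allowed_values": {
        "raw": ["G", "KG"],
        "packaging": ["EA", "KG"],
    },
    "exceptions": [
        {"plant": "US01", "value": "LB",
         "reason": "Legacy supplier contracts quoted in pounds",
         "review_when": "Contracts renegotiated"},
    ],
    "controls": ["check_base_uom"],  # links down to the control layer
}

# With the exception recorded centrally, a reviewer can tell a justified
# duplicate from an error.
justified = {e["value"] for e in uom_standard["exceptions"]}
print(justified)
```

The point is not the particular format but that the exception, its reason, and its review trigger live in one documented place rather than in a power user's head.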

 

Even when there are multiple interdependencies, some level of knowledge will always need to be documented and managed at this element level. Linking up to the policy level can provide the broader context and simpler maintenance of the business operation strategy for data families.

 

Every element in your Master Data should have a standard even if that standard says, “Do not use this element/field”. 

 

The Top:

 

Policies are where groups of data elements link to the business operating strategy. Beyond the basic overall data policies (security, transparency and so forth), policies are not required for each field/data element in your master data, but there is a need to manage the knowledge for certain families of strongly interacting data elements in a holistic way.

 

When defining the strategy for addresses, for instance, managing all the address fields in a single place and distributing the result to the individual elements gives a more comprehensive view of why things are done a certain way, in an easier-to-digest format than reviewing the elements one by one. If you define the use of Name 1-4 in one place, it is easier to see the rationale. If Name 4 is always dedicated to "in-care-of" information, then a policy common to the Name 1-4 fields can state that the dedicated "in-care-of" data element is never used (or vice versa). In the end, it matters less how you manage "in-care-of" information in the address than ensuring that everyone handles it the same way.
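A policy-level check differs from a control in that it looks at the family of fields together. Here is a hedged sketch, assuming a (hypothetical) policy that reserves Name 4 for "C/O" lines and retires the dedicated in-care-of field; the field names are illustrative.

```python
# Sketch of a policy-level check over the whole Name 1-4 family.
# The policy assumed here: Name 4 holds in-care-of ("C/O ...") lines,
# and the separate in-care-of field is never used. Names are illustrative.

def check_name_policy(address: dict) -> list:
    """Return policy violations for the Name 1-4 family as a group."""
    issues = []
    name4 = address.get("name4", "")
    if name4 and not name4.upper().startswith("C/O"):
        issues.append("Name 4 is reserved for in-care-of ('C/O ...') lines")
    if address.get("in_care_of"):
        issues.append("dedicated in-care-of field must not be used; "
                      "record the value in Name 4 instead")
    return issues

print(check_name_policy({"name4": "C/O Jane Doe"}))
print(check_name_policy({"name4": "Building 7", "in_care_of": "Jane Doe"}))
```

The individual controls for Name 4 and the in-care-of field still exist; the policy is what keeps them consistent with each other.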

 

Consider a policy for MRP field management, where the interdependencies of several data elements must be managed together to optimize inventory management. Another data family with highly interdependent operational strategies is the Units of Measure (UoM), covering everything from Base UoM to production UoM and so forth. The standards are all interconnected, so managing the strategy for all of them at the policy level makes more sense than managing each in isolation. This does not eliminate the need for the data element standard; it just provides a better way to manage the strategy knowledge for the group.

 

Leaves to Trees to the Forest:

 

Another way of thinking about this: if you view the control as a leaf, the standard as a tree, and the policy as one acre of a many-acre data forest, then you can appreciate why it is critical to see each layer at the proper level of detail at the proper time.

 

Each layer contains critical knowledge that must be managed to lower the risk of knowledge loss for your enterprise, but the layers cannot be documented and managed in isolation either: redundant effort and inconsistent, even conflicting, direction for your data strategy can (and will) result.

 

Finally:

 

Small companies with very stable data managers and clear transition plans may never need to worry about managing master data knowledge this comprehensively. But in dynamic organizations with high personnel mobility that are implementing mergers, acquisitions, divestitures and new business processes and platforms, keeping the knowledge organized, and not just in the heads of a few power users, is critical for success.

Knitting these layers together is easier than it sounds, costs less than you think, and is far less expensive than letting the knowledge slip away only to have to invest to re-create it later.

 

What are your thoughts?

 

Richard A. King

 

DataIntent.com
