BPM Masterdata

Business process modelling (BPM) aims to improve organizational efficiency and quality by mapping and analyzing connecting activities involved in providing a product or service. BPM creates diagrams, called business process models, to represent the sequence of activities from beginning to end. The goal is to optimize efficiency by reducing wasted time and improving customer experience. BPM considers both IT and human processes and can involve mapping activities within and outside an organization.


business process modelling

business process modelling explanation - diagrams, definitions, examples

Business Process Modelling (BPM) is a modern term and methodology which has evolved through different stages and names, beginning during the 'division of labour' of the late 1700s, when manufacturing first moved into factories from cottage industry. More explanation is in the historical development of Business Process Modelling below.

Broadly the term 'business' in Business Process Model/Modelling/modeling is interchangeable with 'organisation'. Business Process Modelling is not only carried out in conventional businesses; the methodology is increasingly applicable to all sorts of other organisations, for example government agencies and departments, charities, mutuals and cooperatives, etc.

Confusingly, the acronym BPM can mean different things, some closely related to Business Process Modelling, others less so. 'Business Process Management' is an example of a different and related meaning. More details are in the glossary below.

Business Process Modelling is a method for improving organisational efficiency and quality. Its beginnings were in capital/profit-led business, but the methodology is applicable to any organised activity. The increasing transparency and accountability of all organisations, including public services and government, together with the modern complexity, penetration and importance of ICT (information and communications technology) for even very small organisations, has tended to heighten demand for process improvement everywhere. This means that Business Process Modelling is arguably more widely relevant than, say, Time and Motion Study or Total Quality Management (to name two earlier 'efficiency methodologies') were in times gone by.

Put simply, Business Process Modelling aims to improve business performance by optimising the efficiency of connecting activities in the provision of a product or service. Business Process Modelling techniques are concerned with 'mapping' and 'workflow' to enable understanding, analysis and positive change. Diagrams - essentially 'flow diagrams' - are a central feature of the methodology, and the diagrammatical representation of Business Process Modelling is commonly called 'notation'. Many and various proprietary software tools (off-the-shelf computer programs) exist to enable this, but the basic principles of Business Process Modelling can also be applied using a pen and a table-napkin, or a flip-chart, or a bunch of sticky notes, and in some cases these are still effective aids for creating and communicating fundamental ideas. Computers sometimes get in the way, over-complicate simple things, and exclude groups, so choose your devices wisely. Business Process Modelling generally needs support from people to work in practice.

While Business Process Modelling relates to many aspects of management (business, organisation, profit, change, projects, etc.), its detailed technical nature and process emphasis link it closely with quality management and the analytical approaches and responsibilities arising in the improvement of quality. Business Process Modelling is a quality management tool, like for example Six Sigma, and is especially useful in change management. SWOT Analysis, the Balanced Scorecard and Project Management methods provide further examples of change management tools, and Business Process Modelling can be regarded as working alongside these methods.
The term Business Process Model (also abbreviated to BPM) is the noun form of Business Process Modelling, and refers to a structural representation, description or diagram, which defines a specified flow of activities in a particular business or organisational unit.

(N.B. US-English spelling is 'organization'. 'Organisation' spelling is UK-English, and used in this article, including some other ise/ize and our/or words like labour and colour. Modelling is UK-English spelling. Modeling is US-English. )

business process modelling index

definition and purpose
a note about sequence
background and history
modelling a business process - how to - overview
creating a new business process model - how to
example business process model
glossary of main BPM terms

business process modelling definition

A Business Process Model (BPM) is commonly a diagram representing a sequence of activities. It typically shows events, actions and links or connection points, in the sequence from end to end. Sequence is significant and essential to most aspects of business process modelling, but there are exceptions to this, especially at the higher level of organisational operations (see the note about sequence).

Typically, but not necessarily, a Business Process Model includes both IT processes and people processes. Business Process Modelling by implication focuses on processes, actions and activities. Resources feature within BPM in terms of how they are processed. People (teams, departments, etc.) feature in BPM in terms of what they do, to what, and usually when and for what reasons, especially when different possibilities or options exist, as in a flow diagram.

Business Process Modelling is cross-functional, usually combining the work and documentation of more than one department in the organisation. In more complicated situations, Business Process Modelling may also include activities of external organisations' processes and systems that feed into the primary process. In large organisations, Business Process Models tend to be analysed and represented in more detail than in small organisations, due to scale and complexity.

Business Process Modelling is to an extent also defined by the various computerised tools or software which are used in applying its methods. These methods and the standard features within them continue to evolve, which means that we should keep an open and curious mind as to how BPM can be used, and what people actually mean when they refer to it.

purpose of business process modelling

A Business Process Model diagram is a tool - a means to an end, not a performance outcome in its own right. The final output is improvement in the way that the business process works. The focus of the improvements is on 'value added' actions that make the customer service and experience better, and on reducing wasted time and effort.

There are two main types of Business Process Model: the 'as is' or baseline model (the current situation) and the 'to be' model (the intended new situation), which are used to analyse, test, implement and improve the process. The aim of modelling is to illustrate a complete process, enabling managers, consultants and staff to improve the flow and streamline the process. The outcomes of a business process modelling project are essentially: value for the customer, and reduced costs for the company, leading to increased profits. Other secondary consequences arising from successful Business Process Modelling can be increased competitive advantage, market growth, and better staff morale and retention.

There are no absolute rules for the scope or extent of a Business Process Model in terms of departments and activities covered. Before committing lots of resources to Business Process Modelling, proper consideration should be given to the usefulness and focus of the exercise - ask these questions:

Does the modelling have the potential to produce gains that will justify the time and effort?
Will the modelling be structured so that people will understand the outputs (not so big and complex as to be self-defeating)?
Do people understand why we are doing it, and "what's in it for them"?

As with other management tools, there is no point producing a fantastically complex model that no-one can understand or use, just as it is daft to spend hundreds of hours analysing anything of relatively minor significance. Business Process Modelling is a powerful methodology when directed towards operations which can benefit from improvement, and when the people involved are on board and supportive.

adding value for the customer

Adding value for customers, whether internal or external customers, is at the centre of a Business Process Model. It starts with a customer need and ends with the satisfaction of that need. Unlike a workflow diagram, which is generally focused on departmental activities, a BPM spans departments and the whole organisation.

This point about customers being internal as well as external is crucial: staff are among the internal customers of modern right-minded organisations. If you approach Business Process Modelling purely from a systems and 'things' viewpoint, with a fixation on costs and profitability, and squeezing every activity to its theoretical optimum, then people (notably staff) tend to get squeezed too. Organisations work well when people enjoy and support the processes that they are required to perform, and you will only add sustainable value for your customers when you also add value for your staff. Successful BPM added value for customers is self-sustaining, because for staff it contains the magical WIIFM element (What's In It For Me).

example - BPM added value

An example could be the actions involved in processing a customer order from an internet-based mail order company, starting with a customer placing an order (the customer need):

sending IT-based information to the warehouse
stock picking, packing and recording
sending the appropriate IT-based information to the distribution hub
sending IT-based information to the accounts department
generation of an invoice
allocation and organisation of shipment for the vehicle drivers
delivery of the item and invoicing (the customer need fulfilled)

This is a simple 'high-level' example. In practice each part or sub-process (for example, stock-picking) may require a 'low-level' BPM of its own.

Please note that added value for internal customers, notably staff, does not have to be financial, as is commonly imagined by many top business executives. Consider Maslow, Herzberg, McGregor and Adams, and what these concepts teach about motivation, reward and attrition. Business Process Modelling has enormous potential to address many of the critical demotivators among staff (e.g., poor working relationships, confused structure, failure, etc.) and also many strong motivators (e.g., the quality of work itself, recognition, advancement, new responsibilities, etc.). But it needs thinking about, or it won't happen. Think beyond merely adding value for external customers and optimising efficiency and profit - make a special effort to look for added value for staff too, and then the BPM methodology will work on a much more effective level.

sequence - significance in business process modelling

Sequence can have a pivotal influence on business process activities, but sequence is not always pivotal, and indeed certain situations are best analysed from a non-sequential viewpoint. As a general guide, sequence is usually vital for elemental processes, but tends to become less significant - and to require more 'cause and effect' flexibility - when elements such as already-sequenced processes and resources are brought together in a bigger picture of organisational operations.

This Business Process Modelling summary necessarily concentrates on modelling systems which can be defined using sequence-based techniques, since at an elemental level sequence is crucial to quality and related factors of process, monitoring, management, and change. Also, at an elemental level, i.e., when a big activity is broken down into its constituent parts, sequence can have a vital effect upon the effectiveness of each of the individual processes.

However, a wider consideration is that many large-scale systems commonly contain related processes and resources for which a fixed sequence is not a specific, crucial or predictable aspect, and for which consequently it is not always possible or easy (or in many cases necessary) to define the exact sequential relationship of processes on a big systemic scale. An example could be the bringing together of separate sub-assemblies, or the buying in of stock, or the recruitment of sub-contract staff. This is especially so where demand is unpredictable. Importantly, for these sorts of related processes, a bigger priority is to focus on understanding and creating the necessary flexible connections between cause and effect relationships (see and make use of other project management tools for analysing and improving non-sequential elements), rather than trying to force a fixed sequence into the analysis or modelling approach.

As with many other tools and methodologies, be mindful of the need for flexibility: use tools and methods as far as they are helpful, but do not blindly force a tool to fit your purposes if it is inappropriate, could distort common sense, or would be too constraining, whether for planning, analysis, communications or implementation. Sequence is always an important consideration, especially when trouble-shooting. Sometimes - at any level - it can be the key to finding dramatic improvements, but sequence is not a mandatory feature, and there is no need to search for and apply sequential conditions within any stage of process modelling, or any other type of project or change management, if doing so is unhelpful.

business process modelling - background and history

The quest for standardisation and efficiency in business processes is a long one. Its history is characterised by surges of enthusiasm followed by disillusionment, when the fashionable idea of the moment tends to be dumped for a while before the next generation of process efficiency methodology makes the subject exciting again - well, as exciting as business processes can be.

CEOs, consultants and change managers get all fired up about an improvement push (mainly about profits, change and fees). And there lies the central problem: the people who actually put process improvements into practice have never been that excited by the concept. The explicit agenda is about the employer and organisation, while the benefit for ordinary employees is not immediately obvious, if apparent at all. For most staff, a new efficiency initiative looks like change, hard work and discomfort, and feels like a threat. Instead, workers ideally need to be engaged, involved and included from the start. Like other top-down initiatives and trends over the years, the most common reasons for the failure of business process improvement are poor internal marketing, poor implementation and poor follow-through. A theoretical model for success devised among senior managers rarely looks like the same thing further down the organisation.

origins of business process modelling

The origins of BPM principles can be traced back as far as Adam Smith's idea of the division of labour in manufacturing (in 'An Inquiry into the Nature and Causes of the Wealth of Nations', 1776). Originally, one person would make one item from beginning to end in a cottage industry situation. When factories became the norm, employing many people who all made items from beginning to end proved time-consuming and inefficient.

specialisation - 'division of labour' - 1776

Using the example of a pin maker, Adam Smith argued that breaking up the whole process and creating specialised tasks (or 'peculiar' tasks, as he called them) would simplify and speed up the whole process. He showed that if the different stages of the manufacture were completed by different people in a chain of activities, the result would be very much more efficient. The business process was born.

analysis of specialised tasks - 'time and motion' - early 1900s

Over a century later, Frederick Winslow Taylor (1856-1915), the US engineer and business efficiency theorist, moved thinking forward, merging his 'time study' with the 'motion study' work of Frank and Lillian Gilbreth (early US theorists on productivity and workplace science), resulting in new scientific management methods (1911) and the infamous 'time and motion' studies. These studies documented and analysed work processes with the aim of reducing the time taken and the number of actions involved in each process, improving both productivity and workers' efficiency. This was enthusiastically embraced by employers and viewed with scepticism and animosity by workers. The term 'Taylorism' still generally refers to a highly scientific and dehumanised approach to efficient operation in business, organisations, economies, etc.

work process flow - 'the one best way' - early to mid-1900s

Meanwhile, Frank Gilbreth was busy developing the first method for documenting process flow. He presented his paper 'Process Charts - First Steps to Finding the One Best Way' to the American Society of Mechanical Engineers (ASME) in 1921. By 1947, the ASME Standard for Process Charts was universally adopted, using Gilbreth's original notation.

disenchantment with the assembly line - 1930s

In the first decade of the 20th century, 'time and motion' was a familiar concept, in tune with the modern 'scientific' age. However, by 1936, disenchantment had set in, reflected in Charlie Chaplin's film Modern Times. The film satirised mass production and the assembly line, echoing cultural disillusionment with the dreary treadmill of industry during the Great Depression. It is perhaps no coincidence that theories for optimising productivity, and those who profit most from them, are more strongly questioned or criticised when the economic cycle moves into recession.

workflow - mid-1970s

Research and development of office automation flourished between 1975 and 1985. Specialist workflow technologies and the term 'workflow' were established. While BPM has its historical origins in workflow, there are two key differences. First, document-based processes performed by people are the focus of workflow systems, while BPM focuses on both people and system processes.

Second, workflow is concerned with processes within a department, while BPM addresses processes spanning the whole organisation.

the quality era - 1980s

In the 1980s, Quality, or Total Quality Management (TQM), was the fashionable management and business process theory, championed by Deming and Juran. Used initially in engineering and manufacturing, it is based on the Japanese philosophy of Kaizen, or continuous improvement. The aim was to achieve incremental improvements to processes of cost, quality, service and speed. Key aspects of Total Quality Management have now become mainstream and have been successfully adapted to suit the businesses of the 2000s; Six Sigma and Lean manufacturing are the best-known of these methodologies.

business process re-engineering (BPR) - 1990s

In the early 1990s, Business Process Re-engineering made its appearance and started to gain momentum in the business community. While TQM (at this point facing a decline in popularity) aimed to improve business processes incrementally, BPR demanded radical change to business processes and performance. Michael Hammer (a US professor of computer science) and James Champy (a successful corporate CEO, consultant and author) developed the concept in their book 'Re-engineering the Corporation: A Manifesto for Business Revolution' (1993). Hammer and Champy stated that the process was revolutionary, fast-track and drastic rather than evolutionary and incremental. It was a huge success, and organisations and consultants embraced it with fervour. The re-engineering industry grew and triumphed before it began to wane. By the end of the 1990s, BPR as a whole-organisation approach had fallen dramatically out of favour: it proved too long-winded for most organisations, was therefore poorly executed, and has consequently been sidelined as a whole-organisation approach. Critics of this completely 'new broom' methodology would say that it is impossible to start from a clean slate in an already established organisation. Other criticisms were that it was dehumanising and mechanistic, focusing on actions rather than people - Taylorism by another name. Crucially, it became associated with the terms 'delayering', 'restructuring' and 'downsizing' of organisations, all lumped together as euphemisms for layoffs - not what Hammer and Champy had envisaged.

business process modelling - 2000s

The best principles of the BPR approach still survive in BPM, on a less drastic, less brutal and more manageable scale. Lessons have been learnt. Business Process Modelling can and does work, but it must be treated with caution. The key is in the implementation. When it is conducted and implemented sensitively and inclusively, it can be good for the company, and for its staff too. For a workforce drowning in administration, much of it repeated or re-entered into multiple databases, BPM can be a great thing. It can free up time to focus on the 'value added' tasks that are empowering and rewarding: talking and listening to customers, making decisions, or doing what people are good at, rather than dealing with dull and meaningless duties. BPM is effective like any other tool can be; in the hands of an idiot it can suffocate and hinder an organisation and its people. The tool doesn't produce the results - what matters is how you use it.

modelling a business process - an overview

A Business Process Model is central to a host of other related activities, briefly outlined below. Redesigning a process and implementing it is not a speedy enterprise. It can take months and occasionally years, depending on the extent of the process and sub-processes, how many people and systems are involved, and how much of it needs to be redesigned. As the project develops, the business will change and new requirements will surface, so the approach must be flexible and frequently reviewed and re-prioritised. It is advisable to stage the project in a succession of 'builds', each one completed within a business quarter, so that it can be reviewed and measured for return on investment.

It is essential that the people involved in the process, at all levels, are engaged all the way through - not only because their input is vital, but also because they need to be fully 'on board'. Senior management buy-in is important to ensure that the resources are available to involve managers and staff members and to overcome any resistance to the change. Without this, the redesign cannot work. Focus groups, formal and informal discussions, and workshops are useful at each stage and 'build'.

stages in the development of the modelling project

Developing the models in practice follows this sequence:

Identify the process and produce an 'as is' or baseline model.
Review, analyse and update the 'as is' process model.
Design the 'to be' model.
Test and implement the 'to be' model.
Continuously update and improve the new model.

creating a business process model

This section provides a guide to creating an initial, 'as is' or baseline model - in other words, the current situation.

component parts of a BPM

An 'as is' or baseline model gives an overall picture of how the process works now. Any structural, organisational and technological weak points and bottlenecks can then be identified, along with possible improvements, at the next stage. You will need the following information before you start to construct your model:

The desired outcome of the process.
The start and end points (customer need and customer need fulfilment).
The activities that are performed.
The order of activities.
The people who perform the activities.
The documents and forms used and exchanged between functions and from customers and suppliers.

first draft

The first draft of the model will involve a lot of positioning and repositioning of events and activities, so make sure you use a method that is flexible and easily changed. Use a flipchart, pens and some sticky notes, or a whiteboard and a rubber. If you're working with a group of users, everyone needs to be able to see it.

second draft

Once you have established an agreed sequence of events, you can create it as a flowchart using generic software or specialised proprietary software. At this stage, you will need to check your model with the users by carrying out 'live' observations of the sequence in practice. People in focus groups or meetings invariably either forget their exact actions or say what should be happening rather than what does happen!

symbols and notation

The diagrammatical representation of Business Process Modelling is commonly called 'notation'. If you are using generic software, decide on the visual symbols you will use for the different activities. There is no definitive system for Business Process Modelling notation (note the small 'n'), although efforts persist to standardise one. The Business Process Modelling Notation (BPMN) system (note the upper-case 'N', since this is like a brand name) is an example of an attempt to establish a standard BPM notation system. The BPMN system is maintained by the OMG consortium (Object Management Group), which comprises a few hundred computer-related corporations. Organisations may develop their own notation systems or use the notation of their chosen proprietary software. Importantly, whatever notation system or software you use, its symbols must be understood within your own group or organisation. The example below uses four symbols that are widely understood:

IT-based activity - documentation, sending or requesting information, for example.

Decision point or Gateway - where a decision has to be made and the flow can go more than one way.

Action - to be carried out by a person in the organisation.

Event - an action or IT-based activity from an external source or carried out by the customer.

In the example model diagram below the customer service desk, accounts and shipping departments are shown as icons and text. There are many other symbols used in BPM notation. Below is just a simple example to illustrate the technique.

example of a business process model diagram

This example of a Business Process Model diagram is based on an online order-through-to-delivery process. It's a flowchart, which makes it very easy to see the process and the key elements within it. Note that while most Business Process Modelling diagrams necessarily include a strong sequencing dimension, there are circumstances where sequence is not so crucial and more of a 'mapping' perspective is appropriate. Refer to the note about sequence, and see other project management tools for examples of analysis methods which enable non-sequential activity 'mapping'.
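Since the original flowchart image is not reproduced in this text version, here is a minimal Python sketch of roughly the same order-through-to-delivery sequence, with each step labelled by the notation symbol it would carry in the diagram. The step names and structure are assumptions based on the worked order example earlier in this article, not a reproduction of the original diagram.

```python
# A rough, illustrative stand-in for the order-through-to-delivery flowchart.
# Step names and symbol types are assumptions based on the example described
# earlier in the article, not the original diagram.

ORDER_PROCESS = [
    # (step description, notation symbol)
    ("Customer places order on website",          "Event"),
    ("Send order details to warehouse",           "IT-based activity"),
    ("Pick, pack and record stock",               "Action"),
    ("Item in stock?",                            "Gateway"),  # flow can branch here
    ("Send shipment details to distribution hub", "IT-based activity"),
    ("Send order details to accounts department", "IT-based activity"),
    ("Generate invoice",                          "IT-based activity"),
    ("Allocate shipment to vehicle driver",       "Action"),
    ("Deliver item and invoice to customer",      "Event"),
]

def print_model(steps):
    """Print the modelled sequence as a simple numbered flow."""
    for number, (description, symbol) in enumerate(steps, start=1):
        print(f"{number:>2}. [{symbol}] {description}")

if __name__ == "__main__":
    print_model(ORDER_PROCESS)
```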

business process model glossary - a few selected terms and acronyms

Different organisations refer to the elements related to Business Process Modelling in different ways. This is complicated by the fact that acronyms can stand for more than one thing, and are often used interchangeably. For example, BPR stands for Business Process Re-engineering as well as Business Process Redesign. BPM itself stands for Business Process Modelling/Modeling, Business Process Model, and Business Process Management. In the healthcare sector, incidentally, BPM would more readily be interpreted to mean Beats Per Minute, relating to pulse rate, which emphasises the need to explain acronyms when you use them. Organisations develop their own ways of referring to the different elements: they know what they mean, but someone from another organisation could become very confused! This glossary may help anyone feeling lost. Broadly, the term 'business' below is interchangeable with 'organisation'; Business Process Modelling is not only carried out in conventional businesses, and the methodology is increasingly applicable to all sorts of other organisations, for example government agencies and departments, charities, mutuals and cooperatives, etc.

'As is' and 'To be' models - The two common perspectives of a modelling exercise: where are we now, and where do we want to be? The 'as is' or baseline model is an accurate depiction of what actually happens now. Once the model is developed, it is used to analyse and improve the process. The 'to be' model is a proposed diagram of how the future process could look, incorporating improvements. This is used to demonstrate, model and test the new process, and then to implement it.

Brainstorming - Not usually part of BPM technical language, but actually a very useful initial stage in mapping or attempting to represent/understand/agree/scope a BPM project given little or no information to begin with. Also a useful way to achieve essential involvement, input, support, etc., from people affected by the modelling exercise.

Business Architecture - A vague and widely used term basically referring to the structure of a business.

Business Model - A vague term used to refer to how a business aims to operate and make money in a given market. This term is not directly related to Business Process Modelling. A detailed business model might typically contain descriptions of basic business processes implied or necessary for the model to operate, but a business model is mainly concerned with strategy and external market relationships, rather than the internal processes which feature in BPM.

Business Process - A structured series of work activities, IT interventions and events that generate one complete service or product for an organisation's customers.

Business Process Model - A representation - usually computer-generated and diagrammatic, but possibly a low-tech whiteboard or flipchart with marker pens and sticky notes - of a process within a business. Two models are usually produced: an 'as is' and a 'to be'. The process(es) featured in a Business Process Model can be very simple or highly complex, and will typically involve different departments working (hopefully) together while the provision or creation of a product or service flows through different stages and decision points in an organisation on its way to the customer. A Business Process Model for a large process can be composed of other smaller modelled processes which contribute to the whole. In theory

an entire huge business can be modelled, although for the modelling to be useful and meaningful to people it is normally built in sections, each representing a self-contained process alongside potentially scores, hundreds or even thousands of others, all inter-relating - hopefully smoothly, efficiently and enjoyably. (The 'enjoyable' part is not a technical necessity, but is actually important for any model to translate from theory into sustainable practice.)

Business Process Modelling/Business Process Modeling - The term which refers to the methodology and techniques of producing a Business Process Model, or several Business Process Models, in the course of business improvement/development, quality management, change management, etc. (Modelling is UK-English; Modeling is US-English.)

Business Process Change Cycle - An overall term for the life cycle of business processes, including the external environment in which the organisation operates. This external environment drives change in the business processes, and the organisation responds by adjusting its strategy and goals. As the external environment keeps changing, the cycle also changes, prompting continuous change and improvement to business processes.

Business Reference Model - A (usually computerised and diagrammatic) key to understanding and using a Business Architecture Model. It presents certain core structural elements as fixed, thereby encouraging and enabling others using or developing the model to understand and adhere to essential aspects of structural policy and foundation. In this respect a Business Reference Model may be relevant to Business Process Modelling.

BPR - Business Process Re-engineering (also known as BPI - Business Process Innovation) - A radical approach to restructuring an organisation in every area, starting with what the organisation is trying to achieve, rethinking its core processes and redesigning every one. It is a way of reassessing and restructuring the whole organisation, all at once, starting from scratch.

BPI - Business Process Improvement - This refers to improving existing processes, continuously and incrementally, reducing waste and driving efficiency. Six Sigma is currently one of the most popular of many BPI approaches in use today.

BPM - Business Process Management - Used in two different ways by two different groups within the business processing community. Firstly, it is used by the people and process management group to describe the overall management of business process improvement, aligning processes with an organisation's strategic goals: designing, implementing and measuring them, and educating managers to use them effectively. Secondly, it is used by IT people to describe the systems, software and applications that provide the process framework.

BPM - Business Process Mapping - Often used interchangeably with Business Process Modelling, Business Process Mapping is also used to mean documenting all the processes in the business, showing the relationships between them. This provides a comprehensive visual overview of the processes in an organisation.

BPM - Business Process Model/Business Process Modelling - See Business Process Model/Modelling above.

BPMN - Business Process Modeling Notation - A 'branded' Business Process Modelling notation system of the OMG Consortium, representing several hundred primarily US computer-related corporations. (UK-English 'Modelling'.)
BPR - Business Process Redesign - Rethinking, redesigning and implementing one complete process using Business Process Modelling tools.

Enterprise Architecture - Basically the same as Business Architecture. Enterprise is a more modern term for a business organisation or company than 'business', probably because business has quite specific associations with profit and shareholders, whereas the word enterprise can more loosely encompass all sorts of business-like activities which might be constituted according to mutual or cooperative rather than traditional capitalistic aims. Enterprise is also a popular way to refer to business development and entrepreneurial creativity. The relevance of all this to BPM is merely the use of the word enterprise in BPM terminology, where previously the word 'business' would have been used.

Gateway - A stage in a Business Process Model diagram or notation at which a decision or choice is made because more than one main option or outcome exists.

Notation - The technical term for a Business Process Model diagram or computer-generated map or flowchart.

OBASHI - A methodology, and related aspect of BPM, for mapping and developing how IT systems relate to organisational operations (OBASHI stands for Ownership, Business Processes, Applications, Systems, Hardware, and Infrastructure).

UML - Unified Modeling Language - A visual representation/design system for software-led modelling, overseen by the OMG Consortium (as with BPMN).

What if? (scenario) - A popular term given to discussion or modelling of possible shapes, structures, resourcing and any other options that are available to people considering change in businesses and organisations. The 'What if?' principle extends far beyond business processes, but is a useful technique in team-working and in attempting to make BPM methods more consultative and involving. This is especially important given that the nature of BPM (the computer systems, terminology, highly detailed aspects) often tends to position the methods as a lone job, away from the people and groups affected by its implications and opportunities. BPM works best when people are involved - considering questions like 'What if?' - and often fails when it is guarded and developed secretively by a technocrat minority.

Value-added/Added-value - The principle of increasing the usefulness, attractiveness and benefits of a product or service, which in the context of BPM ideally improves progressively with each modelling exercise. Added value is commonly represented as benefiting customers and shareholders (via reduced costs, and increased efficiencies and profits), but should also benefit staff/employees too.

Zachman Framework - In this list mainly because an interesting listing under the letter Z is irresistible. This is a computerised diagrammatical notation system for representing an enterprise (or business or other organisation) - notably its 'enterprise architecture'. It was devised by John Zachman, a US ex-naval officer and computer scientist, while working for IBM in the 1980s.

Master Data

Introduction
The pain that organizations are experiencing around consistent reporting, regulatory compliance, strong interest in Service-Oriented Architecture (SOA), and Software as a Service (SaaS) has prompted a great deal of interest in Master Data Management (MDM). This paper explains what MDM is, why it is important, and how to manage it, while identifying some of the key MDM management patterns and best practices that are emerging. This paper is a high-level treatment of the problem space. In subsequent papers, we will drill down into the technical and procedural issues involved in Master Data Management.

What Is Master Data?


Most software systems have lists of data that are shared and used by several of the applications that make up the system. For example, a typical ERP system as a minimum will have a Customer Master, an Item Master, and an Account Master. This master data is often one of the key assets of a company. It's not unusual for a company to be acquired primarily for access to its Customer Master data.

Rudimentary Definitions
There are some very well-understood and easily identified master-data items, such as "customer" and "product." In fact, many define master data by simply reciting a commonly agreed upon master-data item list, such as: customer, product, location, employee, and asset. But how you identify elements of data that should be managed by a master-data management system is much more complex and defies such rudimentary definitions. In fact, there is a lot of confusion around what master data is and how it is qualified, necessitating a more comprehensive treatment. There are essentially five types of data in corporations:

Unstructured - Data found in e-mail, white papers like this, magazine articles, corporate intranet portals, product specifications, marketing collateral, and PDF files.

Transactional - Data related to sales, deliveries, invoices, trouble tickets, claims, and other monetary and non-monetary interactions.

Metadata - Data about other data. It may reside in a formal repository or in various other forms, such as XML documents, report definitions, column descriptions in a database, log files, connections, and configuration files.

Hierarchical - Data that stores the relationships between other data. It may be stored as part of an accounting system or separately as descriptions of real-world relationships, such as company organizational structures or product lines. Hierarchical data is sometimes considered a super MDM domain, because it is critical to understanding and sometimes discovering the relationships between master data.

Master - Master data are the critical nouns of a business and fall generally into four groupings: people, things, places, and concepts. Further categorizations within those groupings are called subject areas, domain areas, or entity types. For example, within people, there are customer, employee, and salesperson. Within things, there are product, part, store, and asset. Within concepts, there are things like contract, warranty, and license. Finally, within places, there are office locations and geographic divisions. Some of these domain areas may be further divided. Customer may be further segmented, based on incentives and history: a company may have normal customers, as well as premiere and executive customers. Product may be further segmented by sector and industry; the requirements, life cycle, and CRUD cycle for a product in the Consumer Packaged Goods (CPG) sector are likely very different from those of the clothing industry. The granularity of domains is essentially determined by the magnitude of differences between the attributes of the entities within them.
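As a rough illustration of these groupings, the sketch below models the grouping/subject-area taxonomy in Python. The grouping and subject-area names follow the paper's examples; the structure itself is just one hypothetical way to represent the taxonomy, not anything the paper prescribes.

```python
# A minimal, hypothetical sketch of the master-data groupings described above.
# Grouping and subject-area names follow the paper's examples; the structure
# is one possible representation, chosen only for illustration.

MASTER_DATA_TAXONOMY = {
    "people":   ["customer", "employee", "salesperson"],
    "things":   ["product", "part", "store", "asset"],
    "concepts": ["contract", "warranty", "license"],
    "places":   ["office location", "geographic division"],
}

# Subject areas may be segmented further, e.g. customer tiers:
CUSTOMER_SEGMENTS = ["normal", "premiere", "executive"]

def grouping_of(entity_type: str) -> str:
    """Return the grouping a subject area belongs to, or raise KeyError."""
    for grouping, subject_areas in MASTER_DATA_TAXONOMY.items():
        if entity_type in subject_areas:
            return grouping
    raise KeyError(f"unknown entity type: {entity_type}")

if __name__ == "__main__":
    print(grouping_of("customer"))  # -> people
```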

Deciding What to Manage


While identifying master data entities is pretty straightforward, not all data that fits the definition for master data should necessarily be managed as such. This paper narrows the definition of master data to the following criteria, all of which should be considered together when deciding if a given entity should be treated as master data.

Behavior
Master data can be described by the way that it interacts with other data. For example, in transaction systems, master data is almost always involved with transactional data. A customer buys a product. A vendor sells a part, and a partner delivers a crate of materials to a location. An employee is hierarchically related to their manager, who reports up through a manager (another employee). A product may be a part of multiple hierarchies describing its placement within a store. This relationship between master data and transactional data may be fundamentally viewed as a noun/verb relationship. Transactional data capture the verbs, such as sale, delivery, purchase, email, and revocation; master data are the nouns. This is the same relationship that data-warehouse facts and dimensions share.
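To make the noun/verb distinction concrete, here is a minimal sketch in which master records are the nouns and a transaction references them by key, just as a fact row references dimension rows. The field names are illustrative assumptions, not from the paper.

```python
# Minimal sketch of the noun/verb relationship between master and
# transactional data. Field names are illustrative assumptions.

from dataclasses import dataclass
from datetime import date

@dataclass
class Customer:          # master data: a noun
    customer_id: str
    name: str

@dataclass
class Product:           # master data: a noun
    product_id: str
    description: str

@dataclass
class Sale:              # transactional data: a verb
    sale_id: str
    customer_id: str     # references the Customer master (like a fact -> dimension key)
    product_id: str      # references the Product master
    sale_date: date
    amount: float

# A sale only has meaning through the master records it points at:
alice = Customer("C001", "Alice Example")
widget = Product("P100", "Widget, blue")
sale = Sale("S900", alice.customer_id, widget.product_id, date(2024, 1, 15), 19.99)
```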

Life Cycle
Master data can be described by the way that it is created, read, updated, deleted, and searched. This life cycle is called the CRUD cycle and is different for different master-data element types and companies. For example, how a customer is created depends largely upon a company's business rules, industry segment, and data systems. One company may have multiple customer-creation vectors, such as through the Internet, directly through account representatives, or through outlet stores. Another company may only allow customers to be created through direct contact over the phone with its call center. Further, how a customer element is created is certainly different from how a vendor element is created. The following table illustrates the differing CRUD cycles for four common master-data subject areas.

Sample CRUD cycle

|         | Customer | Product | Asset | Employee |
|---------|----------|---------|-------|----------|
| Create  | Customer visit, such as to Web site or facility; account created | Product purchased or manufactured; SCM involvement | Unit acquired by opening a PO; approval process necessary | HR hires; numerous forms, orientation, benefits selection, asset allocations, office assignments |
| Read    | Contextualized views based on credentials of viewer | Periodic inventory catalogues | Periodic reporting purposes, figuring depreciation, verification | Office access, reviews, insurance claims, immigration |
| Update  | Address, discounts, phone number, preferences, credit accounts | Packaging changes, raw-materials changes | Transfers, maintenance, accident reports | Immigration status, marriage status, level increase, raises, transfers |
| Destroy | Death, bankruptcy, liquidation, do-not-call | Canceled, replaced, no longer available | Obsolete, sold, destroyed, stolen, scrapped | Termination, death |
| Search  | CRM system, call-center system, contact-management system | ERP system, orders-processing system | GL tracking, asset DB management | HR LOB system |
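As a minimal sketch of what a CRUD interface over a customer master might look like in code, consider the following. The class and method names are assumptions made for illustration; a real master-data hub exposes far richer APIs with validation, auditing, versioning, and survivorship rules.

```python
# Minimal, illustrative CRUD interface over an in-memory customer master.
# Names and structure are assumptions, not any particular product's API.

from typing import Dict, List, Optional

class CustomerMaster:
    def __init__(self) -> None:
        self._records: Dict[str, dict] = {}

    def create(self, customer_id: str, attributes: dict) -> None:
        if customer_id in self._records:
            raise ValueError(f"{customer_id} already exists")
        self._records[customer_id] = dict(attributes)

    def read(self, customer_id: str) -> Optional[dict]:
        return self._records.get(customer_id)

    def update(self, customer_id: str, changes: dict) -> None:
        self._records[customer_id].update(changes)

    def destroy(self, customer_id: str) -> None:
        # Master records are usually deactivated rather than physically
        # deleted, so their history remains available.
        self._records[customer_id]["active"] = False

    def search(self, **criteria) -> List[dict]:
        return [r for r in self._records.values()
                if all(r.get(k) == v for k, v in criteria.items())]
```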

Cardinality
As cardinality (the number of elements in a set) decreases, the likelihood of an element being treated as a master-data element decreases, even for a commonly accepted subject area such as customer. For example, if a company has only three customers, it most likely would not consider those customers master data - at least, not in the context of supporting them with a master-data management solution - simply because there is no benefit to managing those customers with a master-data infrastructure. Yet, a company with thousands of customers would consider Customer an important subject area, because of the concomitant issues and benefits around managing such a large set of entities. The customer value to each of these companies is the same: both rely upon their customers for business. One needs a customer master-data solution; the other does not. Cardinality does not change the classification of a given entity type; however, the importance of having a solution for managing an entity type increases as its cardinality increases.

Lifetime
Master data tends to be less volatile than transactional data. As it becomes more volatile, it typically is considered more transactional. For example, some might consider "contract" a master-data element; others might consider it a transaction. Depending on the lifespan of a contract, it can go either way. An agency promoting professional athletes might consider their contracts master data: each is different from the others and typically has a lifetime of greater than a year. It may be tempting to simply have one master-data item called "athlete." However, athletes tend to have more than one contract at any given time: one with their team and others with companies for endorsing products. The agency would need to manage all those contracts over time, as elements of the contracts are renegotiated or athletes are traded. Other contracts - for example, contracts for detailing cars or painting a house - are more like transactions: they are one-time, short-lived agreements to provide services for payment and are typically fulfilled and destroyed within hours.

Complexity
Simple entities, even valuable entities, are rarely a challenge to manage and are rarely considered master-data elements. The less complex an element, the less likely the need to manage change for that element. Typically, such assets are simply collected and tallied. For example, Fort Knox likely would not track information on each individual gold bar stored there, but rather only keep a count of them. The value of each gold bar is substantial, the cardinality high, and the lifespan long; yet, the complexity is low.

Value
The more valuable the data element is to the company, the more likely it will be considered a master data element. Value and complexity work together.

Volatility
While master data is typically less volatile than transactional data, entities with attributes that do not change at all typically do not require a master-data solution. For example, rare coins would seem to meet many of the criteria for master-data treatment. A rare-coin collector would likely have many rare coins, so cardinality is high. They are valuable. They are also complex: rare coins have a history and description, with attributes such as condition of obverse, reverse, legend, inscription, rim, and field, as well as designer initials, edge design, layers, and portrait. Yet, rare coins do not need to be managed as a master-data item, because they don't change over time - or, at least, they don't change enough. More information may need to be added as the history of a particular coin is revealed, or certain attributes may need to be corrected. But, generally speaking, rare coins would not be managed through a master-data management system, because they are not volatile enough to warrant a solution.

Reuse
One of the primary drivers of master-data management is reuse. For example, in a simple world, the CRM system would manage everything about a customer and never need to share any information about the customer with other systems. However, in today's complex environments, customer information needs to be shared across multiple applications. That's where the trouble begins. Because, for a number of reasons, access to a master datum is not always available, people start storing master data in various locations, such as spreadsheets and application private stores. There are still reasons, such as data-quality degradation and decay, to manage master data that is not reused across the enterprise. However, if a master-data entity is reused in multiple systems, it's a sure bet that it should be managed with a master-data management system.

To summarize, while it is simple to enumerate the various master-data entity types, it is sometimes more challenging to decide which data items in a company should be treated as master data. Often, data that does not normally comply with the definition for master data may need to be managed as such, and data that does comply with the definition may not. Ultimately, when deciding which entity types should be treated as master data, it is better to categorize them in terms of their behavior and attributes within the context of the business needs than to rely on simple lists of entity types.

Why Should I Manage Master Data?


Because it is used by multiple applications, an error in master data can cause errors in all the applications that use it. For example, an incorrect address in the customer master might mean orders, bills, and marketing literature are all sent to the wrong address. Similarly, an incorrect price on an item master can be a marketing disaster, and an incorrect account number in an account master can lead to huge fines or even jail time for the CEO - a career-limiting move for the person who made the mistake!

Here is a typical master-data horror story: A credit-card customer moves from 2847 North 9th St. to 1001 11th St. North. The customer changed his billing address immediately, but did not receive a bill for several months. One day, the customer received a threatening phone call from the credit-card billing department, asking why the bill has not been paid. The customer verifies that they have the new address, and the billing department verifies that the address on file is 1001 11th St. N. The customer asks for a copy of the bill, to settle the account. After two more weeks without a bill, the customer calls back and finds the account has been turned over to a collection agency.

This time, they find out that even though the address in the file was 1001 11th St. N, the billing address is 101 11th St. N. After a bunch of phone calls and letters between lawyers, the bill finally gets resolved, and the credit-card company has lost a customer for life. In this case, the master copy of the data was accurate, but another copy of it was flawed. Master data must be both correct and consistent.

Even if the master data has no errors, few organizations have just one set of master data. Many companies grow through mergers and acquisitions. Each company you acquire comes with its own customer master, item master, and so forth. This would not be bad if you could just union the new master data with your current master data, but unless the company you acquire is in a completely different business in a faraway country, there's a very good chance that some customers and products will appear in both sets of master data - usually, with different formats and different database keys. If both companies use the Dun & Bradstreet number or Social Security number as the customer identifier, discovering which customer records are for the same customer is a straightforward issue; but that seldom happens. In most cases, customer numbers and part numbers are assigned by the software that creates the master records, so the chances of the same customer or the same product having the same identifier in both databases are pretty remote. Item masters can be even harder to reconcile, if equivalent parts are purchased from different vendors with different vendor numbers.

Merging master lists together can be very difficult. The same customer may have different names, customer numbers, addresses, and phone numbers in different databases. For example, William Smith might appear as Bill Smith, Wm. Smith, and William Smithe. Normal database joins and searches will not be able to resolve these differences. A very sophisticated tool that understands nicknames, alternate spellings, and typing errors will be required. The tool will probably also have to recognize that different name variations can be resolved, if they all live at the same address or have the same phone number. (A rough sketch of such matching logic appears after the following list.) While creating a clean master list can be a daunting challenge, there are many positive benefits to your bottom line from a common master list:

A single, consolidated bill saves money and improves customer satisfaction.
Sending the same marketing literature to a customer from multiple customer lists wastes money and irritates the customer.
Before you turn a customer account over to a collection agency, it would be good to know if they owe other parts of your company money or, more importantly, that they are another division's biggest customer.
Stocking the same item under different part numbers is not only a waste of money and shelf space, but can potentially lead to artificial shortages.
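As promised above, here is a minimal sketch of the kind of matching logic such a tool applies: fuzzy name similarity corroborated by an attribute such as phone number or address. This is an illustrative toy built on Python's standard library, not how any particular MDM product works; real matching engines add nickname dictionaries, phonetic encodings, and probabilistic scoring.

```python
# Toy record-matching sketch: fuzzy name similarity corroborated by
# phone/address. Illustrative only; real MDM matching engines are far
# more sophisticated (nicknames, phonetics, probabilistic models).

from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Crude string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def likely_same_customer(rec_a: dict, rec_b: dict,
                         name_threshold: float = 0.7) -> bool:
    """Treat two records as the same customer if their names are roughly
    similar AND a corroborating attribute (phone or address) matches.
    The 0.7 threshold is an arbitrary choice for this toy example."""
    similar_names = name_similarity(rec_a["name"], rec_b["name"]) >= name_threshold
    corroborated = (rec_a.get("phone") == rec_b.get("phone")
                    or rec_a.get("address") == rec_b.get("address"))
    return similar_names and corroborated

a = {"name": "William Smith", "phone": "555-0100", "address": "12 Oak St"}
b = {"name": "Wm. Smith",     "phone": "555-0100", "address": "12 Oak Street"}
print(likely_same_customer(a, b))  # True: similar names, same phone
```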

The recent movements toward SOA and SaaS make Master Data Management a critical issue. For example, if you create a single customer service that communicates through well-defined XML messages, you may think you have defined a single view of your customers. But if the same customer is stored in five databases with three different addresses and four different phone numbers, what will your customer service return? Similarly, if you decide to subscribe to a CRM service provided through SaaS, the service provider will need a list of customers for their database. Which one will you send them? For all these reasons, maintaining a high-quality, consistent set of master data for your organization is rapidly becoming a necessity. The systems and processes required to maintain this data are known as Master Data Management.
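The "which address do you return?" problem is usually handled by survivorship rules that pick the best value from the conflicting copies. The sketch below shows one commonly used rule, most-recently-updated-source-wins, purely as an illustration; the paper does not prescribe a specific rule, and real implementations also weigh source reliability and completeness.

```python
# Illustrative survivorship rule: when several systems hold conflicting
# values for the same customer attribute, keep the most recently updated
# one. Real MDM hubs combine recency with source trust and completeness.

from datetime import date

# Hypothetical copies of one customer's address from three systems.
candidate_addresses = [
    {"source": "CRM",     "value": "1001 11th St. N",    "updated": date(2024, 3, 1)},
    {"source": "Billing", "value": "101 11th St. N",     "updated": date(2023, 7, 9)},
    {"source": "ERP",     "value": "2847 North 9th St.", "updated": date(2022, 1, 2)},
]

def survive(candidates: list) -> dict:
    """Pick the winning value: the most recent update wins."""
    return max(candidates, key=lambda c: c["updated"])

winner = survive(candidate_addresses)
print(winner["source"], winner["value"])  # CRM 1001 11th St. N
```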

What Is Master Data Management?


For purposes of this article, we define Master Data Management (MDM) as the technology, tools, and processes required to create and maintain consistent and accurate lists of master data. There are a couple of things worth noting in this definition. One is that MDM is not just a technological problem; in many cases, fundamental changes to business process will be required to maintain clean master data, and some of the most difficult MDM issues are more political than technical. The second thing to note is that MDM includes both creating and maintaining master data. Investing a lot of time, money, and effort in creating a clean, consistent set of master data is a wasted effort unless the solution includes tools and processes to keep the master data clean and consistent as it is updated and expanded.

While MDM is most effective when applied to all the master data in an organization, in many cases the risk and expense of an enterprise-wide effort are difficult to justify. It may be easier to start with a few key sources of master data and expand the effort, once success has been demonstrated and lessons have been learned. If you do start small, you should include an analysis of all the master data that you might eventually want to include, so you do not make design decisions or tool choices that will force you to start over when you try to incorporate a new data source. For example, if your initial customer-master implementation only includes the 10,000 customers your direct-sales force deals with, you don't want to make design decisions that will preclude adding your 10,000,000 Web customers later.

An MDM project plan will be influenced by requirements, priorities, resource availability, time frame, and the size of the problem. Most MDM projects include at least these phases:

1. Identify sources of master data. This step is usually a very revealing exercise; some companies find they have dozens of databases containing customer data that the IT department did not know existed.

2. Identify the producers and consumers of the master data. Which applications produce the master data identified in the first step, and (generally more difficult to determine) which applications use it? Depending on the approach you use for maintaining the master data, this step might not be necessary. For example, if all changes are detected and handled at the database level, it probably does not matter where the changes come from.

3. Collect and analyze metadata about your master data. For all the sources identified in step one, what are the entities and attributes of the data, and what do they mean? This should include the attribute name, datatype, allowed values, constraints, default values, dependencies, and who owns the definition and maintenance of the data. The owner is the most important item and often the hardest to determine. If you have a repository loaded with all your metadata, this step is easy; if you have to start from database tables and source code, it could be a significant effort.

4. Appoint data stewards. These should be people with knowledge of the current source data and the ability to determine how to transform the source into the master-data format. In general, stewards should be appointed from the owners of each master-data source, the architects responsible for the MDM systems, and representatives from the business users of the master data.

5. Implement a data-governance program and data-governance council. This group must have the knowledge and authority to make decisions on how the master data is maintained, what it contains, how long it is kept, and how changes are authorized and audited. Hundreds of decisions must be made in the course of a master-data project, and if there is not a well-defined decision-making body and process, the project can fail because politics prevent effective decision making.

6. Develop the master-data model. Decide what the master records look like: what attributes are included, what size and datatype they are, what values are allowed, and so forth. This step should also include the mapping between the master-data model and the current data sources. It is normally both the most important and the most difficult step in the process. If you try to make everybody happy by including all the source attributes in the master entity, you often end up with master data that is too complex and cumbersome to be useful. For example, if you cannot decide whether weight should be in pounds or kilograms, one approach would be to include both (WeightLb and WeightKg). While this might make people happy, you are wasting megabytes of storage for numbers that can be calculated in microseconds, as well as running the risk of creating inconsistent data (WeightLb = 5 and WeightKg = 5); a small sketch of the canonical-unit alternative appears after this list. While this is a fairly trivial example, a bigger version of the same issue would be maintaining multiple part numbers for the same part. As in any committee effort, there will be fights and deals resulting in sub-optimal decisions, so it's important to work out the decision process, priorities, and final decision maker in advance to make sure things run smoothly.

7. Choose a toolset. You will need to buy or build tools to create the master lists by cleaning, transforming, and merging the source data, and you will also need an infrastructure to use and maintain the master list. These functions are covered in detail later in the paper. You can use a single toolset from a single vendor for all of these functions, or you might want to take a best-of-breed approach. In general, the techniques for cleaning and merging data differ for different types of data, so there are not many tools that span the whole range of master data. The two main categories are Customer Data Integration (CDI) tools for creating the customer master and Product Information Management (PIM) tools for creating the product master; some tools will do both, but generally they are better at one or the other. The toolset should also support finding and fixing data-quality issues and maintaining versions and hierarchies. Versioning is a critical feature, because understanding the history of a master-data record is vital to maintaining its quality and accuracy over time. For example, if a merge tool combines two records for John Smith in Boston, and you decide there really are two different John Smiths in Boston, you need to know what the records looked like before they were merged in order to "unmerge" them.

8. Design the infrastructure. Once you have clean, consistent master data, you will need to expose it to your applications and provide processes to manage and maintain it. This step is a big enough issue that I devote a section to it later in the document. When this infrastructure is implemented, a number of applications will depend on it being available, so reliability and scalability are important considerations in your design. In most cases, you will have to implement significant parts of the infrastructure yourself, because it must be designed to fit into your current infrastructure, platforms, and applications.

9. Generate and test the master data. This step is where you use the tools you have developed or purchased to merge your source data into your master-data list. It is often an iterative process that requires tinkering with rules and settings to get the matching right, and it requires a lot of manual inspection to ensure that the results are correct and meet the requirements established for the project. No tool will get the matching right 100 percent of the time, so you will have to weigh the consequences of false matches versus missed matches to determine how to configure the matching tools. False matches can lead to customer dissatisfaction if bills are inaccurate or the wrong person is arrested; too many missed matches make the master data less useful, because you are not getting the benefits you invested in MDM to get.

10. Modify the producing and consuming systems. Depending on how your MDM implementation is designed, you might have to change the systems that produce, maintain, or consume master data to work with the new source of master data. If the master data is used in a system separate from the source systems (a data warehouse, for example), the source systems might not have to change. If the source systems are going to use the master data, however, changes will likely be required: either the source systems will have to access the new master data, or the master data will have to be synchronized with the source systems so that they have a copy of the cleaned-up master data to use. If it's not possible to change one or more of the source systems, either that system will not be able to use the master data, or the master data will have to be integrated with that system's database through external processes, such as triggers and SQL commands. Source systems that generate new records should be changed to look up existing master records before creating new ones or updating existing ones; this ensures that the quality of data generated upstream is good, so that the MDM system can function more efficiently and the application itself manages data quality. MDM should be leveraged not only as a system of record, but also as an application that promotes cleaner and more efficient handling of data across all applications in the enterprise. As part of an MDM strategy, all three pillars of data management need attention: data origination, data management, and data consumption. It is not possible to have a robust enterprise-level MDM strategy if any one of these aspects is ignored.

11. Implement the maintenance processes. As stated earlier, any MDM implementation must incorporate tools, processes, and people to maintain the quality of the data. All master data must have a data steward who is responsible for ensuring its quality. The data steward is normally a business person who has knowledge of the data, can recognize incorrect data, and has the knowledge and authority to correct any issues. The MDM infrastructure should include tools that help the data steward recognize issues and simplify corrections. A good data-stewardship tool should point out questionable matches that were made (customers with different names and customer numbers that live at the same address, for example). The steward might also want to review items that were added as new because the match criteria were close but below the threshold. It is important for the data steward to be able to see the history of changes made to the data by the MDM system, in order to isolate the source of errors and undo incorrect changes. Maintenance also includes the processes that pull changes and additions into the MDM system and distribute the cleansed data to the required places.
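As promised in step 6, here is a minimal sketch of the canonical-unit alternative: store weight once, in kilograms, and derive pounds on demand. The field and function names are illustrative assumptions, not drawn from any particular MDM product.

    # Sketch: persist one canonical weight (kilograms) and derive pounds,
    # so contradictions like WeightLb = 5 alongside WeightKg = 5 cannot occur.
    LB_PER_KG = 2.20462  # standard conversion factor

    def weight_lb(weight_kg: float) -> float:
        """Derive the pound value from the canonical kilogram value."""
        return weight_kg * LB_PER_KG

    print(round(weight_lb(5.0), 2))  # 11.02 -- always consistent with WeightKg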

As you can see, MDM is a complex process that can go on for a long time. Like most things in software, the key to success is to implement MDM incrementally, so that the business realizes a series of short-term benefits while the complete project is a long-term process. No MDM project can be successful without the support and participation of the business users. IT professionals do not have the domain knowledge to create and maintain high-quality master data. Any MDM project that does not include changes to the processes that create, maintain, and validate master data is likely to fail. The rest of this paper will cover the details of the technology and processes for creating and maintaining master data.

How Do I Create a Master List?


Whether you buy a tool or decide to roll your own, there are two basic steps to creating master data: clean and standardize the data, and then match data from all the sources to consolidate duplicates.

Before you can start cleaning and normalizing your data, you must understand the data model for the master data. As part of the modeling process, the contents of each attribute were defined, and a mapping was defined from each source system to the master-data model. This information is used to define the transformations necessary to clean your source data. Cleaning the data and transforming it into the master-data model is very similar to the Extract, Transform, and Load (ETL) processes used to populate a data warehouse. If you already have ETL tools and transformations defined, it might be easier to modify these as required for the master data, instead of learning a new tool. Here are some typical data-cleansing functions:

- Normalize data formats. Make all the phone numbers look the same, transform addresses (and so on) to a common format.
- Replace missing values. Insert defaults, look up ZIP codes from the address, look up the Dun & Bradstreet number.
- Standardize values. Convert all measurements to metric, convert prices to a common currency, change part numbers to an industry standard.
- Map attributes. Parse the first name and last name out of a contact-name field, move Part# and partno to the PartNumber field.
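As a small illustration of the first and last of these functions, here is a sketch in Python; the target phone format and the naive name split are assumptions made for the example, and production cleansing handles many more cases (extensions, suffixes, multi-word surnames, and so on).

    # Sketch of two cleansing functions: normalizing phone formats and
    # parsing a contact-name field. The formats are illustrative assumptions.
    import re

    def normalize_phone(raw: str) -> str:
        """Reduce a US-style phone string to the form 555-123-4567."""
        digits = re.sub(r"\D", "", raw)   # strip everything but digits
        digits = digits[-10:]             # drop a leading country code
        return f"{digits[0:3]}-{digits[3:6]}-{digits[6:10]}"

    def split_contact_name(contact: str) -> tuple[str, str]:
        """Split 'First [Middle] Last' into (first, last); naive on purpose."""
        parts = contact.strip().split()
        return parts[0], parts[-1]

    print(normalize_phone("(555) 123-4567"))       # 555-123-4567
    print(split_contact_name("William J. Smith"))  # ('William', 'Smith')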

Most tools will cleanse the data that they can, and put the rest into an error table for hand processing. Depending on how the matching tool works, the cleansed data will be put into a master table or a series of staging tables. As each source is cleansed, the output should be examined to ensure the cleansing process is working correctly.

Matching master-data records to eliminate duplicates is both the hardest and the most important step in creating master data. False matches can actually lose data (two Acme Corporations become one, for example), and missed matches reduce the value of maintaining a common list. The matching accuracy of MDM tools is therefore one of the most important purchase criteria. Some matches are trivial: if you have Social Security numbers for all your customers, or if all your products use a common numbering scheme, a database JOIN will find most of the matches. This hardly ever happens in the real world, however, so matching algorithms are normally very complex and sophisticated. Customers can be matched on name, maiden name, nickname, address, phone number, credit-card number, and so on, while products are matched on name, description, part number, specifications, and price. The more attributes that match, and the closer each match, the higher the MDM system's confidence in the overall match. This confidence factor is computed for each candidate match; if it surpasses a threshold, the records are considered a match. The threshold is normally adjusted to reflect the consequences of a false match. For example, you might specify that if the confidence level is over 95 percent the records are merged automatically, and that if the confidence is between 80 and 95 percent a data steward should approve the match before the records are merged. A small sketch of this thresholding follows.
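This sketch assumes the 95/80 percent bands from the example above; the confidence score itself would come from whatever scoring the matching tool performs.

    # Sketch of confidence-band routing, using the 95/80 percent bands above.
    AUTO_MERGE = 0.95
    REVIEW = 0.80

    def route_match(confidence: float) -> str:
        """Decide what to do with a candidate match, given its confidence."""
        if confidence >= AUTO_MERGE:
            return "merge automatically"
        if confidence >= REVIEW:
            return "queue for data-steward approval"
        return "treat as distinct records"

    for score in (0.97, 0.88, 0.42):
        print(score, "->", route_match(score))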

Most merge tools merge one set of input into the master list at a time, so the best procedure is to start the list with the data in which you have the most confidence and then merge in the other sources one at a time. If you have a lot of data and a lot of problems with it, this process can take a long time. You might want to start with the data from which you expect the most benefit once it is consolidated; run a pilot project with that data to ensure your processes work and that you are seeing the business benefits you expect; and then add other sources as time and resources permit. This approach means your project will take longer and possibly cost more, but the risk is lower. It also lets you start with a few organizations and add more as the project demonstrates success, instead of trying to get everybody on board from the start.

Another factor to consider when merging your source data into the master list is privacy. When customers become part of the customer master, their information might be visible to any of the applications that have access to the customer master. If the customer data was obtained under a privacy policy that limited its use to a particular application, you might not be able to merge it into the customer master. You might want to add a lawyer to your MDM planning team.

At this point, if your goal was to produce a list of master data, you are done: print it out or burn it to a CD, and move on. If you want your master data to stay current as data is added and changed, however, you will have to develop the infrastructure and processes to manage the master data over time. The next section provides some options on how to do just that.

How Do I Maintain a Master List?


There are many different tools and techniques for managing and using master data. We will cover three of the more common scenarios here:

- Single-copy approach. In this approach, there is only one master copy of the master data. All additions and changes are made directly to the master data, and all applications that use master data are rewritten to use the new data instead of their current data. This approach guarantees consistency of the master data, but in most cases it's not practical: modifying all your applications to use a new data source with a different schema and different data is, at the least, very expensive, and if some of your applications are purchased, it might even be impossible.

- Multiple copies, single maintenance. In this approach, master data is added or changed in the single master copy of the data, but the changes are sent out to the source systems, where copies are stored locally. Each application can update the parts of the data that are not part of the master data, but cannot change or add master data. For example, the inventory system might be able to change quantities and locations of parts, but new parts cannot be added, and the attributes that are included in the product master cannot be changed. This reduces the number of application changes that will be required, but the applications will minimally have to disable functions that add or update master data. Users will have to learn new applications to add or modify master data, and some of the things they normally do will not work anymore.

- Continuous merge. In this approach, applications are allowed to change their copy of the master data. Changes made to the source data are sent to the master, where they are merged into the master list; the changes to the master are then sent back to the source systems and applied to the local copies. This approach requires few changes to the source systems; if necessary, the change propagation can be handled in the database, so no application code is changed. On the surface, this seems like the ideal solution: application changes are minimized, no retraining is required, and everybody keeps doing what they are doing, but with higher-quality, more complete data. The approach does have several issues, though:
  o Update conflicts are possible and difficult to reconcile. What happens if two of the source systems change a customer's address to different values? There is no way for the MDM system to decide which one to keep, so intervention by the data steward is required; in the meantime, the customer has two different addresses. This must be addressed through data-governance rules and standard operating procedures that reduce or eliminate update conflicts. (A small sketch of conflict detection appears after this list.)
  o Additions must be re-merged. When a customer is added, there is a chance that another system has already added the same customer. To deal with this, all additions must go through the matching process again, to prevent new duplicates in the master.
  o Maintaining consistent values is more difficult. If the weight of a product is converted from pounds to kilograms and then back to pounds, rounding can change the original weight. This can be disconcerting to a user who enters a value and then sees it change a few seconds later.
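To illustrate the first of these issues, here is a minimal sketch of update-conflict detection in a continuous-merge setup. The record shape, source names, and steward queue are assumptions made for the example, not part of any real product.

    # Sketch: detect conflicting field updates arriving from two source
    # systems in a continuous-merge setup, and queue them for a steward.
    steward_queue = []  # conflicts awaiting a data steward's decision

    def apply_update(master: dict, source: str, field: str, value):
        """Apply one field update from a source system to the master record."""
        pending = master.setdefault("_pending", {})
        if field in pending and pending[field][1] != value:
            # Two sources changed the same field to different values:
            # the MDM system cannot pick a winner, so a steward must.
            steward_queue.append((field, pending[field], (source, value)))
            return
        pending[field] = (source, value)
        master[field] = value

    customer = {"address": "101 11th St. N"}
    apply_update(customer, "billing", "address", "1001 11th St. N")
    apply_update(customer, "crm", "address", "101 11th St. N")  # conflict
    print(steward_queue)  # one conflict queued for review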

In general, all these things can be planned for and dealt with, making the user's life a little easier, at the expense of a more complicated infrastructure to maintain and more work for the data stewards. This might be an acceptable trade-off, but it's one that should be made consciously.

Versioning and Auditing


No matter how you manage your master data, it's important to be able to understand how the data got to its current state. For example, if a customer record was consolidated from two different merged records, you might need to know what the original records looked like, in case a data steward determines that the records were merged by mistake and really should be two different customers. Version management should include a simple interface for displaying versions and reverting all or part of a change to a previous version. The branching of versions and grouping of changes that source-control systems provide can also be useful for maintaining alternative derivations of the data and for reverting a group of changes to a previous branch. (A minimal sketch of the versioning idea follows.)

Data-stewardship and compliance requirements will often include a way to determine who made each change and when it was made. To support these requirements, an MDM system should include a facility for auditing changes to the master data. In addition to keeping an audit log, the MDM system should include a simple way to find a particular change. An MDM system can audit thousands of changes a day, so search and reporting facilities for the audit log are important.
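As a rough illustration of the versioning requirement, here is a minimal sketch of a record that keeps its prior states so a bad merge can be undone. Real MDM systems record far more (who, when, why); the class here is an assumption made for the example.

    # Sketch: keep each prior state of a master record so that a bad
    # merge can be reverted. Real systems also record who/when/why.
    import copy

    class VersionedRecord:
        def __init__(self, data: dict):
            self.data = data
            self.history = []  # list of (change_note, prior_state)

        def update(self, changes: dict, note: str):
            self.history.append((note, copy.deepcopy(self.data)))
            self.data.update(changes)

        def revert(self):
            """Restore the state before the most recent change."""
            note, prior = self.history.pop()
            self.data = prior

    rec = VersionedRecord({"name": "John Smith", "city": "Boston"})
    rec.update({"merged_with": "cust-4711"}, "merged two John Smiths")
    rec.revert()      # the merge was wrong: restore the pre-merge state
    print(rec.data)   # {'name': 'John Smith', 'city': 'Boston'}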

Hierarchy Management
In addition to the master data itself, the MDM system must maintain data hierarchies: for example, the bill of materials for products, the sales-territory structure, the organization structure for customers, and so forth. It's important for the MDM system to capture these hierarchies, but it's also useful for an MDM system to be able to modify the hierarchies independently of the underlying systems. For example, when an employee moves to a different cost center, there might be impacts on the Travel and Expense system, payroll, time reporting, reporting structures, and performance management. If the MDM system manages hierarchies, a change to the hierarchy in a single place can propagate to all the underlying systems.

There might also be reasons to maintain hierarchies in the MDM system that do not exist in the source systems. For example, revenue and expenses might need to be rolled up into territory or organizational structures that do not exist in any single source system. Planning and forecasting might also require temporary hierarchies to calculate "what if" numbers for proposed organizational changes. Historical hierarchies are also required in many cases to roll up financial information into structures that existed in the past but not in the current structure. For these reasons, a powerful, flexible hierarchy-management feature is an important part of an MDM system. A small sketch of such a rollup follows.
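As a rough sketch of the rollup idea, assuming a simple parent-child table; the hierarchy and figures are invented for the example and not tied to any particular product.

    # Sketch: roll revenue up a territory hierarchy held as parent-child
    # pairs. The hierarchy and figures are invented for the example.
    parent = {"Boston": "Northeast", "Hartford": "Northeast", "Northeast": "US"}
    revenue = {"Boston": 120, "Hartford": 80}  # leaf-level figures

    def rollup(revenue: dict, parent: dict) -> dict:
        """Accumulate each leaf figure into every ancestor node."""
        totals = dict(revenue)
        for node, amount in revenue.items():
            while node in parent:
                node = parent[node]
                totals[node] = totals.get(node, 0) + amount
        return totals

    print(rollup(revenue, parent))
    # {'Boston': 120, 'Hartford': 80, 'Northeast': 200, 'US': 200}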
