Spend Mapping
Peter Granger
Strategic planning, an essential first step in the development of a results-based accountability system, is defined as the process of addressing the following questions:

- Where are we?
- What do we have to work with?
- Where do we want to be?
- How do we get there?

This generic application of strategic planning applies to Procurement in the same way as it applies to most other critical business activities. Often the hardest part is to obtain an accurate and accessible answer to the first two questions, which is simply to understand, in practical terms, your current state of play. Spend analysis performs this role in the procurement environment.

In specific terms, spend analysis has been defined as the process of aggregating, classifying and leveraging spend data for the purpose of cost reduction, performance improvement and compliance (remembering that we are striving for results-based accountability).

Typically, when we hold discussions with procurement professionals about their spend analytics activities, we encounter several common objections:

1. Our raw data is very dirty and all we would do is classify rubbish.
2. Our data is spread across too many systems and has completely inconsistent coding structures.
3. We have a massive investment in a corporate data warehouse and all our data is stored there. (Usually followed by: even though the data warehouse does nothing for me, the politics of using an alternative are not sustainable in this organisation.)
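To make that definition concrete before tackling those objections, here is a minimal sketch of the aggregate-and-classify step in Python. The column names, GL codes and category rules are illustrative assumptions, not a real chart of accounts.

```python
import pandas as pd

# Raw payment data as it might come out of an AP system (invented values).
spend = pd.DataFrame({
    "vendor": ["Acme Print", "Acme Print Pty", "TechSupply", "TempStaff Co"],
    "gl_code": ["6100", "6100", "6200", "6300"],
    "amount": [12000.0, 8500.0, 42000.0, 95000.0],
})

# Classify: map GL codes to broad commodity categories (assumed rules).
gl_to_category = {"6100": "Commercial print", "6200": "PCs and computing",
                  "6300": "Contract labour"}
spend["category"] = spend["gl_code"].map(gl_to_category)

# Aggregate: total spend per category, largest first -- the raw material
# for deciding where cost reduction and compliance effort will pay off.
print(spend.groupby("category")["amount"].sum().sort_values(ascending=False))
```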
Limitations of space prevent a discussion of the complex issue of data warehouses and spend analytics, so we will focus here just on the issue of data.
At the beginning, the mountain looks just too big to climb but, like anything else, it is remarkable what can be achieved when such a task is broken into manageable chunks. Commodity mapping is a very good example of one such mountain, because rarely does an organisation have a universal set of commodity codes, such as UNSPSC, embedded in its purchase order or payment systems.
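As a hedged illustration of what "manageable chunks" can mean in practice, the sketch below maps only the largest vendors to UNSPSC-style segments and leaves everything else in a visible "Unmapped" bucket; the vendor names, codes and rules are invented.

```python
import pandas as pd

spend = pd.DataFrame({
    "vendor": ["BigChem Corp", "SafeWear Supplies", "Misc Trader", "Corner Cafe"],
    "amount": [150000.0, 60000.0, 5000.0, 800.0],
})

# First manageable chunk of mapping rules: the biggest vendors only.
rules = {"BigChem Corp": "12 - Chemicals",
         "SafeWear Supplies": "46 - Safety and security equipment"}

spend["commodity"] = spend["vendor"].map(rules).fillna("Unmapped")
print(spend.groupby("commodity")["amount"].sum().sort_values(ascending=False))
# Next pass: inspect the "Unmapped" bucket, add rules for its largest
# vendors, and re-run -- the mountain shrinks one chunk at a time.
```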
Let's look at the following approach, espoused by Eric Strovink, an associate of ours and a veteran software executive who has been closely involved with the emergence of the spend analytics industry in the USA. In essence, an approximate first-pass map allows buyers to focus their spend management efforts (and further cleansing) on the most promising commodities. Is it necessary to map every vendor? Almost never, although third-party vendor mapping services are readily available if needed. Is it necessary to conduct vendor family-ing? Grouping together multiple instances of the same vendor clears up more than 95% of the problem. Who-owns-whom family-ing using commercial databases seldom provides additional insight; besides, inside buyers are usually well aware of the few relationships that actually matter. For example, you won't necessarily get any savings from Wesfarmers by buying chemicals from CSBP and safety wear from Bunnings. And it would be a mistake to group Harvey Norman under its owners, since its stores are mainly franchises.
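Since family-ing carries most of the weight, here is a minimal sketch of rule-based vendor family-ing: collapsing multiple spellings of one vendor into a single family key. The suffix list and vendor strings are assumptions for illustration, and a production tool would let buyers refine such rules interactively.

```python
import re

# Legal-form suffixes to strip when building a family key (assumed list).
SUFFIXES = r"\b(pty|ltd|limited|inc|co|corp)\b"

def vendor_family(name: str) -> str:
    key = name.lower()
    key = re.sub(SUFFIXES, " ", key)       # drop legal-form suffixes
    key = re.sub(r"[^a-z0-9]+", " ", key)  # drop punctuation
    return " ".join(key.split())           # collapse repeated whitespace

for v in ["ACME PRINT PTY LTD", "Acme Print", "Acme-Print Ltd."]:
    print(f"{v!r} -> {vendor_family(v)!r}")
# All three spellings collapse to the single family key 'acme print'.
```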
There will be cases where insufficient data exists to use classical mapping techniques. For example, if the dataset is limited to line item descriptions, then phrase mapping is required; if the dataset has vendor information only, then vendor mapping is the only alternative. Commodity maps based on insufficient data are inaccurate commodity maps, but they are better than nothing.

80-20 logic also applies to the overall spend mapping problem. Consider a financial services firm with an indirect spend base. Before even starting to look at the data, every veteran sourcer knows where to start looking first for potential savings: contract labour, commercial print, PCs and computing, and so on. If you have limited resources, it can be counterproductive to start mapping commodities that are unlikely to produce savings, when good estimates can often be made as to where the big hits are likely to be. If you can score some successes now, there will be plenty of time to extend the reach of the system later. If there are sufficient resources to attack only a couple of commodities, it makes sense to focus on those commodities alone, rather than attempting to map the entire commodity tree.

The bottom line is that data cleansing needn't be a complex, expensive, offline process. By applying common sense to the cleansing problem, i.e. by attacking it incrementally and intelligently over time, mapping rules can be developed, refined and applied when needed.
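As one concrete illustration of such rules, here is a hedged sketch of phrase mapping over line item descriptions. First matching phrase wins; the phrase list is an invented starting point that would be refined as mis-mappings surface in the analysis.

```python
# Ordered phrase rules: (phrase to look for, commodity to assign).
PHRASE_RULES = [
    ("toner", "Office supplies"),
    ("laptop", "PCs and computing"),
    ("business cards", "Commercial print"),
    ("contractor", "Contract labour"),
]

def map_line_item(description: str) -> str:
    """Return the commodity for the first phrase found, else 'Unmapped'."""
    text = description.lower()
    for phrase, commodity in PHRASE_RULES:
        if phrase in text:
            return commodity
    return "Unmapped"

for d in ["HP laptop docking station", "Toner cartridge x4", "Site visit"]:
    print(d, "->", map_line_item(d))
```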
The moment a user recognises that a mistake has been made in the mapping decisions, they can immediately change the mapping rules, re-run the transform and quickly rebuild the analytic cube. Importantly, volumes are rarely a limitation: PC power is cheap and, if the tool is well architected, millions of records can be re-processed and built into cubes in a matter of minutes. In fact, whether you choose to have an initial spend dataset created by outside resources, or you decide to create it yourself, the conclusion is the same: cleansing and mapping should be an online, ongoing process, guided by feedback and insight gleaned directly (and incestuously) from the powerful visibility tools of the spend analysis system itself.
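To make the fix-a-rule, re-run, rebuild loop concrete, here is a minimal sketch in which the "cube" is just a pivot table; the dataset and rules are invented, and a real tool would do the same at far larger scale.

```python
import pandas as pd

spend = pd.DataFrame({
    "vendor": ["Acme Print", "TempStaff Co", "TempStaff Co"],
    "quarter": ["Q1", "Q1", "Q2"],
    "amount": [8000.0, 40000.0, 55000.0],
})

# Rules-driven transform: vendor -> commodity, then rebuild the cube.
rules = {"Acme Print": "Commercial print",
         "TempStaff Co": "PCs and computing"}   # deliberate mis-mapping

def build_cube(rules):
    mapped = spend.assign(commodity=spend["vendor"].map(rules))
    return mapped.pivot_table(index="commodity", columns="quarter",
                              values="amount", aggfunc="sum", fill_value=0)

print(build_cube(rules))               # the mistake shows up in the cube...
rules["TempStaff Co"] = "Contract labour"
print(build_cube(rules))               # ...fix the rule and rebuild at once
```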
And, as a corollary, cleansing tools must be placed directly into the hands of purchasing professionals so that they can create and refine mappings on the fly, without any assistance from vendors or internal IT experts.

Peter Granger is the CEO of Inlogik Pty Ltd, an Australian-based provider of spend management solutions locally and internationally. The author wishes to acknowledge the contribution to this article of Eric Strovink, the founder and chief executive of BIQ LLC.