Mergers, acquisitions and expansions into emerging markets are increasingly the norm for businesses today. The growth this brings to the business and its supply chain leaves procurement grappling to maintain control and visibility of spend.
In April, cloud-based software company Coupa held its third European Executive Symposium in London, bringing together executives to share best practice on how artificial intelligence (AI) can help provide visibility over spend.
Spend data is a significant concern. Attendees highlighted three issues that make it increasingly challenging to cut through the noise in the supply base to get to grips with the scope of spend. First, master data – not to mention big data – is unmanageable in terms of volume. Second, taxonomy can differ between the organisation and the supply base. Third, there is sometimes a lack of understanding of the parent/child relationship between buyers and suppliers.
The fundamental lesson here is to use the intelligence and information that procurement already has access to, in order to deliver predictive analytics on what is to come.
This is where AI can step in.
According to PwC’s 18th Annual Global CEO Survey, 80% of CEOs now see data mining and analytical technology as the most important technology to their current strategy. While mobile technology was previously cited as the key technology by most CEOs, mobile has become a given, and artificial intelligence (AI), long held up as the future, is the present.
Andrew Knotts, senior director of customer operations and success at Coupa, referred to procurement entering the third generation of data-processing technology: the point at which information can be interpreted through machine learning into intelligent algorithms.
The key to the latest generation of AI is that the systems do not use traditional keyword search. Instead, a vector representation – a list of numbers – is applied to each word, and neural networks act as the machine’s internal memory, much like networks of neurons in the human brain. Comparable words have similar vector representations, yet the probability and weighting differ depending on both the human-supplied training data and the ongoing intelligence learned by the machine itself. Unnecessary ‘noise’ is removed, dirty data is cleansed and useful vector representations are passed through the network, resulting in accurate and predictive spend analytics.
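The idea that “comparable words have similar vector representations” can be made concrete with a small sketch. The vectors below are hand-picked for illustration only – real systems learn them from large volumes of data, and this is not a description of Coupa’s actual implementation:

```python
import math

# Toy word embeddings: each word maps to a small vector.
# In practice these are learned from data; the values here are illustrative.
embeddings = {
    "invoice": [0.9, 0.1, 0.3],
    "bill":    [0.85, 0.15, 0.35],  # similar meaning -> similar vector
    "laptop":  [0.1, 0.9, 0.2],     # unrelated -> points elsewhere
}

def cosine_similarity(a, b):
    """Score how closely two word vectors point in the same direction (max 1.0)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Comparable words score close to 1; unrelated words score much lower.
print(cosine_similarity(embeddings["invoice"], embeddings["bill"]))
print(cosine_similarity(embeddings["invoice"], embeddings["laptop"]))
```

Because similarity is measured geometrically rather than by matching literal keywords, ‘invoice’ and ‘bill’ can be recognised as the same kind of spend even though they share no characters.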
The introduction of AI processes can help procurement identify duplicate suppliers and duplicate spend, and unearth ‘hidden’ spend. With a better handle on this information, executives can make better-informed sourcing decisions.
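Duplicate-supplier detection of this kind often starts with name normalisation: collapsing the many ways one vendor can appear in master data down to a single key. The sketch below is a minimal illustration under assumed rules (the legal-suffix list is not exhaustive), not any vendor’s production logic:

```python
import re

def normalise_name(name):
    """Normalise a supplier name so common variants collapse to one key.
    The suffix list and rules here are illustrative, not a complete standard."""
    key = name.lower()
    key = re.sub(r"[^\w\s]", "", key)                             # drop punctuation
    key = re.sub(r"\b(ltd|limited|inc|plc|gmbh|co)\b", "", key)   # strip legal suffixes
    return " ".join(key.split())                                  # collapse whitespace

def find_duplicates(suppliers):
    """Group supplier records whose names normalise to the same key."""
    groups = {}
    for s in suppliers:
        groups.setdefault(normalise_name(s), []).append(s)
    return {k: v for k, v in groups.items() if len(v) > 1}

records = ["Acme Ltd.", "ACME Limited", "Acme, Inc", "Globex PLC"]
print(find_duplicates(records))  # the three Acme variants group together
```

Spend booked against the merged key then reveals the true total with that vendor, which is the ‘hidden’ spend the article refers to.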
Implementing AI is an ongoing process that must be constantly refreshed and tested, Knotts told the audience. Procurement must take a cyclical approach to using the technology: broaden the data set available to the machine, extract the data relevant to the current strategy, normalise suppliers and spend across the supply base, then repeat the process a few months later.
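That cycle can be expressed as a simple pipeline. The step functions below are deliberately simplified stand-ins for what each stage would really involve (normalisation here is reduced to trimming and lower-casing), sketched purely to show the shape of the loop:

```python
def broaden(dataset, new_sources):
    """Step 1: widen the data available to the machine."""
    return dataset + new_sources

def extract(dataset, category):
    """Step 2: pull the records relevant to the current strategy."""
    return [r for r in dataset if r["category"] == category]

def normalise(records):
    """Step 3: collapse supplier-name variants (vastly simplified here)."""
    for r in records:
        r["supplier"] = r["supplier"].strip().lower()
    return records

def refresh_cycle(dataset, new_sources, category):
    """One iteration of the cycle; rerun periodically as new data arrives."""
    return normalise(extract(broaden(dataset, new_sources), category))

spend = [{"supplier": " Acme ", "category": "IT"}]
new = [{"supplier": "ACME", "category": "IT"},
       {"supplier": "Globex", "category": "Travel"}]
print(refresh_cycle(spend, new, "IT"))
```

The point of structuring it this way is that each pass runs over a wider, cleaner data set than the last, so the machine’s predictions improve with every cycle.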
AI, machine learning and deep learning are now a reality, and taking advantage of these capabilities is the difference between a function that has a handle on its spend and one that remains in the dark.
This article is a piece of independent writing by a member of Procurement Leaders’ content team.
© Sigaria Ltd and its contributors. All rights reserved. www.sigaria.com
Sigaria accepts no responsibility for advice or information contained on this site although every effort is made to ensure its accuracy. Users are advised to seek independent advice from qualified persons before acting upon any such information.