Emerging predictive analytics for the manufacturing industry

Dan Somers
- Technology - Nov 26, 2014

‘Predictive analytics’, the discipline of predicting future events by finding patterns in historical data, is receiving much attention in manufacturing lately, due in large part to the rise of ‘big data’.

The predictive analytics market is in fact growing at an unprecedented rate. ResearchMoz recently valued the market at $2.1 billion in 2012, and predicts strong growth at a 17.8 percent CAGR through 2019.
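As a quick sanity check on those figures, compounding the 2012 valuation at the stated growth rate gives the implied 2019 market size (the end figure below is our own extrapolation, not a number quoted from the report):

```python
# Project the 2012 market size forward at the stated CAGR.
base_2012 = 2.1          # market value in $ billions (2012)
cagr = 0.178             # 17.8 percent compound annual growth rate
years = 2019 - 2012      # seven compounding periods

projected_2019 = base_2012 * (1 + cagr) ** years
print(f"Projected 2019 market: ${projected_2019:.1f}bn")  # → $6.6bn
```

Roughly tripling in seven years, which is what makes the vendors so excited.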

With this, though, comes the ‘hype’. Gartner’s ‘Hype Cycle’ places predictive analytics and data science near the ‘Peak of Inflated Expectations’ – the point at which early publicity inflates expectations beyond what the technology can yet deliver. The ‘Plateau of Productivity’, when mainstream adoption takes off, is forecast to arrive in two to five years for data science, while predictive analytics will take five to 10 years to reach the end of the hype cycle and maturity.

For an industry which traditionally suffers acutely from incomplete, disparate and dirty data, and which needs issues resolved as early as possible, the need for such analytics is prominent. However, manufacturing is also a no-nonsense industry populated by practically-minded engineers and business leaders who are rightly sceptical of hyped technologies. So has the time for manufacturing predictive analytics come, or is it just the big data vendors trying to persuade us? Let’s answer this by debunking the myths of manufacturing predictive analytics.

Firstly, it is important to say that ‘visualisation’ is not analytics, despite what many vendors tell you. Being able to see and play around with data is useful, but if, say, an engineer wants to know what his cost of poor quality (COPQ) is likely to be over the next few months, and which factors are driving the forecast, that is predictive analytics.
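To make that concrete, here is a minimal sketch of the kind of model the engineer needs: a regression of monthly COPQ on candidate drivers, giving both a forecast and an indication of which driver carries the weight. All the numbers and driver names (scrap rate, supplier defect PPM) are hypothetical, purely for illustration:

```python
import numpy as np

# Hypothetical monthly data: COPQ (in $k) alongside two candidate drivers.
scrap_rate = np.array([2.1, 2.4, 2.2, 2.9, 3.1, 3.0])      # percent
supplier_ppm = np.array([410, 430, 400, 480, 510, 500])    # defect PPM
copq = np.array([118.0, 131.0, 122.0, 158.0, 170.0, 165.0])

# Fit COPQ as a linear function of the drivers (ordinary least squares).
X = np.column_stack([np.ones_like(scrap_rate), scrap_rate, supplier_ppm])
coef, *_ = np.linalg.lstsq(X, copq, rcond=None)

# Forecast next month's COPQ from expected driver values.
next_month = np.array([1.0, 3.2, 520.0])
forecast = next_month @ coef
print(f"forecast COPQ: ${forecast:.0f}k")  # roughly $175k on this toy data
print(f"coefficients (intercept, scrap, ppm): {np.round(coef, 2)}")
```

Even this toy version does what the dashboard cannot: it looks forward, and it says why.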

Secondly, there are many challenges with implementing and operating the common predictive analytics tools already on the market. It is too easy to produce a predictive model which is either naively incorrect to start with (because it requires a data scientist to build and validate the model) or soon becomes irrelevant (because things change). There is also a mathematical limit to what predictions can be made from a given amount of data – if a vendor tells you he can predict the future from just a few data points, it’s too good to be true.
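That mathematical limit is easy to demonstrate with a toy example (the data below is made up): give a flexible model only five points and it will fit them perfectly – and then extrapolate wildly, while a simpler model that averages out the noise stays close to the real trend.

```python
import numpy as np

# Five observations of an underlying linear process y = 2x,
# perturbed by fixed measurement noise (hypothetical data).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
noise = np.array([0.3, -0.4, 0.5, -0.2, 0.1])
y = 2.0 * x + noise

# A degree-4 polynomial has five parameters, so it passes exactly
# through all five points: zero training error, pure memorisation.
overfit = np.polynomial.Polynomial.fit(x, y, deg=4)

# A simple linear fit averages out the noise instead.
linear = np.polynomial.Polynomial.fit(x, y, deg=1)

# Predict just outside the observed range (true value is 12).
print(round(float(linear(6.0)), 1))   # → 12.0, close to the true trend
print(round(float(overfit(6.0)), 1))  # → 55.1, wildly off
```

The perfect-looking model is the dangerous one – a vendor demo that nails every historical data point is exhibiting exactly this failure mode.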

Thirdly, if the analytics can’t be practically implemented on the shop floor, then it’s useless.

Lastly, there has to be a clear use case to which to apply the analysis in the first place. If a problem requires an army of PhDs, or the problem is not that large, then think again. Predictive analytics isn’t the answer to every prayer.

So, what kinds of manufacturing problems are best placed to benefit from predictive analytics? Well, there’s root cause analysis for resolving defects to lower COPQ, predictive maintenance and service on products and plant, finding hidden bottlenecks in processes, speeding up launch, lowering supply chain costs by understanding behaviours, improving product design by better understanding how customers will use products, pricing…the list goes on.

For engineers, it’s a case of replacing ‘rear-view mirror analytics’ using statistical tools with forward-looking analytics which enable results to be understood and, most importantly, proven – perhaps ‘practical predictive analytics’.

So now we’ve debunked the myths, what about the technical hurdles? In the 2014 TDWI survey, the top three predictive analytics challenges across all sectors were: lack of skilled personnel; lack of understanding of predictive analytics technology; and an inability to assemble the necessary data.

Fortunately, there are predictive analytics companies emerging which address these challenges by automating the data integration exercise, not needing to clean data (indeed, positively refusing to clean data lest it remove early warning signals), dealing with sketchy data, and automating the analytics itself.

These new technologies look at whatever data is available. They can run fast within the database, or parallelised ‘in-memory’, to handle huge amounts of data from various disparate sources. They provide multiple solutions, and the results are easy to understand and act on. And no hypotheses are required.

Put simply, it means that a data scientist and an IT department are no longer required to solve complex, ever-changing problems. The technology at our company, Warwick Analytics, for example, has been used by engineers, quality and production staff to find the root causes of complex failures at Motorola (the home of Six Sigma). Similarly, the technology is used by many leading automotive, aerospace and process-industry manufacturers to reduce their Cost of Poor Quality by automatically resolving issues on-the-fly and increasing yield.

Adoption and deployment of these emerging predictive analytics tools will greatly enhance firms’ ability to move more quickly and cost-effectively, by making predictive analytics available to everybody across all of their manufacturing processes.
