We are at the beginning of a supply chain technological revolution. Everything—from pallets and trucks to a bag of Romaine lettuce—has the potential to collect, store, and transmit information. If only companies could take this data and feed it into analytics and artificial intelligence systems, they could make better, faster decisions than ever before, driving down waste and increasing value. That, at least, is the vision of what big data analytics could achieve.
But now, reality is beginning to set in. As more companies try to wrap their arms around their "big data" and implement more complex analytic tools, they are beginning to realize that achieving this vision is hard work. It takes significant investment in information technology (IT) systems as well as change and process management. Companies are also finding that the key data that they need is often missing or inaccurate. The promise of big data analytics is coming, but perhaps not as easily as initially thought. That was the main message conveyed by the results of the "Second Annual Big Data Analytics Study" conducted by the analytics company Competitive Insights LLC; the consultancy lharrington group; CSCMP's Supply Chain Quarterly; and two prominent supply chain management schools, Arizona State University and Colorado State University.
This annual study is designed to provide companies with a benchmark that they can use to understand the current state of supply chain data analytics and learn what analytical strategies organizations are adopting to harness the power of big data. The intent is to show the levels of progress that companies are making in addition to the obstacles impeding that progress.
A survey was conducted in both 2017 and 2018 with readers of Supply Chain Quarterly, subscribers to a newsletter produced by Competitive Insights, and a contact list generated by Arizona State and Colorado State University researchers. A total of 125 usable responses were compiled for the 2018 survey, comparable to 2017's total of 133 usable responses.
Here are the findings and some suggested best practices that could help companies overcome their initial frustration and gain positive momentum with their big data analytics implementations.
More implementations, less satisfaction
A comparison of 2018's and 2017's results tells an interesting story. In both 2017 and 2018, we asked respondents: "How would you characterize your supply chain organization's maturity in regard to its use of big data analytics?" (See Figure 1.) In general, the survey results show that more companies have begun implementing big data analytics initiatives. There was a 14-percent increase in the number of big data implementations between 2017 and 2018, and far fewer respondents reported that they had not adopted supply chain analytics at all (only 10 percent in 2018, as opposed to 23 percent in 2017). And yet, Figure 1 also shows that fewer respondents described their implementations as "transformational" or "advanced." Instead, the majority of respondents were in the "early" or "developing" stages of adoption, indicating that most companies are either still conducting proof-of-concept testing or have only rolled out initial implementations.
Why did the number of "transformational" and "advanced" responses drop from 2017 to 2018? We do not believe that firms were necessarily less successful with their big data analytics implementations this year than in 2017. Instead, responding firms' definition of "transformational" may have evolved since last year. As more companies implement big data analytics in earnest, they are developing a better understanding of what it entails and how much farther they have to go. The widespread excitement of 2017 has been replaced by the realization that, as with almost all new technologies, it takes a lot of work to make big data analytics useful. As a result, companies now have a more realistic assessment of where they actually fall on the maturity curve and of the obstacles they must overcome.
We also see this perspective in respondents' satisfaction with the data they do have. In general, companies were less satisfied with the quality and availability of their data in 2018 than they were in 2017. On average, respondents in 2018 were 4 percent less satisfied with their data availability, 8 percent less satisfied with data usability, and 7 percent less satisfied with their data's integrity than respondents in 2017. Satisfaction with data reliability also dropped 5 percent between 2017 and 2018. It seems unlikely that data quality actually dropped from 2017 to 2018. Rather, we believe that as companies get deeper into their big data analytics implementations, they are becoming more aware of their existing data issues and have a better understanding of the magnitude of the effort and commitment required to address them.
It stands to reason that if respondents are less satisfied with the quality of their data, they are going to be less satisfied with their data analytics results as well. Indeed, the 2018 survey showed a slight drop in respondents' assessment of the realized benefits of their big data analytics efforts in 2018 relative to 2017. In both years, survey respondents were asked to use a seven-point scale to quantify the impact they have already realized from big data analytics in a variety of areas, such as profitability, inventory management, and visibility to total cost-to-serve. A score of 1 equals no impact, and 7 equals a transformative impact. Figure 2 shows a slight drop in perceived impact across all of the potential benefits in 2018 versus 2017.
The type of analytics matters
When assessing the benefits that can be expected from a big data analytics implementation, it is crucial to understand that there are many types of analytics. For the survey, we distinguished five types: descriptive analytics report what happened; diagnostic analytics explain why it happened; predictive analytics forecast what is likely to happen; prescriptive analytics recommend what actions to take; and cognitive analytics apply artificial intelligence to learn and refine those recommendations over time.
Survey respondents were asked to use a seven-point scale (where a score of 1 equals no use and a score of 7 equals heavy use) to rate the extent to which their company currently uses each of those types of analytics to support supply chain decision making. On average, respondents gave descriptive analytics a score of 4.61 (indicating between "some use" and "regular use"), diagnostic analytics a score of 4.02, prescriptive analytics a score of 3.56, predictive analytics a score of 3.16 (indicating between "occasional use" and "some use"), and cognitive analytics a score of 2.27 (or "infrequent use"). (See Figure 3.) These scores are similar to, or slightly lower than, last year's. The survey results indicate that in many real-life supply chains, the more sophisticated types of analytics are still used less than the more rudimentary methods.
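The scale readings above amount to a simple calculation: average the one-to-seven responses, then read the mean against the scale's anchor labels. The sketch below is a hypothetical illustration, not the study's actual method, and the wording of the intermediate labels (points 2, 3, 5, and 6) is an assumption inferred from the phrases quoted in the article.

```python
import math

# Assumed label for each point on the seven-point use scale; only the
# endpoints (1 = "no use", 7 = "heavy use") are defined in the article.
SCALE = {1: "no use", 2: "infrequent use", 3: "occasional use",
         4: "some use", 5: "regular use", 6: "frequent use", 7: "heavy use"}

def mean_score(responses):
    """Average a list of 1-7 survey responses."""
    return sum(responses) / len(responses)

def describe(score):
    """Read a mean score against the scale labels it falls between."""
    lo, hi = math.floor(score), math.ceil(score)
    if lo == hi:
        return SCALE[lo]
    return f'between "{SCALE[lo]}" and "{SCALE[hi]}"'

print(describe(4.61))  # between "some use" and "regular use"
```

Note that the article sometimes rounds instead (reading 2.27 as simply "infrequent use"), so this is one plausible reading convention, not the only one.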
Further investigation using regression analyses on the responses identified a significant correlation between the type of analytics that firms use and the benefits that they report achieving.1 Generally, more sophisticated analytics are associated with a wider range of benefits. Figure 4 shows that the use of descriptive analytics is only correlated with improvements in customer service. This finding makes sense: knowing what is happening in their supply chain helps companies inform their customers of problems before those problems become visible to the customers themselves. While use of diagnostic analytics was slightly associated with demand planning and highly correlated with collaboration, no correlation could be found with any of the other realized benefits. (By "slightly," we mean that we are more than 90 percent certain there is a correlation, rather than more than 95 percent certain.) Predictive analytics, however, could be linked to many benefits, such as improvements in demand planning, risk management, and collaboration.
Somewhat counterintuitively, prescriptive analytics was not positively related to any of these realized benefits. In fact, it was slightly negatively correlated with improvements in demand planning. It's possible that by the time firms reach the level of prescriptive analytics, they have not only made some improvements to their demand planning but also become more aware of their remaining issues. Without the capabilities of more advanced analytics, they may have trouble addressing the issues into which they now have more insight. In many ways, this negative correlation is a microcosm of the trends we see throughout the 2018 report: firms are beginning to implement more sophisticated analytics and, in the process, are beginning to realize the obstacles they have yet to overcome.
Finally, we found that the most helpful type of analytics was cognitive, with a strong correlation to risk management and productivity and a slight correlation to customer service, visibility, and collaboration. In other words, the more sophisticated and forward-looking the type of analytics that are being used, the more benefits companies are realizing.
Similar results can be observed when considering the types of software tools that companies are employing to perform their analytics. Survey respondents were again asked to use a seven-point scale (where a score of 1 equals "no use," a score of 4 equals "some use," and a score of 7 signifies "heavy use") to measure the extent that their company currently uses a variety of analytics tools. Those tools included: Microsoft Excel or similar spreadsheet programs; operational point solutions (OPS) such as warehouse management systems and transportation management systems; advanced analytical tools associated with enterprise resource planning (ERP) systems; and business intelligence (BI) tools. As was the case in last year's survey results, Excel spreadsheets continue to carry the day and are by far the most widely used analytics tool with an average score of 5.80, indicating "frequent use." OPS received an average score of 4.64, and ERP and business intelligence tools received scores of 3.97 and 3.88 respectively.
The survey team also ran multiple analyses to see if there were any links between realized tangible big data analytical benefits and the use of these platforms. (See Figure 5.) While Microsoft Excel was the most widely used of these platforms, it also was the least useful. In fact, no correlation was established between the use of either Excel or an OPS as the primary big data analysis platform and any tangible improvement in customer service, demand planning, risk management, supply chain visibility, collaboration, or overall productivity. This demonstrates that for many companies, the current methods for analyzing big data are ineffective. They may be useful for maintaining the status quo, but they are unlikely to lead to many benefits. Without a change in methods, it is unlikely firms will see a change in results.
Conversely, although ERP and BI systems were the least used platforms, the regression analysis suggests that they were the most beneficial. ERP use was significantly correlated with realized benefits in customer service, demand planning, risk management, inventory management, visibility, and increased profitability. Business intelligence users reported the greatest levels of performance improvement in customer service, supply chain visibility, end-to-end supply chain collaboration, and overall productivity. These results seem to demonstrate that to make big data analysis work, companies need to be using the right tools.
Although most of the survey respondents have only implemented the less sophisticated types of analytics, they are still hopeful about the results that they are going to achieve. Companies reported that they expect moderately significant improvements in customer service, supply chain visibility, productivity, and profitability in the next 12 months. However, only 40 percent of survey respondents are planning to make a moderate to very large investment in cognitive analytics in the next 12 months, and only 52 percent plan to make a significant investment in predictive analytics. In contrast, almost two-thirds (63 percent) of survey respondents plan to invest in descriptive analytics.
The 2018 survey results point to a disconnect between investment and anticipated results. Companies seem to be focusing their investments on baseline solutions while hoping for results that are correlated with more sophisticated solutions.
In spite of these discouraging signs, there is hope. Survey respondents generally reported facing fewer roadblocks to implementing big data analytics than they did last year. On a seven-point scale, all of the potential impediments that we asked about dropped from an average score between 5 (moderately significant) and 4 (neither significant nor insignificant) to scores between 4 and 3 (moderately insignificant). (See Figure 6.)
After analyzing the 2017 data versus the 2018 data, we determined that managerial support was the key difference between this year and last year. Companies that had strong managerial support did not see getting their firms to invest in additional software or hardware or understanding the value proposition as significant barriers to big data analytics implementation. They also did not believe that concerns about security risks posed an impediment to adoption. However, they did still believe that talent acquisition, integrating siloed legacy systems, and gaining competency in new tools were significant barriers to further implementations.
The fact that more companies are reporting having the support of top management is an encouraging sign, as this is essential for continuing to push successful big data analytics initiatives when they face difficulties or missteps. It also ensures that the implementations will receive the funding they need to gain the required tools, staff, and training.
The road ahead
It would be easy to look at the results from this year's survey and feel discouraged. While the number of companies implementing big data analytics programs has increased, their satisfaction with the data they are working with and with the overall results of these initiatives has dropped. Companies seem frustrated with their lack of progress and return on investment.
However, this feeling of frustration and even disillusionment is not surprising. It is actually fairly typical for the implementation of any new technology, as described by the analyst group Gartner in its "Hype Cycle." Many technology experts have observed that when a new technology is introduced there is often a period of increasing hype and building expectations. When the technology does not initially live up to those overinflated expectations, there is often a period that Gartner calls "the trough of disillusionment." During this period, companies' expectations for the technology drop rapidly as they go through the pain of actual implementation. Then, as companies discover what the technology can actually do, expectations rise again. During that time, companies regain some of the expectations that they lost in the trough of disillusionment, but they never again reach the height seen during the start of the hype cycle. The pain that many firms seem to be experiencing with their big data analytics implementations is a normal, and often essential, part of the maturity process.
It's possible that big data analytics has entered that "trough of disillusionment," as companies realize how much work they need to do to clean up their data and expand their use of additional types of analytics. To get reliable and trusted data, companies need to have not just a central data repository, but also an executive-sponsored, cross-organizational approach to data collection and analysis. This process needs to factor in accountability, repeatability, and subject-matter validation. This takes time and work. To help guide companies out of this "trough of disillusionment," see the sidebar on "Suggested best practices."
Companies are also in the process of learning that the analytical tools they are currently using are not sophisticated enough to provide them with the timely, accurate, and specific insights needed to make smart decisions. Companies need solutions that can quickly answer the following questions: What happened? Why did it happen? If we make this change, what could happen? These tools also need to be able to track the financial and operational impacts of the changes that were made. These types of solutions are out there, but companies will need to make investments in new software and address the change management issues of fully utilizing the insights gained. They will also need to invest in their employees to make sure they have the skill sets required to work with these systems.
While the initial shine and optimism might have worn off big data analytics, companies have a more realistic assessment of the road ahead. They more clearly understand the gaps they face in data quality and accessibility and the limitations of their existing tools. But big data analytics, combined with artificial intelligence methods, still holds great promise for the future. With the right level of executive support and investment, companies can reach the point where they have the valid data that they need for analytics tools to make better decisions more efficiently. But until then, companies must prepare themselves to work their way through a period of uncomfortable growing pains.
1. Regression analysis is a set of statistical processes for estimating the relationship among variables. For example, regression analysis could be used to establish whether there is a relationship between respondents who use descriptive analytics and those who reported improvements in customer service.
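The footnote's example can be sketched as a small calculation. The version below uses a Pearson correlation with a t-test rather than the study's full regression models, and the sample size (n = 20) and the 90/95 percent cutoffs are chosen only to mirror the article's "slight" versus significant distinction; the data shown are hypothetical, not survey responses.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def significance(x, y):
    """Classify a correlation the way the article does: 'slight' means
    significant at the 90 percent level only; otherwise 95 percent."""
    n = len(x)
    r = pearson_r(x, y)
    t = r * math.sqrt((n - 2) / (1 - r * r))  # t-statistic for r
    # Two-tailed critical t-values for df = n - 2 = 18 (assumes n = 20)
    t_95, t_90 = 2.101, 1.734
    if abs(t) >= t_95:
        return r, "significant (95%)"
    if abs(t) >= t_90:
        return r, "slight (90%)"
    return r, "not significant"

# Hypothetical responses: descriptive-analytics use vs. reported
# customer-service improvement, both on seven-point scales.
use     = [2, 5, 3, 6, 4, 7, 1, 5, 6, 3, 4, 5, 2, 6, 7, 3, 5, 4, 6, 5]
benefit = [1, 4, 2, 5, 3, 6, 1, 4, 5, 2, 3, 4, 2, 5, 6, 2, 4, 3, 5, 4]
r, verdict = significance(use, benefit)
print(round(r, 2), verdict)  # strongly correlated: significant (95%)
```

A real analysis would control for other variables in a multiple regression, but the thresholding logic is the same idea as the article's 90-versus-95-percent distinction.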
Gaining the most value from big data analytics is a journey. You can accelerate your company's progress by following proven best practices that address the people involved, the processes and data employed, and the technology utilized.