Regardless of how it gets done, an emphasis on high-level data quality and management needs attention from both shippers and providers.
By Jeff Berman, Logistics Management Group News Editor
Roughly eight years ago, I wrote a story based on a report entitled “The Next Big Thing in Logistics,” written by Adrian Gonzalez, now president of Adelante SCM and founder/host of Talking Logistics, while he was at ARC Advisory Group.
At the outset of the report, Gonzalez was blunt in explaining that when it comes to defining what exactly the next big thing in logistics was, his simple answer was: “I don’t know.”
But one thing he did know, and stressed in the report, was the need for the “emergence of standards-based logistics communication and process execution networks, a fancy way of describing the ‘logistics utilities’ that virtually every company uses to exchange electronic information and execute business processes with their trading partners, customs, and regulatory agencies.”
Essentially, this was a call for a better way to collect and clean the logistics-related data that shippers needed then and still need today. But he cautioned that the approach was not widely popular at the time, given its difficult and time-consuming processes. What’s more, he said it was in dire need of being addressed by shippers and 3PLs in a standardized way that could make it “the next big thing” in logistics technology, even though it was not well prioritized by supply chain and logistics stakeholders.
Gonzalez said that taking a step back in time with this report prompted some interesting reflections on the topic.
“When I wrote that report, we were still in the early stages of supply chain operating networks, and the idea was about the concept of trading partner connectivity becoming a utility as one of the key developments in helping companies with data quality management problems,” he said. “The challenge many companies still have today is that they are still in the connectivity business. They still build and own EDI networks, create their own private point-to-point data transfers between trading partners, and the reality is that they should get out of that connectivity business, because it is not a core strength for them. This is highlighted by poor data quality.”
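Gonzalez’s point about getting out of the “connectivity business” has a simple arithmetic behind it. The sketch below (illustrative only, not from the report) compares how integration counts grow: private point-to-point links between trading partners grow quadratically, while a shared network utility needs only one connection per partner.

```python
# Illustrative sketch: point-to-point EDI integrations vs. a shared
# network "utility". With n trading partners, pairwise links grow
# quadratically; a hub-style network needs one connection per partner.

def point_to_point_links(n: int) -> int:
    """Distinct pairwise connections among n fully connected partners."""
    return n * (n - 1) // 2

def network_links(n: int) -> int:
    """One connection per partner into a shared network hub."""
    return n

for n in (10, 100, 1000):
    print(f"{n} partners: {point_to_point_links(n)} pairwise links "
          f"vs. {network_links(n)} network links")
```

At 1,000 partners, that is 499,500 pairwise links versus 1,000 network connections, which is why maintaining connectivity in-house stops being practical at scale.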
And this is where poor data quality gets overlooked and lost in the shuffle: corporate operations groups say it is a problem for IT, while IT says it falls on operations, as the data is replete with unclean information sent by carriers, suppliers, and customers.
Networks have also grown over time. More than eight years into the evolution of the supply chain network model, the model has proven sustainable and continues to grow, he said, with shippers “finally recognizing the value of B2B connectivity” in a way that was not as prevalent eight years ago.
“It is not just about software anymore, as connectivity needs more attention, and there have been a number of related acquisitions along those lines like SAP acquiring Ariba, in that case a software company seeing the value of a network-based business, as well as other software and B2B connectivity companies coming together,” he said.
There is a bit of foreshadowing there, too. Gonzalez explained that this is something he saw coming in 2007, but what he did not see, and what is in its very early stages today, is these networks evolving beyond connecting companies and computer systems’ data, or computers talking to computers as an initial focus, to include what he termed social networking abilities that help people communicate with people in a more collaborative and efficient manner than phone calls and meetings.
What has not changed, though, is that data quality and management is still a big problem and challenge.
“Even though the onus is shifting a bit towards the supply chain operating network in that this should be a utility instead of users creating hundreds of thousands of connections to plug into a system, the onus is now falling on them to manage data quality, which is a challenge from an integration standpoint, processing and cleansing the data, and is also becoming more challenging with things like the IoT (Internet of Things) and big data as the amount of data being produced continues to explode,” he said.
Even though it is a challenge, Gonzalez said today’s networks are better equipped to manage and address those challenges as a core business, as opposed to manufacturers and retailers doing it alone as they continue to throw people and money at it.
In summing up where things stand on the logistics data front, Gonzalez said many shippers are focused on the elusive goal of end-to-end supply chain visibility, with one of the challenges being mining that data and connecting all of the various parties that hold the pieces needed to provide that visibility.
In other words, a platform that brings those data elements together into a network and cleanses the data to provide the needed visibility can go a long way toward ensuring a stronger, more fluid supply chain.
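The “pieces needed to provide that visibility” live with different parties: the supplier knows the order, the carrier knows the pickup, the consignee knows the delivery. A hypothetical sketch (all identifiers invented) of how a platform might stitch those pieces into one shipment view by joining on a shared shipment ID:

```python
# A hypothetical sketch of end-to-end visibility: each party holds one
# piece of the shipment record, and the platform joins them on a shared
# shipment ID to produce a single combined view.
orders = {"SHP-1": {"sku": "A100", "qty": 12}}        # from the supplier
pickups = {"SHP-1": {"picked_up": "2015-03-10"}}      # from the carrier
deliveries = {"SHP-1": {"delivered": "2015-03-15"}}   # from the consignee

def stitch(shipment_id: str) -> dict:
    """Merge every party's fragment for one shipment into one view."""
    view = {"shipment": shipment_id}
    for source in (orders, pickups, deliveries):
        view.update(source.get(shipment_id, {}))
    return view

print(stitch("SHP-1"))
```

In practice the join is only as good as the data quality feeding it, which is the article’s point: the cleansing and the connecting have to live in the same place.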