Data Integrity Drives Operational Excellence in the Supply Chain

Data quality and transparency determine success or setbacks

In a supply chain, every decision depends on data, so data integrity is critical. Accurate and timely data helps supply chain partners hit their targets, comply with internal processes, make informed decisions and streamline operations while controlling costs.

“Accurate and timely data allows you to make educated and smart decisions about your business. If the data is inaccurate, it portrays a different truth than the reality. In some cases, this can have a dramatic impact on results,” said Vishwa Ram, vice president of data science and analytics at Penske Logistics.

The Importance of Data Quality and Monitoring Tools

In the 2024 Annual Third-Party Logistics Study, more than half of respondents — 57% of shippers and 52% of third-party logistics providers — said they’ve experienced issues with data quality. Managing data is especially important for logistics providers, who often rely on their customers’ data and are significantly impacted when it’s incorrect.

The report found that third-party logistics (3PL) providers are more likely than shippers to leverage data monitoring tools, use IT staff to check and resolve data quality concerns and have built-in data monitoring capabilities that automatically detect and notify stakeholders of data quality issues.

Potential Setbacks in Data Quality

Without accurate and transparent data, a business can experience any of the following setbacks, each of which creates additional work to resolve.

Lack of Trust

Errors and missing or inaccurate data all harm data integrity, and one of the most obvious identifiers of bad data is a lack of trust. “If you do not have confidence in the data and information generated from the system of record to make business decisions, then you probably have a data integrity issue,” said Rowland Myers, vice president of DCC strategy and support services at Penske Logistics.

Human Error

Penske Logistics has established key measurements to ensure data accuracy and has several methods to verify data, depending on the source. Myers said the validation process starts at the beginning. “From an accuracy standpoint, we can confirm that it was put into the source system correctly,” he explained, adding that some of the biggest challenges center around areas where human touchpoints are needed. “If a driver has to punch a data point into the phone, there is an opportunity for error.”

For example, if a driver needs to hit a button on their phone when they arrive at a location but fails to do so until well after their arrival time, it creates inaccurate information. “If something shows it is out of tolerance, it will flag it, and we can find out what happened. In some cases, the solution is better training to create habits that drive accuracy and timeliness,” Myers said.
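As a rough illustration of this kind of tolerance check — not Penske’s actual implementation — a record might be flagged when a driver-entered arrival time drifts too far from an independently captured timestamp such as a geofence event. The field names and the 30-minute tolerance below are assumptions made for the sketch.

```python
from datetime import datetime, timedelta

# Assumed tolerance window for this sketch; a real system would tune this per lane or site.
ARRIVAL_TOLERANCE = timedelta(minutes=30)

def flag_out_of_tolerance(driver_entered: datetime, geofence_event: datetime) -> bool:
    """Return True when a manually entered arrival time drifts too far
    from the machine-captured (geofence/GPS) timestamp."""
    return abs(driver_entered - geofence_event) > ARRIVAL_TOLERANCE

# Example: the driver tapped "arrived" 45 minutes after the truck crossed the geofence.
entered = datetime(2024, 5, 1, 14, 45)
actual = datetime(2024, 5, 1, 14, 0)
if flag_out_of_tolerance(entered, actual):
    print("Flag for review: possible late manual entry")  # triggers follow-up or retraining
```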

Ram said machine data capture is always more accurate than human data capture. “We’ve done a number of things to turn as much of our data capture as possible into machine capture, but we’re always going to have humans involved, so we do have to focus on the human element,” he added.

Missing Data Elements

With information transmitted via electronic data interchange (EDI), Penske has created automatic message processing and business rules. “We look for missing data elements and flag those,” Ram said. “In some cases, we go back to the customer. In others, we are reviewing what the data should be.”
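A minimal sketch of the kind of business rule described here, assuming an inbound EDI message has already been parsed into a dictionary; the required-field list and message shape are hypothetical, not drawn from Penske’s systems.

```python
# Hypothetical required elements for an inbound shipment message (e.g., parsed from an EDI load tender).
REQUIRED_FIELDS = ["shipment_id", "origin", "destination", "pickup_window", "weight"]

def find_missing_elements(message: dict) -> list[str]:
    """Return the required data elements that are absent or empty in a parsed message."""
    return [field for field in REQUIRED_FIELDS if not message.get(field)]

parsed = {"shipment_id": "S123", "origin": "Reading, PA", "destination": "", "weight": 42000}
missing = find_missing_elements(parsed)
if missing:
    # In practice this might open a ticket, notify the customer, or fill the value from history.
    print(f"Flagging message {parsed['shipment_id']}: missing {missing}")
```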

Missing data — a misalignment between what is needed and what is available — can be a process error or a case of something not being captured. “For example, if a customer doesn't provide data, it is hard for us to give them an accurate analysis on the cube of their trailer,” Myers said. “We may have to go back and understand the requirements to start tracking additional information, or we can quickly show how to improve the data through process rigor or additional training.”

Automation can also improve data timeliness, which is critical in supply chains. “If you don’t get it in real time, it loses its impact,” Myers said. “In our business, on time is one of the most sought-after compliance metrics to make sure we’re getting products from A to B. If we don’t have the timely data, you don’t have the visibility to give the customer information.”
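Once timely arrival data is available, on-time performance reduces to a simple ratio. The sketch below assumes each delivery record carries a scheduled and an actual timestamp; the field names are illustrative.

```python
from datetime import datetime

def on_time_percentage(deliveries: list[dict]) -> float:
    """Share of deliveries whose actual arrival was no later than the scheduled time."""
    if not deliveries:
        return 0.0
    on_time = sum(1 for d in deliveries if d["actual_arrival"] <= d["scheduled_arrival"])
    return 100.0 * on_time / len(deliveries)

records = [
    {"scheduled_arrival": datetime(2024, 5, 1, 9, 0), "actual_arrival": datetime(2024, 5, 1, 8, 50)},
    {"scheduled_arrival": datetime(2024, 5, 1, 12, 0), "actual_arrival": datetime(2024, 5, 1, 12, 20)},
]
print(f"On time: {on_time_percentage(records):.1f}%")  # 50.0% for this sample
```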

Delay in Product Movement

More importantly, real-time data allows logistics providers to mitigate risk and keep products moving. “We want to be proactive and let the data predict an issue before it happens,” Myers said.

It isn’t enough to just capture information. It must also be transmitted to everyone who needs it. “If somebody is sitting in LA traffic and is going to be late in Denver, capturing that information in real time, sharing it to our various systems and sending a message to a customer isn’t something that is trivial,” Ram stated. “Even though we as a society have come to expect these things, it still remains a complex endeavor. We have invested millions of dollars in the right systems, architecture and analytics to make it all happen.”
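The chain of events Ram describes — an updated ETA flowing from capture to internal systems to a customer message — might be sketched as a simple fan-out step. The handler names below are hypothetical and stand in for whatever systems actually subscribe to such updates.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class EtaUpdate:
    shipment_id: str
    new_eta: datetime
    reason: str

def publish_eta_update(update: EtaUpdate, subscribers: list) -> None:
    """Fan a single ETA change out to every system and party that needs it."""
    for notify in subscribers:
        notify(update)

# Hypothetical downstream consumers: the transportation system, the visibility dashboard, the customer alert.
update_tms = lambda u: print(f"TMS updated: {u.shipment_id} now {u.new_eta}")
update_dashboard = lambda u: print(f"Dashboard: {u.shipment_id} delayed ({u.reason})")
notify_customer = lambda u: print(f"Customer notified: new ETA {u.new_eta}")

delay = EtaUpdate("S123", datetime(2024, 5, 1, 18, 30), "LA traffic")
publish_eta_update(delay, [update_tms, update_dashboard, notify_customer])
```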

Data Integration Barriers

There are countless data points in today’s operating environment, which is why architecture is critical. “We have to funnel that information into a single place accessible to all parties to get the single source of truth,” Ram said. “We have broken many of the integration barriers and integrated with a lot of vendors and technology providers over the years.”

According to the 2024 Annual Third-Party Logistics Study, integration barriers are among the top challenges shippers and 3PLs face when sharing data. “A lot of our customers have had those challenges, and one of the reasons they come to us is because we have expertise in integrations,” Ram said.

The amount of data generated within the supply chain continues to increase, and future success depends on having the right platform to absorb all data sources, including streaming data, which is also on the rise. “Getting streaming data is not easy if we’re not architected correctly. That is how we’re making systems future-proof,” Ram said.

The Future of Data Integrity

As technology changes, it will be necessary for systems to identify which data is human-generated and which data points are generated by artificial intelligence. “We also have a lot of data in an unstructured format. Being able to capture that and generate meaningful insights from it is a huge undertaking. From a technical standpoint, that is different from what we’re used to,” Ram said. “The key lies in developing an overarching strategy to integrate our processes and technical capabilities to unlock business value.”

Carlo D'Amico

Freight Operations Management | Logistics Management | Operational Excellence | Shipping | Transportation | Account Management | Customer Experience | #DareToImprove

7 months ago

Insightful post from Penske Logistics on the importance of data integrity in supply chains. Data quality significantly impacts operations and transparency. Key strategies include reducing human error and using automation for data capture. As reliance on real-time data grows, robust frameworks are crucial to avoid costly mistakes. #Logistics #Shipping #DigitalTransformation

Charles Mitchell

Operational Excellence | Manufacturing | Distribution | Supply Chain | Lean | Veteran | Leadership | Global Multi-Site | Safety |

7 months ago

Great article on managing internal and external data in daily operations.
