DATA DRIVEN SERVICING


Nirvana. Well... at least from a field services perspective: using a combination of environmental data, weather forecasts, sensor data and good old-fashioned calendars of events and works, together with real-time traffic information, to prioritise and assign work.

So how achievable is this Nirvana, and what steps do we need to take in order to get there?

In terms of having systems capable of directing a workforce to deal with jobs in the field, we are already there, with any number of systems available that can lay claim to doing this with ease.

Weather forecasts over larger areas are reasonably accurate, although the increasingly violent weather patterns being experienced may mean the predictability of weather events drops off in future. Granular (suburb-by-suburb) forecasts, especially beyond a 48-hour window, are always a bit sketchy, both in reliability and in the availability of the data itself. All of that being said, the data is freely available and there's no reason you can't weight the probability of a weather event by location and forecast horizon - so, all in all, doable.
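As a rough illustration of that weighting idea, here is a minimal sketch in Python. The field names, weights and the 48-hour confidence cut-off are my own illustrative assumptions, not a real scheduling system's logic:

```python
from dataclasses import dataclass

# Hypothetical sketch: bump a job's priority when disruptive weather is
# likely at its location, discounting forecasts past the ~48-hour window
# where suburb-level predictions get sketchy. All values are assumptions.

@dataclass
class Job:
    job_id: str
    base_priority: float       # 0..1, from the work-order system
    rain_probability: float    # 0..1, from a forecast feed for the job's suburb
    forecast_hours_out: float  # how far ahead the forecast looks

def weather_adjusted_priority(job: Job) -> float:
    # Halve the weather signal's influence beyond the 48-hour window.
    confidence = 1.0 if job.forecast_hours_out <= 48 else 0.5
    # Boost weather-threatened jobs so they get scheduled first.
    boost = job.rain_probability * confidence * 0.5
    return min(1.0, job.base_priority + boost)

jobs = [
    Job("J1", base_priority=0.4, rain_probability=0.9, forecast_hours_out=24),
    Job("J2", base_priority=0.5, rain_probability=0.2, forecast_hours_out=72),
]
for job in sorted(jobs, key=weather_adjusted_priority, reverse=True):
    print(job.job_id, round(weather_adjusted_priority(job), 2))
```

In practice the weights would need tuning against your own service history, but the point is that the forecast data is cheap to consume once you decide how to rate it.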

Real-time traffic data is no longer an issue, with Google, Bing and a number of other providers offering reasonable commercial subscriptions, provided you are capable of integrating well and can reasonably limit the number of queries per month.

So that leaves us with environmental data, sensors and calendars of events and works... and this is where things get a bit trickier. The issue isn't necessarily the difficulty of interpreting the data; it's actually locating it, cleaning it for purpose and then using it in a way that delivers a benefit that pays for the effort. I'll explain...

Dial Before You Dig is an excellent example: an absolutely critical service for anyone who wants to dig a hole in the public realm. The service is free to use and incorporates responses from telcos, water, power, gas and local authorities. However, it simply acts as a coordinating body, with the providers themselves giving the answers to the requestor - which essentially means there isn't an API you can hook into to get a response (right now).

You can build a solution yourself to automate queries and deal with the responses by scraping emails, but it's not exactly the most efficient or reliable way to handle the issue (especially when all you get is a "come to the office to obtain plans" notification, which is a common response).
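If you did go down the email-scraping road, the core of it is triaging response bodies so only the awkward ones reach a person. A hedged sketch - the phrases below are assumed examples, since real responses vary by provider:

```python
import re

# Hypothetical sketch: classify utility-response emails into buckets so
# that only responses needing manual follow-up are routed to a human.
# The trigger phrases are illustrative assumptions, not real provider text.

PATTERNS = [
    ("clear", re.compile(r"no assets in the (vicinity|area)", re.IGNORECASE)),
    ("plans_attached", re.compile(r"plans? (are )?attached", re.IGNORECASE)),
    ("manual", re.compile(r"(come|visit).*office|contact us", re.IGNORECASE)),
]

def classify_response(body: str) -> str:
    for label, pattern in PATTERNS:
        if pattern.search(body):
            return label
    return "unknown"  # anything unrecognised goes to a human for triage

print(classify_response("Our records show no assets in the vicinity."))
print(classify_response("Please visit our office to obtain plans."))
```

The fragility is obvious: every provider words things differently, formats change without notice, and a misclassified "clear" is a safety risk - which is exactly why a proper API would be so valuable.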

There are other examples, largely around trying to gain access to sensor data and, sometimes more importantly, the metadata needed to interpret it, but I'll leave that for another day.

Assuming you have access to all the data you'd like in order to intelligently drive your servicing program and build your Digital Servicing Twin, you will need to demonstrate significant efficiencies through productivity and asset utilisation within your organisation - ideally enough to offset not only the implementation costs but also the ongoing costs of data feeds, storage and processing.

The problem is that while the efficiencies the solution drives do deliver productivity benefits for the service provider, they may also reduce the level of billable service. That the service itself becomes more efficient, more sustainable and delivers better outcomes is a given - but if it results in less income, what is the incentive to invest in ongoing subscriptions, more sensor deployments and continual improvement in Data Driven Servicing?

I think that progressing meaningfully in this space beyond the first stage is going to require a collaborative effort from all concerned (utilities, LGAs, service providers), with data sharing frameworks and common data standards so that data can be exchanged and consumed easily, reliably and cheaply... and, of equal importance, a different financial model that focuses on rewarding better outcomes rather than rate-cards... so we're back to Nirvana again :)

What do you think??
