Velocity of Data - Part 1

Lately, I’ve been thinking about the problem of velocity of data. If you do some research, you’ll run across 3 or 4 V’s of data, but I believe that zeroing in on the velocity of data can provide insight into the value realization of data by corporations. The intention here is not a full-blown whitepaper on the topic but rather something to jumpstart a conversation around the value of data velocity for corporations. Think of this as Part 1 of some starter thoughts.


Equation and Quantification

The first question is whether or not data velocity can be quantified. I believe that the equation for money velocity from economic monetary theory offers a starting point. That equation is V = PT/M.

Where V = velocity, P = price of a market basket, T = transaction volume, and M = money supply.


Looking at this from a data perspective, I would hypothesize that P could be a monetization estimate of the data within a corporation; as technology adoption increases and data monetization is more fully achieved, P would increase. I would hypothesize that T could be measured more directly through query monitoring. Finally, M would really be "D" in this case: the size of the corporation's data universe.
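To make this mapping concrete, here is a minimal sketch of the proxy calculation, assuming P is expressed as estimated monetized value per transaction, T is a transaction count taken from query monitoring, and D is the size of the corporate data universe. All names and numbers below are illustrative assumptions, not a prescribed measurement scheme.

```python
def data_velocity(p_monetization: float, t_transactions: int, d_data_size: float) -> float:
    """Proxy for data velocity, adapted from the money-velocity equation V = PT/M.

    p_monetization -- estimated monetized value realized per transaction (e.g. dollars per query)
    t_transactions -- transaction volume over the period (e.g. query count from query monitoring)
    d_data_size    -- size of the corporation's data universe (e.g. terabytes under management)
    """
    return (p_monetization * t_transactions) / d_data_size


# Illustrative numbers only: $2 of value per query, 500,000 queries in the period, 800 TB of data.
v = data_velocity(p_monetization=2.0, t_transactions=500_000, d_data_size=800)
print(f"Data velocity proxy: {v:,.1f}")
```

The units are deliberately loose here; the point is only that each of the three inputs is measurable in principle, and that the ratio yields a single number that can be tracked over time.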


While this equation may offer a starting place, or at least a proxy, for calculating the velocity of data, it does raise the issue of data growth's impact on that velocity. That is, as a corporation's data grows, its data velocity will by definition decrease unless techniques are applied to improve data accessibility and drive value and transactions.
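As a quick illustration of that effect, with made-up numbers: if the monetized transaction value (P × T) stays flat while the data universe doubles each year, the velocity proxy falls by half each year.

```python
# Hypothetical scenario: P*T is flat while the data universe doubles annually.
pt_value = 1_000_000      # total monetized transaction value per year (illustrative)
data_size_tb = 800        # starting size of the data universe in terabytes (illustrative)

for year in range(1, 6):
    velocity = pt_value / data_size_tb
    print(f"Year {year}: D = {data_size_tb:>6} TB, V = {velocity:8.1f}")
    data_size_tb *= 2     # data grows; without better accessibility, P*T (and so V) does not keep pace
```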


Emerging Trends

An interesting emerging trend is the use of cognitive services and AI to deepen searches across vast data environments to find data and information. This points to the fact that corporations are quickly realizing that unlocking the value of their data requires going beyond the textual searches of today.


With the trend toward hot, warm, and cold data, one would theorize that these deeper search capabilities may stay at work beyond the immediate search performed by a user, continuing to comb the data environment after providing initial 'hot' responses so that the user gets a larger return of data and information. AI has the potential to help the user navigate that larger return and find the most valuable assets. I find this very intriguing given that data volumes are growing exponentially.
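As a thought experiment only (this does not describe any specific product), the behavior might look something like a tiered search that answers immediately from the hot tier and keeps combing the warm and cold tiers afterward, with a simple relevance score standing in for AI-assisted ranking. Everything below, including the tier contents and the scoring, is an illustrative assumption.

```python
from typing import Dict, Iterator, List, Tuple

# Hypothetical tiered corpus: hot data is small and immediately searchable;
# warm and cold tiers are larger and take longer to comb.
TIERS: Dict[str, List[str]] = {
    "hot":  ["q3 revenue dashboard", "active customer churn report"],
    "warm": ["2022 revenue deep dive", "churn model training notes"],
    "cold": ["archived revenue ledgers 2015", "legacy churn survey exports"],
}

def relevance(query: str, doc: str) -> float:
    """Toy stand-in for AI-assisted ranking: fraction of query terms found in the document."""
    terms = query.lower().split()
    return sum(term in doc.lower() for term in terms) / len(terms)

def tiered_search(query: str, threshold: float = 0.5) -> Iterator[Tuple[str, str, float]]:
    """Yield matches tier by tier, so hot results arrive before the colder tiers finish."""
    for tier in ("hot", "warm", "cold"):
        for doc in TIERS[tier]:
            score = relevance(query, doc)
            if score >= threshold:
                yield tier, doc, score

for tier, doc, score in tiered_search("revenue report"):
    print(f"[{tier}] {doc} (score {score:.2f})")
```

In a real environment the warm and cold passes would run asynchronously and the ranking would be far richer, but the shape of the interaction (a fast first answer, a larger return later) is the part that relates to data velocity.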


Next Steps

I’m a believer in not re-creating the wheel unless necessary. It appears that the equation for the velocity of money from monetary theory offers a jumpstart toward quantifying the velocity of data. I think additional thought is needed on how the velocity of money equation does or does not fit the data use case. If you would like to be a part of this discussion, reach out and let me know.

Ed Parker

Career Strategy Expert | Build A Career That You Will Love Again | 20+ Years Tech Leadership | Business Coach

7y

Thanks Dan. Yes, I believe that velocity of data points to the maturity of the analytics (BI, AI, ML, etc.) within an organization. I think one of the troubling things the first part uncovered was that with the exponentially growing volume of data, the velocity will decrease relative to volume unless companies are proactive in leveraging the data and getting it to the people in their organization in the form of information so that they can make fact-based decisions. I'll dig a little more in Part 2 in the coming weeks.

Dan Rubin

Professional Services Executive · Certified Motorcycle RiderCoach

7y

Interesting thoughts Ed. Certainly there is an avalanche of data that is both structured and unstructured coming into every organization today. Connecting all of that data is the challenge that must be overcome in order to extract real value. I apply the term "Velocity" to the speed in which solutions in this space are moving as well, and Customer Data Platforms are at the front of that wave. Thanks!

