Humans vs Machines - On Measuring Ad Viewability
I was taking a casual stroll around the office when I saw my colleague Peter in a pensive mood, staring hard at his computer as if trying to solve some mystery. One look at him and you’d know he had stumbled upon something interesting.
“It does not make sense”, he said.
“What?”, I asked.
“The viewability, as per Moat, is as low as 14% on Facebook.”
We were implementing a campaign for one of our top FMCG clients, and as standard practice we used Moat to track viewability. It helped us keep a check on delivery and optimize sites for better performance. It was a popular belief that Facebook wasn’t the best site for video, but 14% was a little too low, especially on our best performing creative. We had set the campaign for a 10-second view, and the numbers on our analytics dashboard clearly told a story that contradicted what we gathered from Moat. Something just wasn’t right. So we probed further.
After cross-referencing a few data points and making a few calls to the teams at Facebook and Moat, we realized the issue was with the data we were comparing. While a Facebook video creative could run on News Feed, in-stream and Audience Network, Moat could only track video on News Feed. The data displayed by Moat was therefore not a true metric of the success of this campaign, in which close to 70% of impressions were delivered in-stream.
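A quick back-of-the-envelope sketch shows why a News-Feed-only number can sit so far below the campaign’s true blended viewability. Only the 14% figure and the 70% in-stream share come from the campaign; the 80% in-stream viewability rate below is a purely hypothetical assumption for illustration.

```python
def blended_viewability(placements):
    """Impression-weighted viewability across placements.

    placements: dict mapping placement name -> (impression_share, viewability_rate)
    """
    total_share = sum(share for share, _ in placements.values())
    return sum(share * rate for share, rate in placements.values()) / total_share

# Assumed split from the campaign: ~70% in-stream (not tracked by Moat),
# ~30% News Feed. Moat's 14% figure applies only to the News Feed slice.
# The 0.80 in-stream rate is hypothetical, not a measured number.
placements = {
    "news_feed": (0.30, 0.14),
    "in_stream": (0.70, 0.80),
}
print(f"{blended_viewability(placements):.0%}")  # prints "60%"
```

Under those assumptions the blended figure lands around 60%, nowhere near the 14% the News-Feed-only tool reported — which is exactly why comparing the two datasets directly was misleading.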
Over the last two years, the importance of BAV (Brand Safety, Ad Fraud & Viewability) and conversations focusing on it have gained considerable momentum. It is not surprising that a substantial portion of exposure on digital is driven by fraudulent means: what looks like a million impressions on a single creative could in reality be half or even a quarter of that. Advertisers globally have taken this head-on and found their own ways to track and measure the impact of their online spends, keeping standards of BAV in mind.
A study by Pixalate shows India to have the highest video ad-fraud impression rate of any country.
While India has witnessed a certain momentum in tackling this problem, we still lack a strong enough force to drive this change. There are two reasons. First, lack of client support: many clients are married only to last-mile conversion and do not show enough care, or even interest, in what happens between exposure and conversion. As a fallout, the third-party cost of tracking BAV is, very unfortunately, viewed as an expense instead of an investment. Second, the absence of an authority to set standards and hold publishers accountable for complying with them.
Our analysis across multiple clients and industries shows that simply measuring and optimizing brand-led campaigns on three simple parameters helps drive better efficiency, and the gains far outweigh the investment that goes into such measurement and optimization itself. The three parameters are:
1) Whether the ad has been seen by humans
2) Whether it has reached the right audience
3) And whether consumers find the communication interesting enough to engage with
By optimizing our campaigns across these three parameters, we have seen both brand scores and last-mile conversions improve substantially. The most critical elements are eliminating the possibility of ad fraud and ensuring higher viewability — however, just deploying tools and blindly accepting what they report does not help. It is always good to have a Peter standing vigilant, watching the data points shared and questioning every assumption being made. After all, it is only us humans who can question the machines, and one wrong attribution is all it takes to question the existence of us planners!