Humans, check. Attentiveness, check. Are you ready?
What happens if your media is already well optimized and your FouAnalytics in-ad charts are showing minimal dark red (bots) and lots of dark blue (humans) like the one below?
1. Example of a campaign well managed by Goodway Group (Jay Friedman, Andrea Kwiatek, Laura Taylor, Allison F., and team)
2. Example of a campaign well managed by Brunner (Luca Pugliano, Ivan Tafur)
Advertisers who have already optimized their ads can move on to the next, more advanced step -- optimizing for the attentiveness of the human users who click through to the landing pages of their clients. Let's see how...
Viewability vs Attention vs Attentiveness
Viewability has to do with whether the ad itself had an opportunity to be seen -- at least 50% of its pixels in the viewable portion of the screen. But it does not report whether the ad was actually seen by anyone.
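To make that definition concrete, here is a minimal sketch of how a viewability check works in the browser, using the standard IntersectionObserver API. The "ad-slot" element id and the console.log are illustrative placeholders, not any vendor's actual tag. Notice what this code can and cannot do: it confirms the ad's pixels were on screen, but it tells you nothing about whether anyone looked at them.

```typescript
// Minimal sketch of a browser viewability check using the standard
// IntersectionObserver API. The "ad-slot" id is a hypothetical placeholder.
const adEl = document.getElementById("ad-slot");
if (adEl) {
  const observer = new IntersectionObserver(
    (entries) => {
      for (const entry of entries) {
        // "Viewable" per the common definition: at least 50% of the ad's
        // pixels are inside the visible portion of the viewport.
        if (entry.intersectionRatio >= 0.5) {
          console.log("ad is viewable (>= 50% of pixels on screen)");
        }
      }
    },
    { threshold: [0.5] }
  );
  observer.observe(adEl);
}
```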
Attention has to do with whether users were paying attention to the screen. Eye-tracking studies by attention vendors showed that if a user was looking at the screen, a larger ad that covered the screen got more "attention" than a smaller ad that could be easily ignored. Ads with greater attention work better. That makes total sense. But current attention vendors have no way to measure whether any users were actually looking at the screen. Let me repeat this. Current attention vendors, and the media agencies that buy based on attention, have no way to detect whether someone was actually looking at the screen, because javascript detection tags are not allowed to turn on the camera on a laptop or smartphone. Eye-tracking studies done in labs are fine and good, but campaigns bought on attention can't actually measure whether users were paying attention.
Attentiveness. Advanced advertisers and their agencies use attentiveness by FouAnalytics because it CAN be measured directly. Attentiveness is measured on the landing pages of the advertiser, NOT in the ads themselves. Attentiveness means the user did something ELSE on the landing page after they arrived -- for example, moved the mouse, scrolled the page, clicked something, or touched the screen on their mobile device. Some might call these the opposite of "high bounce" visitors.
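Because attentiveness is simply "did the visitor do anything else after arriving," it can be captured with ordinary browser events. Here is a minimal sketch of the concept -- the /collect beacon endpoint is a hypothetical placeholder for illustration, not the FouAnalytics tag itself.

```typescript
// Minimal sketch of capturing an "attentiveness" signal on a landing page:
// did the visitor do something ELSE after arriving? The /collect endpoint
// is a hypothetical placeholder, not the FouAnalytics tag.
let attentive = false;

function markAttentive(eventType: string): void {
  if (attentive) return; // report only the first interaction
  attentive = true;
  // Fire-and-forget beacon so the signal survives page unload.
  navigator.sendBeacon(
    "/collect",
    JSON.stringify({ event: "attentive", via: eventType, ts: Date.now() })
  );
}

// The same interactions named above: mouse move, scroll, click, touch.
for (const evt of ["mousemove", "scroll", "click", "touchstart"]) {
  window.addEventListener(evt, () => markAttentive(evt), {
    once: true,
    passive: true,
  });
}
```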
Most advertisers have observed in their own site analytics visitors who bounced right away or left after less than a second. Those high-bounce, low time-on-site visitors usually came from programmatic media sources and were likely just bots. Bots don't stick around on the landing pages; they leave right away so they can move on to the next fake ad and fake click. Most of you have seen the slide below, where the clicks from 15 different programmatic campaigns were 96 - 99% bots (orange and red) and only 1 - 4% humans (dark blue). This helps to explain the high bounce rates and low time-on-site observed in Google Analytics. GA doesn't have this kind of color coding like FouAnalytics does, so it was previously difficult to understand why.
More attentive humans lead to more conversions
If you've already optimized your media, like the example at the top of this article, and avoided most of the bot clicks, like those in the 15 orange and red donut charts, you can study the attentiveness of the users on the advertisers' landing pages to further optimize for more attentive humans. Below are 3 examples.
Clicks from Google paid search
When you isolate UTM_SOURCE=google (paid search) you can first see there are a lot of dark blue (human) clicks that arrived on the site, and just a small amount of dark red (bots) -- 7% (donut chart below). To the right of the donut chart you see the graphs in the Clicks tab of FouAnalytics. Keep in mind this is on-site measurement on the landing pages of the advertiser.
Note that each of the charts on the right shows a single screen resolution; we group the clicks by screen resolution so the relative positions on screen are accurate. The yellow highlight shows that 73 - 79% of the users went on and clicked something on the landing page after they arrived from clicking on a Google search ad. Note further that there were no red clicks in these charts. Bots were not faking the clicks; only humans went on and did something else on the landing page. Obviously, humans have to do something on the landing page before they can complete a purchase, for example.
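For analysts who want to replicate this view from exported data, grouping by screen resolution is straightforward once each click record carries the viewport size it was captured on. A hedged sketch, with assumed field names:

```typescript
// Sketch of grouping click records by screen resolution so that (x, y)
// positions are only compared across identical viewport sizes. The
// LandingClick fields are illustrative assumptions.
interface LandingClick {
  x: number;      // click position in pixels
  y: number;
  width: number;  // viewport width when the click was captured
  height: number; // viewport height
}

function groupByResolution(
  clicks: LandingClick[]
): Map<string, LandingClick[]> {
  const groups = new Map<string, LandingClick[]>();
  for (const c of clicks) {
    const key = `${c.width}x${c.height}`; // e.g. "1920x1080"
    const bucket = groups.get(key) ?? [];
    bucket.push(c);
    groups.set(key, bucket);
  }
  return groups;
}
```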
But even if we can't see the purchase or conversion events on the site, the RELATIVE attentiveness of users from various paid media sources can tell us where to invest more money, and where to reduce budget -- i.e. invest more in sources driving more attentive humans. You will even notice that certain ad creatives lead to more attentive humans on the landing pages.
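If you want to run that comparison yourself from raw click data, the arithmetic is simple: attentive visits divided by total visits, per source. A minimal sketch, assuming hypothetical field names:

```typescript
// Hypothetical post-processing sketch: compute the share of attentive
// visitors per utm_source so sources can be compared. The field names
// (utmSource, didInteract) are assumptions for illustration.
interface ClickRecord {
  utmSource: string;    // e.g. "google", "tiktok"
  didInteract: boolean; // visitor clicked/scrolled/etc. after arriving
}

function attentivenessBySource(records: ClickRecord[]): Map<string, number> {
  const totals = new Map<string, { visits: number; attentive: number }>();
  for (const r of records) {
    const t = totals.get(r.utmSource) ?? { visits: 0, attentive: 0 };
    t.visits += 1;
    if (r.didInteract) t.attentive += 1;
    totals.set(r.utmSource, t);
  }
  const rates = new Map<string, number>();
  for (const [source, t] of totals) {
    rates.set(source, t.attentive / t.visits);
  }
  return rates;
}
```

The sources with the highest rates are where the attentive humans are arriving from -- those are the candidates for more budget.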
Clicks from TikTok paid campaign
See the second example below, where UTM_SOURCE=tiktok. If you look closely, you will see virtually no clicks on the landing page. That means the users didn't do anything else after arriving on the landing page. The green highlights show that only 0 - 2% did something like click. That makes sense, given the large amount of dark red (bots) that clicked through. Bots don't waste their time clicking around the landing page if they are not paid to do that. P.S. the client told us they had not turned off Pangle, the audience extension of TikTok.
Clicks from PMAX campaign
The third and final example below is UTM_MEDIUM=pmax. Keri Thomas wrote a whole article on the crap we saw from PMAX here - https://www.dhirubhai.net/pulse/pmax-black-box-you-dont-want-anything-tpibc but we will just focus on the attentiveness click maps for now. Note the yellow highlight says 81 - 83% of the "users" clicked something on the landing page. But were those clicks real? Nope. Not only does the color coding (dark red) give it away, the click locations also indicate bot activity. Larger red circles mean repeated clicks on the exact same x,y coordinate/pixel on screen. You can't get a whole bunch of humans to click the exact same pixel on screen.
This was a case of bots faking the clicks on the landing page in order to trick Google Analytics into reporting LOWER bounce rates. To Google Analytics, these users didn't bounce, because they did something on the page (a faked click). But with the details in FouAnalytics you won't be tricked, because you can see the clicks look like bot clicks, not human clicks. The slide below shows what human clicks look like on desktop: most of the clicks are individual points, with some clusters where humans have to click site navigation, links, buttons, etc.
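The repeated-same-pixel signature is easy to check for in raw click data. Here is a hedged sketch of that heuristic; the threshold of 5 repeats is an illustrative assumption, not a FouAnalytics rule.

```typescript
// Sketch of the repeated-pixel heuristic: humans rarely click the exact
// same (x, y) pixel many times, so coordinates with high repeat counts
// are a bot signature. The threshold is an illustrative assumption.
interface Click {
  x: number;
  y: number;
}

function suspiciousPixels(
  clicks: Click[],
  threshold = 5
): Map<string, number> {
  const counts = new Map<string, number>();
  for (const c of clicks) {
    const key = `${c.x},${c.y}`;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  // Keep only coordinates clicked more often than a human plausibly would.
  return new Map([...counts].filter(([, n]) => n >= threshold));
}
```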
So what?
First order of business is to optimize your ads. Use FouAnalytics in-ad tags to measure where your ads are going and whether they were loaded by bots, other forms of fraud, or humans. Once you have optimized your media, so that most of your ads are shown to humans, you can then look at the clicks that arrive on the landing pages and the relative attentiveness of those humans that arrived. More attentive humans lead to more outcomes. Even if most of the clicks are dark blue (humans, check), the more advanced advertisers can further optimize for more attentive humans (attentiveness, check).
See the last section of this article for the comparison of the attentiveness of users from different digital channels: FouAnalytics Attentiveness versus Attention
And attentiveness can even be used to gauge the relative effectiveness of offline media like TV or billboards, or media that don't have click-throughs, like CTV. See: How to do cookieless attribution with FouAnalytics
If you've already optimized your media, and are ready to do more advanced work with attentiveness, reach out.