Approximating the Truth

Originally Published in The Huffington Post with Jeremy Rabson and Kevin McDermott - October 20, 2015 - https://huff.to/1RthT37

Living in the information age is amazing. The difficulty is that we can't always know what actually counts as information.

Like TV in the video age, digital tools spread so rapidly and so pervasively that it's hard to imagine a time when they didn't exist. The Internet as a consumer technology is barely 20 years old. Its conventions are still being established. In the meantime we port over the conventions of previous human behavior that don't always apply -- the same as we did with television.

It has become commonplace in the digital world, for example, to talk about the wisdom of the crowd -- the aggregated opinions of everybody. The problem is that the crowd isn't everybody. The crowd is pockets of interest, loud voices, competing truths.

Walk among your Facebook friends and ask for opinions about a car you're thinking of buying. You will know the lenses through which they see things, providing context for their opinions. This isn't true of Internet crowds. That's more like wandering into a cocktail party and not knowing who's there.

Consider the phenomenon of review sites like Yelp and TripAdvisor, where the same restaurant can get different reviews depending on which site you check. Why? Make a guess. The danger is that people think these opinion-aggregating sites are reporting something factual. This danger is magnified when we lend such crowd-sourced wisdom to investments, markets and politics.

Marketers have begun to wonder, for example, if online surveys are in fact a case of garbage in and garbage out. Pollsters have a similar problem trusting the universes they canvass. The statistician Nate Silver, meanwhile, has built a reputation challenging the conventional pundits who allude to squishy kinds of "data" in the opinions they peddle.

This is not a new problem. Advertisers, for instance, have known for two generations that Nielsen TV ratings are at best a good guess about viewership. And still zillions of dollars change hands on the basis of what Nielsen reports.

A wise crowd. A wise crowd has scale and full participation. How do you tell when that's true of a digital crowd?

Our first impulse is to get a sense of who is in the crowd by looking for markers. Do the voices in it seem to value the same things, use the same measures we do -- do they reflect our truth, in other words? Too often we seize on screwy things like shared cultural references or, worse, pay most attention to the loudest voice. We make these judgments in an instant without even being aware we're doing it.

Or take sites like Amazon and Yelp, which spotlight "top reviewers." Most likely a top reviewer is someone with a lot of opinions who collects virtual badges of honor (and has too much time on their hands). One study found that residents of Maine and Vermont, for some reason, are represented among Amazon's top reviewers at rates more than three times their percentage of the U.S. population. In such a system the risk is that, in the aggregate, reviews will skew to the noisy minority.

People tend to express opinion online if they love something or hate it. (That's our opinion, by the way. Better check for yourself.) Ever wonder why some restaurants get lots of five-star and one-star reviews, but not many twos, threes and fours? We all sort of know this, so we end up picking and choosing what we want to believe is true.
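That love-it-or-hate-it pattern is easy to see in numbers. Here is a minimal sketch with an invented set of star ratings (the counts are hypothetical, chosen only to illustrate the polarized shape): the distribution piles up at 5 and 1, and the single averaged star score hides that split.

```python
from collections import Counter

# Hypothetical ratings for one restaurant, illustrating the polarized
# "love it or hate it" pattern: many 5s and 1s, few 2s-4s.
ratings = [5] * 40 + [1] * 25 + [4] * 10 + [3] * 5 + [2] * 5

counts = Counter(ratings)
mean = sum(ratings) / len(ratings)

print({star: counts[star] for star in range(1, 6)})  # distribution by star
print(round(mean, 2))  # the single aggregate score that masks it
```

A place where 65 of 85 reviewers gave the extremes still averages out to an unremarkable-looking middling score, which is exactly why the aggregate star number tells you less than the shape of the distribution.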

Radical autonomy. The digital life promotes radical autonomy. That means we need new skills and new habits of mind for sifting the quality of digital information. In every case we need to ask what it means to go deeper and be our own data miners. We need to be empirical.

Google, for example, will give you information about who's searching for information about the flu and where the searches are concentrated. That tells you a lot more than a few dozen people remarking on Facebook that their kid is home from school and there seems to be flu going around.

Instead of aggregating reviews for a restaurant and handing out stars the way Yelp does, it would be more revealing to see empirical performance data about the place. OpenTable, for example, knows the demand for tables at every one of the restaurants in its database. If we want to know whether people are voting with their feet we could find out what percentage of tables are full every night and call up a trend line. That might predict a lot more about your enjoyment than the opinions of 23 reviewers who give the place an aggregate 3.4 stars.
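The occupancy-trend idea above can be sketched in a few lines. The nightly table counts here are invented for illustration, and the trend line is just an ordinary least-squares slope, not anything OpenTable actually publishes:

```python
# Hypothetical "voting with their feet" data: nightly occupancy
# (tables filled / tables available) over one week, plus a simple
# least-squares slope to see whether demand is rising or falling.
tables_available = 40
tables_filled = [22, 25, 24, 28, 30, 31, 33]  # one week of nights

occupancy = [filled / tables_available for filled in tables_filled]

n = len(occupancy)
xs = range(n)
x_mean = sum(xs) / n
y_mean = sum(occupancy) / n
slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, occupancy)) \
        / sum((x - x_mean) ** 2 for x in xs)

print([round(o, 2) for o in occupancy])
print(round(slope, 3))  # positive slope: the crowd keeps coming back
```

A rising slope is a behavioral signal: people who have actually eaten there are returning and filling tables, regardless of what the star aggregate says.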

If demand grows for more rigor, products will respond. We'll get better -- or our tools will -- at pattern recognition, rather the way Netflix uses an algorithm to identify people with similar tastes -- not by asking but by parsing real data.
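One common way to find "people with similar tastes" from behavior rather than opinion is cosine similarity over what users actually did. This is a toy sketch of that general technique, with made-up viewers and viewing histories -- not Netflix's actual algorithm:

```python
import math

def cosine_similarity(a, b):
    """Similarity of two behavior vectors, from 0 (nothing shared) to 1."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Rows are viewers, columns are five titles: 1 = watched, 0 = didn't.
alice = [1, 1, 0, 1, 0]
bob   = [1, 1, 0, 1, 1]
carol = [0, 0, 1, 0, 1]

print(round(cosine_similarity(alice, bob), 2))   # overlapping behavior
print(round(cosine_similarity(alice, carol), 2)) # disjoint behavior
```

Alice and Bob watched mostly the same things, so they score high; Alice and Carol share nothing, so they score zero. No one was asked for an opinion -- the signal is parsed from what they did.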

In the meantime, better than a "top reviewer's" opinion is the structure of an Amazon product page, which tells users what other shoppers interested in the same product looked at and eventually bought. That's based on behavior, not opinion. Imagine if Amazon also shared the return rate on every product it sold -- and the reasons for returns. Wouldn't that be more useful than the opinions of anonymous reviewers motivated who knows why?

The best approach is not cynicism but skepticism. Consider your source. Who sponsors the thing? Who does it attract? Learn to triangulate sources, just as sophisticated consumers of news read about the same story from several outlets to understand an event in three dimensions.

Most of us are still amateurs at making sense of information (unlike the professionals who know good data from bad and capitalize on that difference). Given a little more time living in the digital world we'll all get better at knowing fact from hot air.

Andre Dore

Financial Services Professional

9 yrs

As a business owner who uses Google Adwords and FB sponsored ads, I can't express how frustrating it is to interpret the metrics and the data they return.

Joe McCambley

Chief Marketing Officer at Saatva

9 yrs

Daryl this is an excellent, enlightening, educational piece of writing. Thank you.

Susan Shwartz, PhD

Financial writer and SF novelist RET.

9 yrs

Last point (aren't you glad you don't have to cut my stuff anymore?). The squishy logic. Having been on the Net since 1989 and participated in online communities, I know it's possible to "get" how a group feels and reacts. Because of my background in Science Fiction, I refer to this as the groupmind, as do a number of long-time users, more or less privately. I can best describe this by analogy with live audiences for theatre or opera. You KNOW when the audience has been pulled in. You can feel it. You can see it. If you can distract yourself from the play or opera long enough, you'll see people leaning forward intently. They won't move much. They aren't just silent (as they damn well should be); they're intent. And the proof comes at the end, when they are poised to spring, do leap to their feet and begin clapping and screaming. If they've really been captured, they'll pause before they scream and leap. Knowing one's own reaction to being part of a group like this is what "groupmind" behavior feels like. Yes, I know it sounds like a hivemind. It isn't: it's based on experience, knowledge and an almost subconscious process of multiple -- and I do mean multiple -- inputs. You know where I leave. Let me know if you want -- heaven help you -- to hear more.

Susan Shwartz, PhD

Financial writer and SF novelist RET.

9 yrs

Page 2. You know from working with me that I am NOT a quant, but that I am very good at squishy logic and extremely experienced on the Net. You also know that when I read a review, I am not reading just what the reviews say about the restaurant, but what they betray about the reviewers. How do I evaluate? "Netspeak" and improper ordinary English usage; tone of language; comments betraying resentment; the sort of hostility to staff that reveals discourtesy on the part of the reviewer all can cause me to dismiss a reviewer. So do unrealistic expectations of price. What makes me value a review and reviewer? The reviewer writes well enough to make me understand what s/he values and whether s/he received value. The reviewer is specific. The reviewer is objective. The reviewer isn't afraid to condemn ("we won't be going back") or rave ("You did a wonderful job for my XYZ's WHATEVER party. Thank you"). If the reviewer mentions names, I respect his or her courtesy with staff. Does THAT reviewer like or dislike the place? THAT reviewer is a determinant.

Susan Shwartz, PhD

Financial writer and SF novelist RET.

9 yrs

Your fascinating article made me realize that many of the purchasing decisions I make online are based on a decision process that is as subliminal as it is subconscious. This process is based on a lifetime of rigorous critical thinking and honed by an awareness of net culture that goes back to 1989. One simple question: Would I check out the restaurant evaluated on those four websites? I would. Here's why. Even without looking at N, I'd discount much of Yelp because many participants do Yelp. They seem a hostile, demanding crowd with unrealistic expectations of price, value and service. Facebook: much larger N and skewed toward the outcome I want -- people love it. I've just recently become active there. I don't know the culture well, but that N is tempting. Tripadvisor: I read it, but don't participate. I tend to think of it as more cautious -- out-of-towners who take fewer risks. I do participate actively on OpenTable: small N, but because I know OpenTable, I would read each review carefully. The reviews there can be wildly enthusiastic or critical, but many are knowledgeable. Above all, 3 out of 4 of the sites are in a close consensus. I'd "guesstimate" 4.0 to 4.4 out of 5.0 as the consensus and look for outlying phenomena -- type of food, decor, etc. But I would suggest it to my usual group of people who check out places. I might even try the dishes suggested.
