5 Lessons Learned From My Big Research Project

Last week, after more than a month of hard work, I released my research report “Influencer Marketing In A COVID-19 Era.”

If you haven’t had a chance to read the report, go to my profile, find the “featured” section, and the report is the first thing that pops up. If you don’t have the time to read a 30-plus-page report, I also put together a TL;DR deck that goes over my findings. Seriously, it’s great stuff.

This was a big professional accomplishment for me. I’ve participated in other research projects before. I’ve written white papers and trend reports, even a short e-book. I’ve designed a few surveys and crunched lots and lots of data in business school. I’m doing a couple of segmentation projects in my consulting work. But this is the first time I’ve led a research project from conception to completion.

I took on this research project because I wanted to strengthen my qualitative and quantitative research skills (and because I was frustrated at getting rejected for jobs that said I needed research experience, but let’s not dwell on that now). Everything I learned about influencers – or at least a lot of it – is in the report.

But along the way I learned a ton about research – more than any LinkedIn Learning class could have taught me. And what good are lessons if I don’t share my knowledge with others?

So what did I learn? Here's a snippet:

Don’t make assumptions.

When designing the quantitative portion of my research, I thought I did a pretty good job. I designed questions that would serve as a strong complement to the qualitative interviews (which, frankly, were the meat and potatoes of the project).   

However, some of the questions left too much to the imagination. For example, the third question in the quantitative portion of my research asked, “About how much time do you spend on social media per day?” I didn’t elaborate on the term “social media” – after all, if you are a mammal answering questions on your phone, you know what social media is, right?

Probably. But does YouTube count as a social media platform? What about WhatsApp? Facebook Messenger? Whether services like these count as social media platforms can significantly sway answers.

(As a side note, I debated for a very long time whether YouTube was a social media platform. In some questions YouTube was included; in others it wasn’t. The results were very interesting, so again, check out the report.)

I also asked the question, “By your estimation, which social media platform do you believe is the most important for your peer group?” What does peer group mean? In my head, that meant friend group. But peer group to other people could mean age group. It could mean people in your social class. Simply put, the term “peer group” could have been better defined.

In fairness, that doesn’t mean the results from these questions were wasted. In fact, by leaving terms such as “social media” and “peer group” undefined, I learned things I wouldn’t have otherwise. That said, if I had to do the project all over again, I’d be more specific in a few areas.

Balance structure with having a free-flowing conversation.

In any research project, you want to have a clear methodology. Otherwise, it will be hard to elicit common themes. On the flip side, in qualitative interviews, you also want to have a free-flowing conversation to gain insights you might have otherwise missed.

It’s a difficult balance, and one I got better at striking as the project went along. I aimed for a 70-30 approach: 70% of the questions came from a list I had predetermined before the interview. I had conversations with three groups of people – marketers, influencers and talent representatives – and devised the same questions for everyone in a particular group, with some questions asked of multiple groups.

Meanwhile, 30% were spontaneous questions to gather more insight. Maybe it ended up being 65% structured, or 75%. Who knows – I didn’t keep that close track. But I learned a lot from both the structured and unstructured questions that I asked.

Don’t be so rigid that you miss out on interesting conversations. But don’t be so freewheeling that your report lacks credibility.

Older people are more likely to answer online surveys. So are women.

Before the full quantitative survey went out, the research firm I worked with, Dynata, started with a “soft launch,” sending the questionnaire to a representative sample of Americans. Once 36 responses came in, the soft launch was complete.

More than a third of the respondents were age 65+. And 23 of the 36 were women. It’s a small sample, sure, but it was lopsided enough to know that something was up. I asked the folks at Dynata, and yes, older people and women are more likely to answer online surveys.

If anything, I would have thought younger people would be more likely to answer a survey about social media. And social media doesn’t seem like a topic that would interest one gender over the other (compared to a survey about, say, “Attitudes on UFC Fighting” or “Attitudes on Nail Salons”).

I have a few theories on why older people are more likely to respond to online surveys. Many of them are retired, so their inboxes are less full of work emails and the like. Moreover, older people are less likely to have a million social media apps on their phones competing for their attention (my research found that older people spend less time on social media and have fewer social accounts).

As for the women part, I have no idea why they are more likely to respond. I Googled for answers but couldn’t find any concrete reasons. I asked some of my marketing research friends and they didn’t know, either. Seriously, if someone has any idea why this is, please let me know in the comments. I’d greatly value your input. Just know that, in general, women are more likely to answer online surveys. 

Fortunately, Dynata researchers were able to calibrate the responses to ensure we had strong gender and age representation. That would have been harder to do with a do-it-yourself service, especially for someone who isn’t a deeply seasoned quantitative researcher.
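That kind of calibration is typically done with post-stratification weighting: each response gets a weight so that the weighted sample matches known population shares for groups like age and gender. I don’t know the specifics of Dynata’s method, but here’s a minimal sketch of the basic idea (the group labels and population shares below are illustrative, not from my survey):

```python
from collections import Counter

def poststratify(responses, population_shares):
    """Compute post-stratification weights for a lopsided sample.

    responses: list of group labels, one per respondent (e.g. "F" or "M").
    population_shares: dict mapping group label -> that group's share
    of the target population.
    Returns a dict mapping group label -> weight, where
    weight = population share / sample share for that group.
    """
    n = len(responses)
    sample_counts = Counter(responses)
    return {
        group: population_shares[group] / (sample_counts[group] / n)
        for group in sample_counts
    }

# Hypothetical example: women are 60% of the sample but ~51% of the
# population, so each woman's answer counts a bit less than one
# respondent, and each man's a bit more.
weights = poststratify(
    ["F"] * 60 + ["M"] * 40,
    {"F": 0.51, "M": 0.49},
)
```

Real panels use far finer cells (age × gender × region and so on) and trim extreme weights, but the principle is the same: overrepresented groups get down-weighted, underrepresented groups get up-weighted.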

Avoid yes or no questions in quantitative research. Or at least minimize them.

When I first designed my quantitative survey, it was full of yes-or-no questions. For instance, I asked, “Are you familiar with the term ‘influencer’?” – a question survey takers could answer with either “yes” or “no.”

The fine folks at Dynata told me that would be a bad idea. If a survey is littered with yes-or-no questions, respondents may just keep clicking “yes” until they reach the end – survey takers get paid regardless of their honesty.

But when respondents have to do more thinking, they’re more likely to answer truthfully. So the question was reworded to “Which of the following social media concepts are you familiar with?” People could choose all that applied:

- Social networker

- Microblogger

- Photo sharing

- Video sharing

- Influencer

- Troll

- Hashtag

- Unfamiliar with any of these terms

Honestly, I don’t care whether people know what a troll or a hashtag is, at least for this project. But worded this way, the question more reliably reveals whether people are familiar with the concept of an influencer.

I did leave a few yes-or-no questions in the survey where there was no real way around them. But the fewer a survey has, the better.

You’ll always wish you had asked questions you didn’t ask. Don’t worry about it and be grateful for the insights you’ve gained.

At heart, I’m a journalist. I’m not one professionally anymore, but I’m very curious, I love asking questions and I love chasing fascinating stories.

In many ways, market research is like a journalism assignment – albeit a structured one. But unlike writing, say, a magazine article, you only get one stab at designing questions, polling a sample of Americans and conducting qualitative interviews.

As someone who’s “trying to figure out what’s next” in his career, I needed to keep costs pretty low. My budget covered 15 questions and 300 responses (I ended up using 14 questions and got one extra response). So there was only a limited number of questions I could ask.

I got a lot of great information, but there’s so much I wish I had asked. For instance, I asked how much time people spent on social media. I wish I had used my final allotted question to ask how their social media usage had changed in the last six months, since that would have fit nicely with the COVID theme.

My suggestion? Do a bit of secondary research on your topic before designing the primary portion; by then you’ll have a better sense of what you need. I did some secondary research, but I was so eager to get to the fun, primary stuff that, in hindsight, I could have spent a little more time doing the legwork.

But regardless, if you take on any research project of substance, there will be plenty of times you’ll say to yourself, “Darn, I wish I had asked that question.” Let it go. You have great stuff to play with!

Any ideas on what my next project should be? I’m taking suggestions below. Also, if you enjoyed what you read, be sure to check out more of my articles:

What Took So Long, Washington Redskins?

Uh Oh! Zuck’s Really Feeling The Heat

Amazon, Attorneys and Antitrust, Oh My!

And seriously, check out my research! You’ll be glad you did!
