3 questions every PM must answer

In 2018, as a new product manager (PM), I was overwhelmed by the seemingly never-ending list of activities required to achieve my goals: user & market research, UI/UX design, software architecture, feature specifications, backlog management, roadmap planning, and more.

“Where should I start?” was a constant, often paralyzing thought.

With the help of my experiences and mentors, I’ve arrived at a simple framework to guide me: 3 questions every PM must answer

  1. What problem are we solving?
  2. Why are we solving it?
  3. How do we measure success?

I partner with key stakeholders to answer these questions before I write a single feature requirement.

This simple framework creates clarity and alignment for product teams, enabling effective collaboration to solve customer problems while delivering business value.

1. What problem are we solving?

Aligning on the problem is the most important (and at times, most challenging) thing I do.

It's challenging because I have countless stakeholders, each of whom may have a different perspective on what problem should be solved. These stakeholders include designers, engineers, and PMs from both my team and the partner teams who build Microsoft’s 264 (and counting) Azure products.

It's important because a lack of alignment amongst stakeholders can slow - or completely halt - progress. A couple of examples:

Scope creep

A comic strip about a salesperson who wants to add more features to a product that is already behind schedule

Imagine you’re struggling to finalize requirements. Why? Partner teams heard about the work you’re doing. They email you:

“Hey! You know, if you just add a few more requirements, you’d cover one of our biggest scenarios and the customer impact would be huge. Look at this data!”

They make a compelling argument. You draft some additional requirements, ask your engineering team for an effort estimate, and work with your designer to generate some new mocks. Rinse and repeat…

Or…

Lack of clarity on what to build

Imagine you’re beginning development on a new feature with a button and your team asks, "How many options are in the drop-down?" The team believes a drop-down is better than a button, referencing product usage patterns as proof.

You set up another meeting to review customer and market research that supports the button approach. Work begins, albeit delayed, but you have a bigger concern: you sense that the development team lacks confidence in the approach you’ve proposed.

If these things never happen to you, you are awesome at creating clarity.

When these things happen to me, I revisit (or sometimes document for the first time) my answer to "what problem are we solving?".

Although feature scope and product maturity can affect the way I answer this question, knowing the customer and the problem space is always critical.

Know the customer

Imagine you’re designing an alarm clock. Got it? Good.

Now imagine you’re designing one for people with hearing loss. You're solving a completely different problem. You can’t effectively answer, “what problem are we solving?”, without knowing who has the problem.

Note: Customer and user are used interchangeably to refer to the user of a product.

I employ a combination of user research and in-product data to understand my customers’ characteristics, goals, and pain points.

User Research 

Research specialists guide me in using common frameworks like Jobs to Be Done to generate quality research questions.

I then execute and analyze results from interviews, surveys, and focus groups. Research specialists also help report the findings and statistically significant results in a standard format.

Keeping in mind that research studies only cover a subset of users and users do not always articulate their needs clearly, I combine user research with in-product data for a more complete picture of the user experience.

In-Product Data

Our product captures data on every interaction and surfaces satisfaction prompts to gather feedback from users.

I use database queries and tools like Power BI to visualize this in-product data. When my analysis reinforces pain points identified during user research, I know we’re on track to identify the right problem to solve.
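To make this concrete, here's a rough sketch of the kind of analysis I'm describing; the file, event names, and columns are invented for illustration, not from a real product schema:

```python
import pandas as pd

# Hypothetical export of in-product telemetry: one row per user interaction.
# The file name and columns (user_id, event_name, duration_ms) are illustrative.
events = pd.read_csv("interaction_events.csv")

# Share of active users who hit the suspected pain point at least once.
affected = events.loc[events["event_name"] == "deployment_failed", "user_id"].nunique()
active = events["user_id"].nunique()
print(f"{affected / active:.0%} of active users hit the failure at least once")

# Median time to complete the task, to compare against what users reported in research.
completions = events[events["event_name"] == "create_resource_completed"]
print(f"Median completion time: {completions['duration_ms'].median() / 1000:.1f}s")
```

When numbers like these line up with what users said in interviews, I treat that as a strong signal that we've found a real problem.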

One of my biggest failures as a PM resulted from a lack of knowledge about my customers. Knowing the customer ensures that you choose problems to solve that truly matter.

Know the problem space

Even if you’ve chosen the best problem to solve, not knowing about others in the space can damage your credibility.

Have you ever reached the Q&A section of your presentation only to get hit with a question you were completely unprepared for?

I have. Knowing the problem space builds credibility with customers, partners, and my team.

Customers love to talk about their pain points and often tell me much more than expected. The same thing happens when speaking to partner teams – I approach a conversation to learn about one problem and leave with questions about 5 more problems.

I use this information to build a mental model of the problem space. In subsequent conversations, when I show a backlog that addresses the entire problem space, I’m able to drive a more focused discussion with less time spent digressing into related problems.

To answer, “what problem are we solving?”, I must understand the problem space broadly - including the target user, the problem I’ve selected and other related problems. Then, I must clearly communicate justification for my selection, which leads to the next question.

2. Why are we solving it?

You may have heard "finding your why" as a key to success in life. That's because a compelling why is an excellent motivator.

Effectively communicating why your team should solve a problem makes it easier to earn commitment.

To answer, “why are we solving it?” effectively, I always bring data and use an example.

Bring data

When I describe why we’re solving a problem, I define the size and impact of the problem with data.

Recently I demonstrated that one in every three (33%) users faced a problem, resulting in a 46% reduction in task efficiency. Other problems in the space were smaller in scope and impact. Stakeholders were instantly motivated to collaborate on a solution.

Bringing data might include a range of tasks - from surveying customers, to identifying patterns in qualitative data, to writing database queries.

Bringing data also includes establishing decision criteria (e.g. prioritize a problem affecting the greatest number of users) so that your team understands the reasoning behind your recommendation.

A quote from W. Edwards Deming: "Without data, you're just another person with an opinion"

Data speaks louder than words and instills confidence in the team’s prioritization. Presenting data, establishing decision criteria, and communicating recommendations are important skills for garnering support for my decisions.
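As a sketch of what "establishing decision criteria" can look like in practice, a simple weighted score makes the reasoning behind a recommendation explicit. The problems, percentages, and weights below are invented for the example:

```python
# Illustrative prioritization: score each candidate problem against agreed criteria.
# Every problem, percentage, and weight here is made up for the example.
problems = [
    {"name": "slow deployments", "users_affected_pct": 33, "efficiency_loss_pct": 46},
    {"name": "confusing error messages", "users_affected_pct": 18, "efficiency_loss_pct": 20},
    {"name": "missing export option", "users_affected_pct": 9, "efficiency_loss_pct": 12},
]

# Decision criterion: weight reach (how many users are affected) above severity.
def score(problem, reach_weight=0.6, severity_weight=0.4):
    return (reach_weight * problem["users_affected_pct"]
            + severity_weight * problem["efficiency_loss_pct"])

for problem in sorted(problems, key=score, reverse=True):
    print(f"{problem['name']}: score {score(problem):.1f}")
```

The specific weights matter less than agreeing on them with stakeholders up front, so the ranking isn't seen as one PM's opinion.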

Use an example

To complement the data, I use an example scenario based on recent in-product data, including verbatim feedback from users.

To generate empathy, I give the user a name, provide context for their actions, and highlight how the problem hindered their goals.

A PM must show (vs. just tell) their team why the problem is important, then highlight the benefits that come from solving it - which leads to the final question.

3. How do we measure success?

Screenshot of the "How do I win" section in the Monopoly game manual

It’s difficult to win a game if you don’t know how; that’s why every board game manual includes a "How you win" section.

Similarly, every product team needs a clear vision of success to execute effectively. Answering this question is how I align my team on what success looks like.

The success of a feature typically involves some combination of 3 categories: engagement, satisfaction, and effectiveness. Said differently: “are customers using it?”, “do customers like it?”, and “is the feature working?”

Engagement (Are customers using it?)

No team wants to build a feature that doesn’t get used. Choose an engagement metric (e.g. clicks or active users) and pick a target.

I put a dashboard in place before my feature launches to measure engagement and track progress.
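For illustration, here's a minimal sketch of the kind of check a launch dashboard automates; the file, event name, and target are placeholders, not real values:

```python
import pandas as pd

# Hypothetical daily telemetry export; the file and column names are placeholders.
events = pd.read_csv("feature_events.csv", parse_dates=["timestamp"])

# Engagement metric: weekly active users of the new feature, compared to a launch target.
feature_clicks = events[events["event_name"] == "new_feature_clicked"]
weekly_active = feature_clicks.groupby(pd.Grouper(key="timestamp", freq="W"))["user_id"].nunique()

TARGET_WAU = 500  # illustrative target agreed with the team before launch
for week, wau in weekly_active.items():
    status = "on track" if wau >= TARGET_WAU else "below target"
    print(f"Week of {week.date()}: {wau} weekly active users ({status})")
```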

Satisfaction (Do customers like it?)

In-product prompts, surveys, interviews, and focus groups all work well for measuring satisfaction.

To make qualitative feedback measurable, I create a consistent set of questions and look for patterns in the responses (e.g. 8 out of 10 interviewees mentioned slowness in the UX).
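One lightweight way to do that pattern-finding, sketched below with invented notes and keywords, is to tally how many interviewees mention each theme:

```python
# Illustrative tally of recurring themes across interview responses.
# The notes and theme keywords are invented for the example.
interview_notes = [
    "The portal feels slow when I open the page",
    "Loading takes forever, but otherwise it's fine",
    "I like the layout, but it lags on large subscriptions",
    "No complaints, works as expected",
]

themes = {"slowness": ["slow", "lag", "loading", "forever"]}

for theme, keywords in themes.items():
    mentions = sum(any(k in note.lower() for k in keywords) for note in interview_notes)
    print(f"{mentions} of {len(interview_notes)} interviewees mentioned {theme}")
```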

Effectiveness (Is the feature working?)

Recall your answers to "what problem are we solving?" and "why are we solving it?". Good answers include who has the problem, the size of the problem, and the impact of the problem.

If the target persona no longer has the problem, if the size of the problem has shrunk, and/or if the impact of the problem has been mitigated, it’s likely the feature is effective.
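To make that check concrete, a toy before/after comparison (the numbers are invented) shows how I'd frame the change in the problem's size and impact:

```python
# Illustrative before/after comparison of the problem's size and impact (invented numbers).
before = {"users affected (%)": 33, "task efficiency loss (%)": 46}
after = {"users affected (%)": 12, "task efficiency loss (%)": 15}

for metric in before:
    drop = before[metric] - after[metric]
    print(f"{metric}: {before[metric]} -> {after[metric]} (down {drop} points)")
```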

I’ve used both qualitative and quantitative data to measure effectiveness; however, qualitative data has been most effective in proving the impact of my features and fostering promotion discussions.


Clearly defining success helps my team measure progress and demonstrate impact. It also enables me to use data to identify the next problem to solve and start the process all over again.

Thanks for reading!

And please, share your thoughts in the comments. Let’s sharpen our PM skills together.

Thanks, Felix, for sharing that; it was very beneficial for me as I'm still very fresh in PM knowledge!

Paul Oladimeji

Product Manager | I build and grow products from ideation to scale

2 yr

Great post, Felix Watson Jr. I just discovered your profile and it's definitely a must-follow. To add, I particularly like the Opportunity Solution Tree (OST) framework by Teresa Torres for clarifying & crystallizing customer problems for the following reasons: 1) It frames problems as opportunities, which are easier to define & measure in terms of impact. 2) When doing customer/problem discovery, it is common to find that: [i] there isn't a shortage of ideas, and [ii] sometimes problems are broad or not clearly defined. In these cases, the OST helps to break down the problems into smaller ones that can (i) be more easily compared to each other in terms of impact and can be the basis for actual features, and (ii) help validate hypotheses since they are much smaller in scope.

Frank Ramirez

Head of Products/ COO | Product Management

2 yr

A good read; getting a basics refresher for clarity is important. I very much like the simple frameworks and prescriptive guidance. I think a bit more depth on prioritization is in order, and there needs to be some contextual discussion of resource limitations, or else this may be perceived as an academic, idealized, ivory-tower view divorced from reality. A lot changes when you have substantive resource constraints. The burden of an entrepreneur or smaller firm is the issue of opportunity costs: do I do more research, or do I build features? Is my qualitative, non-statistical data good enough? The UX is not great, but is it good enough? Should I allocate more budget to marketing or to development? Net, most business is small business, and being a PM in a resource-constrained environment presents different challenges than being a PM in a Fortune 50 organization. Note: My POV is informed by my experience as a 4P manager at Microsoft/Qualcomm/T-Mobile and then as a startup PM. Knowing your customer is not simple, primary data is not as accessible, and the consequences for missteps are much more costly. This is one reason I am focused on making actionable insights more broadly available.

Joe Reynolds

Ops Associate @ Skylight | Technology, Product, Ops | Creating Access and Equity

3 yr

"It’s difficult to win a game if you don’t know how; that’s why every board game manual includes a "How you win" section." I love this acknowledgement and wish this was something said more in relationship to life. Articles like this contribute to the community game manual we need to build for these roles. In regards to "What problem are we solving?", how do PMs normally receive their problems? Through discovery, business objective. Does that relate to what kind of PM you are: growth vs systems?

Manmeet Singh

Trust & Safety | Policy Manager | Content Policy | Program Manager | Policy Enforcement | Content Moderation | Content & Community Operations. "Enabling healthy on-platform conversations"

4 yr

Very well written!
