DON'T PANIC: How to pass an interview like an Amazonian

I conduct a lot of interviews here at Amazon. For most candidates, the process involves a phone screen followed by what we call a loop: roughly five people asking you about your work history for about an hour each. I can't divulge the specifics of that process, as it varies somewhat from team to team, but I can say that people who pass the loop offer very detailed answers across a range of examples. They define the business problem, describe the metrics used to measure the current state, identify meaningful drivers for change, and then benchmark their success, showing along the way that they did enough to improve the process without going so far as to over-engineer a solution. Bonus points for proving that you automated something along the way. If you can do this for at least 10 different examples, you will fare well. I'm going to use this post to demonstrate the difference between a bad, a good, and a great answer, so you can come to an interview with any company prepared like an Amazonian.

Let's take a question you are unlikely to encounter in your actual loop but that mirrors the kind of questions you will. I'll then give answers at different levels of detail, along with how I would rate each one, so you can internalize the differences and apply them when you respond to questions like this in the future.

HYPOTHETICAL QUESTION: "Tell me how you know you made a good product."

A BAD ANSWER: "I always do good work, I'm very prepared, so I did my homework and I made sure we covered what our partners asked for. I met all the requirements in record time under budget."

WHY THIS IS BAD: This tells me you think you did great work, but it gives me no evidence of how you measured your efforts, who was impacted, or whether you actually met your deadlines. I cannot assess the impact of your work. Honestly, I can't tell much of anything beyond your own claim that you are great at what you do. How do I know you did anything here I would not expect of anyone in the role? For that matter, how can I tell you did anything at all and didn't just coast on the collective efforts of the team? I just have to trust you. That won't get it done.

A GOOD ANSWER: "I've been on a ton of projects like this in the past, I think we've launched at least 20 of these things in the past few years. I have kind of a standard process now, first I sync with my manager and then my partners, in this case I had to work with Legal and our development team to make sure we met the requirements. We have 4 KPIs and we checked that we hit them all. We budgeted $1M for this project and we had 6 months to complete it, and in both cases we beat those targets. Our roll out was great too, we found that within 6 months almost everyone used the solution."

WHY THIS IS GOOD: This starts to give me an idea of how often you've done these things in the past and hands me reasonable follow-ups, like "Can you tell me about your standard process?" or "What are your KPIs?" It also gives me a sense of the size of your projects and how you view urgency and time, and it alludes to benchmarking and impact. It could do more with specific measures, but it at least lets me, as an interviewer, take note and ask relevant follow-ups where they make sense for the role.

A GREAT ANSWER: "In the past 5 years, I've launched 19 different projects, three of which had almost the same key requirements as this one: it needed to service 1,200 concurrent users, track engagement over time, and be deployable in our three key regulatory environments of Asia Pacific, Europe, and the Americas. The regulatory challenges were actually our biggest hurdles because the privacy regulations differed in each of these environments. To address this, I identified the key stakeholders and proactively set up weekly touch bases with our regulatory counterparts in each region. I would then bring that information back and work with the product team to adjust things as necessary. To keep track, I built a risk register logging 21 critical issues, each with a deadline for when it had to be resolved so we would not miss our launch. We also flagged any that were project stoppers, issues that would kill the project entirely. The toughest one was definitely how we handled retaining information in Europe; as you probably know, the privacy regulations there can be difficult to manage. We found a workaround by leveraging our intake mechanism: we still captured personal information, but we created an encryption mechanism so that the performance data we stored was housed in a separate warehouse from the demographic information. While I didn't build the actual database systems or the encryption models, I worked with the Legal and Product teams to make sure they could align. Once we thought we had a workable prototype, I brought the pilot solution to our three key markets to ensure it was viable in each of them. We ended up user testing with a little over 70 people, about 10% of our expected users, and we took their feedback to change our layouts and to create quick-launch dashboards that got folks to the parts of the product they needed more efficiently. We felt good when adoption hit 54% in the first month, given our target was 50% by month three, and we were able to phase out the legacy systems in 6 months, three months ahead of schedule. Still, if I had to do it all over, I think we could have done more to beta test the onboarding instructions. We thought it would be easy for folks to come up to speed, but we got far more tickets about onboarding than we expected, partly because the instructions broke down across language differences. In hindsight, if we had implemented an online training program, we probably could have automated that onboarding and solved a lot of issues pre-emptively."

WHY THIS IS GREAT: This answer provides context on how familiar the respondent is with launching products like this and gives me confidence in their experience in the space. While a little wordy, it's a fluid account of the specific checks and balances this person put in place, and every element addresses the central question: "How do you know this was a good product?" It covers how they researched the problem, how they approached a solution, how they validated it, and what they would do better next time. It doesn't restate or over-explain any one element; instead it offers a high-level assessment of the process with a dose of self-reflection. Perhaps most importantly, it demonstrates a mastery of the metrics that matter and of how to measure impact. When I get a response like this I'm pretty much like "yep, this interview could end now, this person seems legit."

A WARNING ON REPETITION: Note that if this person had 8-10 other stories like this, they would probably make the cut. If they told this exact same story to every interviewer, however, we would most likely conclude that they show great potential but not much breadth for the role. Without a range of examples, most candidates come across as more junior in their careers, with high upside potential. Bring a range of examples so your interviewers get visibility into everything you've achieved.

With this perspective in mind, take the time to measure the impact of your achievements and come prepared to speak to how you measured success. If you do, you will give yourself the best chance of landing the job. And remember, these folks want to learn more about you, so stay relaxed. If you can enjoy highlighting how you approach problems, your audience will appreciate it even more.

Good luck in the process. And if you find this helpful, please consider sharing it with anyone else who could benefit from reading it.
