How to Execute High Impact People Analytics Surveys
Surveys can be a fast and effective way to generate employee insights in service of desired organizational outcomes. Too often, though, they result in little change or action. On the People Analytics team at Genentech, we’ve spent the last few years refining our process to maximize impact. During a recent internship, Thaddeus Demeke helped collate our six-stage survey process into a “Survey Playbook” that serves as an internal guide for future projects. In hopes of contributing to the development of successful survey practices and generating more conversation on this topic, we’ve outlined the key elements of the playbook below.
Stage 1: Define Goals
There are three steps within the “define goals” stage: Define business goal and scope → Define method of insights delivery and timeline → Develop hypotheses
To assess whether a survey project is a good fit for our team we evaluate it on four primary criteria:
- A compelling business goal (which, for us, typically means specific, high impact, people decisions that the data will directly inform)
- A need for in-depth statistical analysis (without which our team is not needed)
- A willingness to collaborate (and cede some degree of control)
- Availability of People Analytics resources
If we decide the project is not a great fit for our team, we offer light consultation to help the client run the survey successfully and independently. This allows us to be helpful without taking on a heavy workload, and it also makes the decision not to directly manage the survey project more palatable for the client.
Stage 2: Develop Survey
There are four steps within the “develop survey” stage: Select measures → Write questions → Draft survey language → Collect and apply feedback
It can be tempting to begin crafting questions very early in the process. In our experience, it’s more effective to take the time to clearly define the survey’s purpose and goals before beginning to think about questions. In this stage, it’s critical to understand who the key stakeholders are and who else will be providing input (as well as when their feedback will be solicited and “due by” so that timelines can be met). When we develop the list of questions we include a column titled “rationale for asking” so everyone is clear on the unique purpose of each individual question. This section of the playbook is also full of tips, tricks, research, and internal best practices (standard scale types, statistical tests, and visuals) for how to design an effective survey in general, but also specifically within our unique organizational context. This makes it easier to onboard new team members to our processes and also serves as a helpful reminder for the broader team.
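As an illustration, the question list with its “rationale for asking” column can be kept as simple structured data. The sketch below assumes a minimal Python representation; the field names and example questions are ours, not the playbook’s actual schema:

```python
# Hypothetical question tracker with a "rationale for asking" column,
# as described above. Field names and questions are illustrative.

questions = [
    {
        "id": "Q1",
        "text": "My manager supports my professional development.",
        "scale": "5-point Likert (Strongly disagree to Strongly agree)",
        "rationale": "Informs manager-training investment decisions.",
    },
    {
        "id": "Q2",
        "text": "I have the tools I need to do my job well.",
        "scale": "5-point Likert (Strongly disagree to Strongly agree)",
        "rationale": "Flags resourcing gaps ahead of the budget cycle.",
    },
]

def missing_rationale(qs):
    """Return the ids of any questions lacking a stated rationale."""
    return [q["id"] for q in qs if not q.get("rationale", "").strip()]
```

Running a check like `missing_rationale(questions)` before the review meeting helps ensure no question survives without a clearly stated purpose.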
Stage 3: Test and Launch Survey
There are three steps within the “test and launch survey” stage: Identify target population → Test survey → Launch survey
This stage is probably the “least sexy” but is nonetheless important. You don’t want to spend weeks planning a survey only to have the link not work, or data collected anonymously when it was supposed to be collected confidentially, or questions with typos, etc. We send a test survey email to everyone on the team so they can proofread and also test the flow to ensure everything is working correctly (from front-end to back-end). After identifying the target population, we will also determine which data (i.e. employee attributes such as department or tenure) will be critical to the analysis and whether it is available in our existing data pipelines vs. a “custom wrangle job.”
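That last check — pipeline availability vs. a custom pull — can be sketched in a few lines. The pipeline attribute names below are hypothetical examples, not our actual data model:

```python
# Hypothetical pre-launch check: confirm which employee attributes the
# analysis needs are already in existing data pipelines, and flag the
# rest as requiring a custom wrangle. Attribute names are illustrative.

PIPELINE_ATTRIBUTES = {"department", "tenure_band", "location", "job_level"}

def classify_attributes(required):
    """Split required attributes into pipeline-available vs. custom-wrangle."""
    required = set(required)
    available = required & PIPELINE_ATTRIBUTES
    custom = required - PIPELINE_ATTRIBUTES
    return sorted(available), sorted(custom)
```

Surfacing the “custom wrangle” list this early keeps data-engineering surprises out of the analysis stage.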
Stage 4: Analyze Results
There are two steps within the “analyze results” stage: Analyze data → Confirm hypotheses were correctly tested
The “analyze data” step can vary dramatically based on the survey design, stakeholder preferences, and use cases of the audience. Ideally, the survey project manager has achieved awareness and alignment in advance so that stakeholders know exactly what hypotheses will be tested and what additional analyses will be run. The more clarity you have around the purpose (again, ideally to inform specific decisions or actions), the more naturally this stage tends to flow. Even with effective planning, we’ll typically still have a fair amount of collaboration between data scientists and the survey project manager during this stage. Over time, we’ve built our survey platform so that particular statistical tests run automatically and deeper data science is enabled (e.g. by defining the Python data type for each field ahead of time).
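To illustrate the idea of type-driven automatic analysis, here is a minimal sketch assuming a hand-rolled schema and a default analysis per field type. The schema, field names, and the choice of Welch’s t statistic are our illustration, not Genentech’s actual platform:

```python
import statistics

# Illustrative: declaring each field's type up front lets a default
# analysis run automatically. Schema and test choices are assumptions.

SCHEMA = {"engagement_score": "numeric", "department": "categorical"}

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = (va / len(a) + vb / len(b)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / se

def auto_analyze(field, group_a, group_b):
    """Dispatch a default analysis based on the field's declared type."""
    if SCHEMA[field] == "numeric":
        return {"test": "welch_t", "statistic": welch_t(group_a, group_b)}
    # Categorical fields get frequency counts rather than a mean comparison.
    counts = {}
    for value in group_a + group_b:
        counts[value] = counts.get(value, 0) + 1
    return {"test": "frequencies", "counts": counts}
```

In practice a production pipeline would reach for a statistics library rather than hand-rolled formulas, but the dispatch-by-declared-type pattern is the point.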
Stage 5: Communicate Insights
There are four steps within the “communicate insights” stage: Create insights presentation → Collect feedback on analysis interpretation → Collect feedback for final deck → Share survey output with client
This stage can vary quite a bit too, but we are typically delivering an insights deck, a dashboard, or both. Our dashboards are built using the Python-based Plotly “Dash” tool, which gives us the flexibility to deliver insights in the optimal format for each survey project. Over the past year, we have begun creating libraries of code that allow us to more quickly and easily generate our commonly used plots and dashboard functions. Dashboards often incorporate custom filters based on what is most relevant to a particular project or set of stakeholders. If we are creating a deck, we typically begin with an outline so that we can get really clear on the storylines before moving to visuals and slides. Most of the deck visuals come from the dashboard, from separate statistical analyses conducted by data scientists in Python, or from something as simple as a Google Slides table. As with every other stage in the process, feedback is key to success. We make sure the whole team reviews the final output along with other stakeholders, project sponsors, or colleagues willing to lend a helping hand. Everything makes perfect sense when you’ve been staring at it for the last two weeks; fresh eyes often see things much differently. In the deck, we’ll typically lead with an Executive Summary that clearly lays out the key insights along with recommended actions for each. We also include “headlines” (the ultimate takeaway from the visual) written in plain English on each slide so it is clear what the data is showing (e.g. a header could read “Employees in manufacturing are more likely to view their manager as supportive” followed by the corresponding visual). Ultimately, the goal is to keep the slides brief, highlighting actionable insights and saving additional details for the appendix.
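The “headline” convention can even be partially automated. Below is a hypothetical sketch that turns a simple group comparison into a plain-English slide header; the wording templates and parameter names are our illustration, not a fixed internal standard:

```python
# Hypothetical headline generator: convert two group means into a
# plain-English slide takeaway. Templates are illustrative only.

def headline(group, other, metric, group_mean, other_mean):
    """Produce a plain-English slide headline from two group means."""
    if group_mean > other_mean:
        return f"Employees in {group} are more likely than {other} to {metric}"
    if group_mean < other_mean:
        return f"Employees in {group} are less likely than {other} to {metric}"
    return f"Employees in {group} and {other} {metric} at similar rates"
```

A real version would also gate the wording on statistical significance, so a trivial difference in means never gets promoted to a headline.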
Stage 6: Conduct Retrospective
In this final stage, the survey project team gathers to explore:
- Things that went well
- Areas for improvement
- Future actions to take in order to improve next iterations
The survey playbook is then updated to reflect desired process improvements for future survey projects, thus completing the circle of (survey) life. In this stage, it is important to give everyone involved a chance to provide input and then be very clear about what comes next for each point: no action taken, process changed, further investigation needed, etc. That way, this exercise doesn’t become a black hole of reflection.
Call for Feedback
How does your People Analytics team manage survey projects? Is there anything you’ve found critical that we are forgetting? Or elements that stood out as helpful that you may try adopting into your own processes?
We would love to hear from you in the comment section below.