Contrasting People Analytics Approaches

In This Chapter

* Understanding how your primary goal affects people analytics project design

* Deciding on a method of planning and mode of operation for people analytics

* Designing a people analytics solution that will give you what you need

Too often, I hear that a company has spent years setting up a self-service data dashboard (an advanced business intelligence and data-visualization system) only to discover it doesn't give executives what they want. Some examples include:

* “We implemented a self-service dashboard environment, but nobody uses it.”

* “We have lots of data (we’re drowning in data now), but what we need are insights.”

* “We have people working on reporting, but now we are looking for ways to get more business impact from our data.”

The good news is that these problems are avoidable if you determine what you’re looking for, design each people analytics project to give you what you need, and communicate with others accordingly.

To simplify the enigma of people analytics possibilities, I offer three primary project considerations, each with two options.

Figuring Out What You Are After: Efficiency or Insight

Here’s the first question to ask in your people analytics project: What are you looking to achieve? In my experience, companies embark on people analytics projects primarily for two different reasons: (a) to increase the efficiency of answering many common questions (reporting), or (b) to answer new questions (developing unique insight).

Either objective can bring your company value, and both are important, but mixing the two in a single project is what creates disappointment. Figure 3-1 shows an overview of the two approaches, with the efficiency-oriented data project emphasizing systems design and the insight-oriented data project emphasizing analysis design. In the following sections, I talk about each one so that you can decide what your primary focus is for each project.

Figure 3-1: The steps in a project focused on efficiency look nothing like the steps in an insight-based project.

Efficiency

The classic example of an efficiency objective is that you want to use a system to automate reports that someone is already regularly producing with desktop software applications (like Excel, Tableau, or R). For example, you or someone you work with may spend 40 hours per month producing a regular update of Headcount, Headcount Growth Rate, Hire Rate, Exit Rate, Promotion Rate, and Time to Hire, which go into a slide deck for the executive management team. These represent just six of over one hundred possible HR metrics. The process behind the visualizations is cumbersome, time-consuming, and error-prone because of the human effort required to pull the necessary data out of systems, bring it together, perform calculations, and then turn the results into graphs. You can perform this effort only for a limited number of metrics and segments, not for all relevant metrics and parts of your company (say, by division, business unit, location, job, tenure, or gender). To increase the number of metrics and segments you can produce each month, you would have to hire more people to perform the work.
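
To make the manual effort concrete, here is a minimal sketch in R (one of the desktop tools named above) of how a few of these metrics might be calculated by hand each month. The counts and the specific metric definitions are illustrative assumptions, not the only valid ways to define them.

```r
# A minimal sketch of the manual metric calculation described above.
# The counts and definitions are hypothetical, illustrative assumptions.

headcount_start <- 1200   # headcount at start of month
headcount_end   <- 1230   # headcount at end of month
hires           <- 55     # hires during the month
exits           <- 25     # exits during the month

avg_headcount <- (headcount_start + headcount_end) / 2

# One common convention: express rates against average headcount
hire_rate             <- hires / avg_headcount
exit_rate             <- exits / avg_headcount
headcount_growth_rate <- (headcount_end - headcount_start) / headcount_start

round(c(hire_rate   = hire_rate,
        exit_rate   = exit_rate,
        growth_rate = headcount_growth_rate), 3)
```

Multiply this kind of calculation across dozens of metrics and every segment of the company, every month, and you can see why automating it becomes attractive.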

Most modern reporting systems can automate the actions required to produce hundreds of HR metrics and make them available to many audiences through a single visual dashboard that can be segmented or filtered so that each user gets just what they need (say, by division, business unit, location, job, tenure, or gender). If you are hoping to implement data architecture and systems to eliminate the hard work of getting the data you or others need, or at least to reduce that effort, this is an efficiency objective.

When you pursue an efficiency objective, your most significant choices relate to the data architecture and reporting systems you will use to replace human effort with a machine (automation).

Reporting systems evolve so quickly that even the most cutting-edge systems seem out of date within a few years. Once you begin down this path, you will likely feel pressure to overhaul regularly to keep up with fresh new features, but, frankly, today’s reporting systems serve the same fundamental objective they did 20 years ago: produce the metrics we need all the time with as little human effort as possible given the current state of technology. That is, their primary objective is efficiency; the rest is just the outfit you put on it.

If increasing the efficiency of your analytics efforts is of more immediate and pressing interest to you than discovering new insights, you have many resources for learning more. Books like Business Intelligence For Dummies (by Swain Scheps) and Data Warehousing For Dummies, 2nd Edition (by Thomas C. Hammergren), both from Wiley Publishing, can be quite helpful for understanding the high-level concepts and the choices you have to make.

Insight

Emphasis on problems and questions is a hallmark of an insight-oriented people analytics project. You start with a problem you want to work on and then use data to answer questions that you believe will help you better understand that problem.

When you are looking for new insight, your most significant efforts relate to defining the problem focus, the questions you want to answer, and the design of an analysis workflow that will offer some insight to resolve the matter. The best insight projects are rooted in the scientific method. You begin with ideas and collect the specific data you need to confirm or reject those ideas, as opposed to hoping to stumble upon an idea buried in data accidentally collected in some system designed for other reasons. Though you might set up systems to continue delivering a particular set of data over time, the problem you are analyzing dictates the data you collect and the systems you store it in, not the other way around. The scientific method directs your attention to the particular data that matters for that problem; as a result, it has many advantages over the alternatives. However, it requires a way of problem-solving that may be foreign to people not trained in science.

<Tip> The scientific method doesn’t necessarily require much, or any, technology. In some cases, though, statistical applications can be a big help. Minitab, R, SAS, SPSS, and Stata are some popular choices. In the right hands, they let you apply advanced statistical methods to tease meaning from the data you collect, increasing your certainty about the insights gained.
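
As a rough illustration of the kind of analysis such tools support, here is a minimal sketch in R of a logistic regression testing whether hypothesized drivers relate to employee exits. The data frame emp, its columns, and the simulated relationships are all assumptions made up for this example, not findings from any real study.

```r
# Sketch: does a hypothesized driver relate to exits? (logistic regression)
# The data frame `emp` and its columns are hypothetical; real data would
# come from your HRIS and surveys, not from simulation.

set.seed(42)
emp <- data.frame(
  pay_ratio        = runif(500, 0.8, 1.2),          # pay relative to market
  tenure_years     = rexp(500, rate = 1 / 4),       # years of service
  engagement_score = rnorm(500, mean = 3.5, sd = 0.7)
)

# Simulate exits so the example runs on its own; lower pay and engagement
# are assumed (for illustration only) to raise the odds of exit
logit <- -2 - 1.5 * (emp$pay_ratio - 1) - 0.8 * (emp$engagement_score - 3.5)
emp$exited <- rbinom(500, 1, plogis(logit))

# Fit the model and inspect which hypothesized drivers hold up
fit <- glm(exited ~ pay_ratio + tenure_years + engagement_score,
           data = emp, family = binomial())
summary(fit)
```

The point is not the specific model: it is that an insight project starts from a stated idea (these factors drive exits) and uses data to confirm or reject it.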

A people analytics initiative designed to achieve insight is better at addressing specific problems and supporting particular decisions than one designed purely for efficiency.

Having your cake and eating it too

You are interested in efficiency and insight for different reasons. Sometimes you are looking for answers to new questions, which may require new data and new approaches (developing unique insight), and other times you are just looking to improve the workflow you use to get answers to common questions that occur over and over again (developing efficiency).

By forcing you to choose, I have offered a simplified view of a complex continuum of options and decisions. The best data environment for people analytics meets both efficiency and insight needs. That is to say, the best data environment addresses standard reporting needs and can also be leveraged in investigative analysis to produce new insights. However, even with a data environment designed to do both, you cannot assume that implementing a reporting system will automatically accomplish both.

You also should not assume that projects designed to produce reporting efficiency need to be implemented before projects designed to create unique insight. If designing an efficiency-oriented reporting environment will take you years (and several million dollars in labor and software) to build, and you can do the analysis that will produce uniquely valuable insights without that architecture, then why wait? Performing work that others value may provide the justification you need to get agreement to invest in automating the repetitive actions in a system so that they can be performed more efficiently in the future. On the other hand, if the effort to produce a new insight only needs to be completed once, or fails altogether, then automation is unnecessary. For this reason, I promote the counterintuitive idea that it is better to start with insight projects.

Deciding on a Method of Planning

Another question to answer in your people analytics project is how you plan to manage it: using a waterfall approach or an agile approach.

Waterfall project management

Waterfall project management describes a method that is linear and sequential toward a known outcome. The plan includes every stage of development from the beginning, and those stages do not change. No step can start until the preceding stage completes. It’s the traditional project management approach.

Imagine a waterfall on the cliff of a steep embankment. Once the water flows over the edge of the embankment and begins its journey down the side, it’s too late to choose a different embankment to fall over. It’s the same situation with waterfall project management: Once you complete a phase of development, the development proceeds to the next stage, and there’s no turning back. Figure 3-2 illustrates the steps that can appear in a waterfall project plan.

Figure 3-2: Once the water (project) has fallen over a rock on the cliff (step in the project plan), there’s no going back.

In a waterfall project, you have to determine all the stakeholders’ needs correctly from the outset. Therefore, waterfall is best suited to scenarios where everyone already knows the solution, such as frequently used standardized reports. When you can determine precise project requirements upfront, the waterfall approach is excellent. If the standard reports and data visualizations that other companies have already implemented successfully suit your needs, then waterfall is a fine approach, if for no other reason than it is broadly understood and respected.

Agile project management

Agile project management describes a process in which you make a little progress, pause to evaluate the situation and adjust the plan, advance a little more, tweak some more, and so on. Each plan, design, build, and test iteration is known as a sprint.

If you know at the outset where you are going, mapping the path of the waterfall is obvious. However, if you don’t know exactly where you want to go, or if you know conditions will likely change along the way, the waterfall strategy makes you commit to too many design decisions up front.

For example, at one major technology company, I was intent upon studying employee commitment, defined as the likelihood that a given employee or group of employees would stay or leave within a one-year time frame. I developed an employee survey with the intent to understand what explains, drives, and predicts employee commitment by mathematically correlating employee responses to a series of independent items with how the same people responded to an intent-to-stay item (which I would eventually validate with a more precise analysis using actual exit data). The problem was that, at the outset, I could not possibly know what drives commitment, and therefore I could not know what should (and should not) be on the survey. Is it managers? Is it pay? Is it the employee’s reaction to the nature of the company’s mission? Is it a perception of the benefits program? Is it positive or negative interactions with work colleagues? Is it some element of culture? Is it the job design? Is it the organization design? Is it leadership? Depending on whose research I was reading or whom I spoke to, I would get a different answer. The list just kept getting longer and longer. Eventually, I concluded that I could not reasonably ask everything that might matter in one survey, and whatever questions I did include would imply that I already knew the answers to questions I clearly didn’t.

An agile project design permitted me to test several preliminary random-sample surveys, modifying the instrument between each run, until I had reached a reasonable conclusion about which questions would be best to include in a broader and more permanent survey effort. With each iteration of the survey, I removed items my data analysis showed were not useful and added new items to test in the space freed up by the last run. What made this project agile was the iterative design. At the outset, I did not know how many tries I would need to arrive at the ultimate plan. I kept going until I could reasonably conclude I had identified enough of the items that matter to explain, predict, and possibly control employee exit in the future.
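
As a rough illustration of the screening step between survey iterations, here is a simplified sketch in R that correlates each survey item with the intent-to-stay item and flags weak items as candidates to drop. The data frame, the item names, and the cutoff are hypothetical; this is not the actual survey or analysis from the project described above.

```r
# Sketch of item screening between survey iterations.
# The data frame `survey`, its columns, and the 0.10 cutoff are hypothetical.

set.seed(7)
n <- 300
survey <- data.frame(
  intent_to_stay = sample(1:5, n, replace = TRUE),  # the outcome item
  manager_trust  = sample(1:5, n, replace = TRUE),  # candidate driver items
  pay_fairness   = sample(1:5, n, replace = TRUE),
  mission_pride  = sample(1:5, n, replace = TRUE)
)

# Correlate every candidate item with the intent-to-stay item
item_cols <- setdiff(names(survey), "intent_to_stay")
cors <- sapply(item_cols, function(item) {
  cor(survey[[item]], survey$intent_to_stay, use = "pairwise.complete.obs")
})

# Rank the items, then flag weakly correlated ones as candidates to drop,
# freeing survey space to test new items in the next sprint
sort(cors, decreasing = TRUE)
names(cors)[abs(cors) < 0.10]
```

In a real iteration you would also weigh reliability, theory, and stakeholder input before dropping an item, not the correlation alone.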

<Remember> The problems with people can be uncertain, can appear as one thing but be another, or can start in one place and move to another. Each issue requires a different analysis and different data to solve it, so you don’t know precisely what additional data and analysis you will need until you get into the work. More importantly, your full understanding of what the real problem is may not develop until you progress well into a project.

The agile approach calls for breaking down projects into smaller pieces, releasing segments of the project as you finish them, and requiring closer collaboration between technical and nontechnical stakeholders. It is well suited to the investigative methods you use to gain insights. By employing more rapid deployment of features, you can evaluate the impact on the behavior and experience of users quickly, reevaluate where you are, and choose your next direction. Figure 3-3 shows a visualization of the iterations in an agile project.

Figure 3-3: In an agile project, you complete a segment and then “come up for air” before starting on the next sprint.

An agile project requires technical and nontechnical teammates to evaluate their work together as they move through the process. The shorter time intervals separating conversations between technical and nontechnical stakeholders create valuable learning opportunities for everyone. The agile approach works beautifully for learning: It recognizes the value of everyone, accepts early failures with a constructive attitude, and facilitates a means to work together to solve problems.

<Remember> An agile approach requires you to accept an imperfect or incomplete outcome temporarily; however, over the long term, it can get you where you need to go faster and at a lower risk of large-scale failure than the waterfall approach. You don’t want your systems people to labor for a year before you find out that you didn’t need those features or that a report just wasn’t as useful as you thought it would be. Agile helps you find this out sooner rather than later.

Choosing a Mode of Operation

One characteristic of people analytics that I need to talk about is whether you have a centralized team that takes responsibility for the projects or whether you spread this effort out beneath different functional umbrellas. One of the questions I hear most frequently is this: If people analytics is a new job, where should the person or persons in this new job report?

Some companies have a team of people analytics professionals focused on and embedded in each business division, such as sales, engineering, and operations. Others have people analytics professionals embedded in and supporting the different sub-functions of Human Resources (talent acquisition, compensation, benefits, employee relations, diversity, learning and development, and so on). Still others have a central team of people analytics professionals focused on the whole company in one group. Figure 3-4 illustrates how people analytics teams might all report to the same place or each report to its own division or HR center of excellence head.

Figure 3-4: A centralized approach versus a distributed approach.

<Tip> If you’re in a smaller company, you likely have fewer options simply because you have fewer resources. Your company might not even have enough roles to make a dedicated team of specialized analysts make sense. Engaging local college professors, grad students, consultants, or analysts in other parts of the company (marketing, finance, IT, and so on) to fill in the skills gaps is always an option.

A company that starts with a loosely formed group might decide to centralize the function and invest in more dedicated resources as people analytics proves its value to the company over time. Both approaches have pros and cons, of course, and the distributed approach offers benefits even if you can afford many dedicated resources.

Centralized

Centralizing the people analytics function lets a company reduce the inefficient redundancies of the distributed mode.

Leaders of companies that centralize the effort tend to hire people with advanced degrees in niche analytical skill sets (superhero nerds, you might say). Focused analysts have the benefit of full-time dedication to a clear objective. They get to work with a team that was hand-picked to bring all the skills to the table.

A centralized people analytics team has a perspective that spans the entire company. Though a team dedicated to a single business division or HR center of excellence (a team dedicated to a separate area of HR, such as recruiting, training, or employee relations) can focus tightly, it might miss company-wide opportunities or data sources that a centralized team would have the perspective to recognize.

<Tip> A considerable benefit of centralization is the ability to characterize the whole organization to the board or CEO, as well as each business unit or HR function. You get a “full view” of the organization; this is impossible for a distributed team to produce, since a distributed team typically will not have access to other functions’ data or facts. In addition to not being able to speak to those functions or facts directly, distributed analysts also risk losing the point of reference necessary to understand what they see in the function they support.

However, the downside of centralization is that it moves the people who generate the people analytics data further away from the people who make decisions based on the data. Among the groups that the people analytics team supports, this separation can create the sense that the people analytics team doesn’t understand their needs.

Many times, a business division or HR center of excellence hires its own analyst or creates its own team when it isn’t receiving the support it needs from the company-wide centralized team. This team is sometimes called a shadow team. The term shadow may sound cynical, and a shadow team is indeed widely believed to be an inefficient use of resources. On the other hand, it’s hard to fault group members for investing their resources to meet their own needs.

Centralizing the team also risks overwhelming your company’s people analysts when multiple parts of the company come asking for support at the same time. Word spreads, and a centralized team can find a line of groups wanting their share of help. The centralized team has to prioritize the requests, which can create dissatisfied internal customers who feel that the centralized people analytics team is disconnected and bureaucratic and not doing enough to support them.

I realize that merely pointing out the conflict doesn’t help you much. I’ll write later about how to navigate this conflict using an overarching analytical framework I invented called Net Active Value (NAV). Keep following me.

Distributed

A distributed structure is an inclusive model: Anyone and everyone in the company can potentially contribute to a people analytics project. Sometimes people call the distributed model embedded analytics because the tasks are embedded directly into each employee’s job.

With the responsibility for people analytics distributed, there is no question that the people doing the work understand the problem they’re working on because it’s their problem. The team leaders are more likely to act on the information from the analysis because they completed the study themselves. They can understand and trust their insight.

This hyperfocus on sub-system problems can be a pitfall as well. For example, the Recruiting team might not know about or consider the data that the Compensation team uses, even if it’s relevant to the problems of recruiting. Even an untrained analyst might imagine a theory that market pay could be related to hiring (just a theory). At the same time, the analysts on the Compensation team might not think about the benefit other groups could receive from their data, because they have enough to do without adding responsibilities.

The most obvious potential problem with distributed people analytics is that the people with the skills you need might not have the time or willingness to take on another genre of tasks.

<Remember> Even with its drawbacks, the distributed approach is growing in popularity. Increasingly, even large companies are finding that a rigid, top-down, centralized people analytics organization is not tenable. However, identifying and organizing a truly distributed network of expertise that is capable of advanced analytics can take years. If your company hasn’t been hiring people with analytical skill sets into all areas of HR all along, it’s facing an uphill battle.

A consensus is starting to form that the best architecture is one that blends centralized analysts with a more substantial investment from distributed stakeholders. It’s the “we’re-all-in-this-together” approach. You can’t expect everyone to be able to run advanced statistics, but you can expect them to be more conscious of the range of possibilities, to be more analytical, and to use data when making important decisions.

Making your way

Too often, companies try to copy others’ success and call it “best practices.” Just because Acme Company did something and saw revenue increase doesn’t mean that its strategy will work for you. Sometimes comparisons are compelling, but doing the same thing someone else did will never be a business differentiator.

I have been in a drawn-out war against the idea of best practices for a long time. Implementing best practices is copying what someone else did because it worked for them, without consideration for the unique nature of your own business and situation. I’d rather we call best practices what they are: guess practices. Blindly copying is not a long-term strategy.

Yes, you can learn from other companies. The most important thing to learn is that the successful ones defined what would make people analytics work by looking at their own goals, resources, and culture. Start small if you have to, and look for opportunities to adapt and grow. Your business will benefit more from an analytical, data-driven approach to your own problems and decisions than it will from the serial application of best/guess practices.

This is an excerpt from the book People Analytics for Dummies, published by Wiley, written by me.

Don't judge a book by its cover. More on People Analytics For Dummies here

I have moved the growing list of pre-publication writing samples here: Index of People Analytics for Dummies sample chapters on PeopleAnalyst.com

You will find many differences between these samples and the physical copy of the book; notably, my posts lack the excellent editing, finish, and binding applied by the print publisher. If you find these samples interesting and think the book sounds useful, please buy a copy, or two, or twenty-four.
