“We should take a big step back and identify the business issue” – Interview with Alec Levenson
Alec Levenson is a senior research scientist at the Center for Effective Organizations, a highly regarded academic thinker in people analytics, and the author of multiple books in the field.
His action research and consulting work with companies optimizes job and organizational performance and HR systems through the application of organization design, job design, human capital analytics, and strategic talent management.
In anticipation of Tucana’s People Analytics World conference (London, 11 – 12 April), we had the opportunity to ask Alec some questions about his work in the people analytics space.
Q: Hi Alec, thanks for doing this interview with us. I’ve always been fascinated by your unique perspective on different HR issues. Let’s start with a simple question: What common analytics misconceptions would you like to bust?
Thanks Erik. I appreciate the opportunity to weigh in on the current trends in people analytics. I see two related issues.
The first is that the development of people analytics in organizations is a paradox. It seems like tremendous progress is being made, if you take at face value all the conferences, articles, books, blogs and social media chatter. And indeed there has been real progress in building the infrastructure for doing analytics in organizations, and the skill sets of the people leading the charge. Yet the depth of insights is not increasing at the same pace.
According to a very recent study by DDI, the Conference Board, and EY, the reported success rate of analytics projects in a global survey of companies has actually decreased compared with three years ago. We could pick apart and quibble with their methodology, but I see no need to, because their findings are exactly what I feared yet expected would happen, based on what I see taking place in organizations almost everywhere.
Oftentimes, there are quite meaningful insights that can be gained from simple analyses that can be done in a spreadsheet without fancy software, complicated databases or an advanced degree in statistics.
The problem is that it’s easy for people to fall in love with fancy statistical routines, large-scale databases (even calling them “data lakes,” which sounds even more impressive), and whiz-bang techniques like social network analysis and data visualization.
“There often is an inverse correlation between fancy statistics and business insights.”
Don’t get me wrong: in the right situation those high-powered analyses are exactly what should be done. And I personally have spent years, going all the way back to graduate school, infatuated with the power of data and analytics. But they don’t take the place of being smart about the business. And, even worse, there often is an inverse correlation between fancy statistics and business insights.
Q: So, we do need to keep it simple, stupid, in analytics? In your opinion, what’s the challenge, and how do we solve it?
The reason I say this is because some of the best social scientists and data scientists I know, whether working in companies, as consultants, or as researchers, are the last people I would look to for insights on what makes a business succeed or fail. Their skill sets are ideally designed for deep analysis of people issues and HR processes.
They are world-class at understanding how to carefully measure the impact of an HR program (training, leadership development, etc.), or validating the effectiveness of a recruiting or promotion process, or designing a robust survey, or coming up with a legally defensible competency profile.
All of those insights are very helpful for HR to do what it does slightly better. But they don’t answer the challenge Dave Ulrich issued over two decades ago, for HR to become truly strategic. Building world-class people analytics was never the answer to the question: how does HR get a permanent seat at the adults’ table of leadership decision making in organizations? It can help provide insights in some areas but is not a panacea.
The reason I know this is because my colleagues at the Center for Effective Organizations and I have seen direct evidence of this time and again over the four decades the Center has been in existence. The most insightful analyses sometimes use fancy statistics, but more often do not.
A really good survey is often the source of deep insights, but only when coupled with in-depth stakeholder interviews that identify root causes of bad leadership decisions, poor alignment across functions and business units in the organization, cultural biases that put road blocks in the way of strategy execution, inability to read market signals and drive change in the right direction, and so on.
Over the past two decades I have had the privilege of working on some really deep investigations into the root causes of organizational performance, and employee motivation and behavior, and I have partnered with a small number of truly brilliant organizational diagnosticians.
What these org diagnosticians (my name for them, not what they would call themselves) and I do when we’re rooting around, looking for the true root causes of org performance, looks quite different from the current cutting edge in people analytics.
Often there is a lot of number crunching, and maybe even a regression analysis; but more often the stats involved are quite simple, like constructing a table of averages from survey data or archival records.
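To make that concrete, here is a minimal sketch of such a table of averages, assuming a hypothetical survey file with one row per respondent; the file name and column names (business_unit, engagement, intent_to_stay) are placeholders for illustration, and a spreadsheet pivot table would do the same job.

```python
# Minimal sketch of a "table of averages" from survey data using pandas.
# The file name and column names below are hypothetical placeholders;
# a spreadsheet pivot table would produce the same result.
import pandas as pd

survey = pd.read_csv("engagement_survey.csv")  # one row per respondent

# Average scores by business unit: plain descriptive statistics,
# no regression or machine learning required.
summary = (
    survey.groupby("business_unit")[["engagement", "intent_to_stay"]]
    .mean()
    .round(2)
    .sort_values("engagement")
)
print(summary)
```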
There may be a survey, but if so it’s always designed from scratch – the insights available from the annual employee engagement survey are never sufficient. And the one constant, in every single case, is in-depth interviews of key stakeholders. That is the one type of analysis no one in all the writing, blogging, and public speaking about people analytics talks about, which leaves me totally dumbfounded.
I know that the toughest organizational challenges can only be understood through those stakeholder interviews. Yet rather than train our analytics leaders and frontline troops to be really good at those kinds of diagnostics – which have much more in common with traditional organizational development than statistics and data crunching – we instead applaud fancy data visualizations and social network analysis.
Which is just not right. It’s not either/or, it’s both/and. We’ve totally dropped the ball on the “and” in this case.
Q: You’ve been involved in many people analytics projects, including one for a large Canadian financial institution. What key elements are needed for organizations to establish a people analytics function?
A lot of people have written on this, including my esteemed colleague Alexis Fink, who calls out four key types of expertise: content, data, analytics, and influencing. Note that only the second and third are traditionally associated with people analytics.
Content refers to knowledge about the business and its complexity, and what needs to be measured and how to measure it. Influencing refers to org development and related types of skills needed to influence decision making in order to drive change. I agree with Alexis 100% on these knowledge domains.
Those four areas of expertise are foundational: any people analytics function has to have all of them for any hope of being effective across a broad range of questions and activities. Yet they are not enough to ensure strategic contribution. For that another key skill or perspective is needed, and this is something Alexis and I are working on right now.
One name for it is systems thinking but that name tends to turn people off because it evokes complicated ways of thinking about the organization, along the lines of Lewin’s force field organizational model, Katz and Kahn’s open systems theory, Nadler and Tushman’s congruence model, and Peter Senge’s learning organization.
What Alexis and I are working on is a way to identify the essence of those systems approaches, and make them more accessible to people working on the frontlines of people analytics.
Our current up-to-the-minute thinking is that the operating model of the entire enterprise (organization) may be the best way to focus people’s attention on the diagnostics that need to take place, and the models and data needed for those diagnostics.
The operating model, if specified correctly and deeply enough, can provide a powerful guide for prioritizing elements of the strategy, investment decisions, the organization structure, along with decision rights and business processes, all the way down to the team and job levels. We are currently working on how to conceptualize the key elements of the operating model deeply enough without making it overly complicated, which is a delicate balance.
So I guess my answer regarding the missing ingredient for a truly effective people analytics function is less something that exists in the team in terms of specific skills, and more an orientation toward the kinds of questions that are addressed and how to prioritize requests that come from the business.
I and many others have been saying for years that people analytics has to start with the business questions. But having seen the challenges of figuring out which business questions are more important to address, especially when the ease of answering them varies widely, I am now firmly convinced people analytics needs to be guided by a much more comprehensive, holistic view of the enterprise using something like the operating model.
That’s the kind of approach people like Alexis and I have used with great success in our decades of working with organizations, and we look forward to developing the content that will help guide others on a similar journey.
We’ll be rolling out the insights as we develop them over the coming months and beyond, and we look forward to engaging with the people analytics community to refine and improve on them.
Q: A colleague in the field regularly asks me playfully: did you already find a fix for performance management? We all know that performance ratings are one of the least valid and reliable metrics but still analysts try to relate HR processes to employee performance ratings. How can we fix these ratings?
The short answer is: there is no fix, because individual level measures of performance are inherently biased on account of being subjective. They are subjective because it’s impossible to accurately measure individual contribution to group output when the tasks each person in the group does are interdependent with each other.
“There is no fix [for performance ratings], because individual level measures of performance are inherently biased on account of being subjective.”
This is the flip side of the adage “the whole is greater than the sum of its parts.” The parts are the individual roles or jobs on the team or in the function or business unit. The whole is the output of the team, function or business unit.
It is literally impossible to measure the synergies among roles in such a way that responsibility for creating the synergies can be assigned to a single person. That, fundamentally, is why individual performance ratings are not valid or reliable, and never will be.
Q: Is there a better way to measure performance? Or, what other commonly available data would you use to gauge employee performance?
The answer is to lift your head out of being buried in the sand of individual performance, and instead focus on the level at which performance is delivered: the team/group, function, site/geography, business unit, whatever. Taking that perspective is one part of what I referred to earlier as taking a systems approach. You look for the “complete whole” that defines a tangible output: either a complete product or service bought by customers, or identifiable components of the product/service.
Examples of components include assembling a car engine; creating a new pharmaceutical compound; processing an insurance claim; handling a customer complaint through resolution; delivering the right truckload of items to be sold to a retail outlet at the right time; and so on.
In each of these cases, there is no one role that has sole responsibility for taking the process from beginning to end, and it’s impossible to assign specific responsibility to any one person for their contribution.
A big mantra in my book Strategic Analytics is that you have to start with the business issue to be addressed, and focus your analytic attention at the appropriate level of aggregation. That means starting at the highest level that is appropriate, whether that’s the entire enterprise, a business unit, a geography, a function, a site, or a team.
The first, and oftentimes only, analysis has to take place at that level to determine what’s driving the observed performance. This means that the best analytics often don’t ever bother with the individual level, and individual performance ratings never have to be relied on as a measure of anything.
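As an illustration of what starting at the right level of aggregation might look like in code, the sketch below rolls individual-level HR data up to the team level and relates it to a team-level outcome, without touching individual performance ratings. All file and column names (people_data.csv, team_outcomes.csv, team_id, claim_cycle_time_days, and so on) are hypothetical placeholders, not data from any specific project.

```python
# Hypothetical sketch: analyze performance at the level where it is delivered
# (here, the team) instead of relying on individual performance ratings.
# File and column names are illustrative placeholders.
import pandas as pd

people = pd.read_csv("people_data.csv")    # one row per employee
teams = pd.read_csv("team_outcomes.csv")   # one row per team, with output metrics

# Roll individual-level HR data up to the team level.
team_talent = (
    people.groupby("team_id")
    .agg(
        headcount=("employee_id", "count"),
        avg_tenure_years=("tenure_years", "mean"),
        voluntary_exits=("left_voluntarily", "sum"),
    )
    .reset_index()
)
team_talent["turnover_rate"] = (
    team_talent["voluntary_exits"] / team_talent["headcount"]
)

# Relate team-level talent indicators to a team-level outcome,
# e.g. how long the team takes to process a claim end to end.
merged = teams.merge(team_talent, on="team_id")
print(merged[["claim_cycle_time_days", "turnover_rate", "avg_tenure_years"]].corr())
```

The point of the sketch is the unit of analysis, not the statistics: the outcome lives at the team level, so the talent data is aggregated to match it.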
Q: What would you advise an analytics practitioner who wants to start with people analytics and really wants to add strategic value?
Follow the steps in my Strategic Analytics book. Start with the real business issue at hand. If you’ve been presented with a narrow question such as “how do we improve retention,” “how can we maximize employee engagement,” or “how do we improve the efficacy of an HR program,” you should never take the question at face value.
The first step is to take a big step back and identify the business issue at the foundation of the question. The second step is to start the analysis at whatever the appropriate level is to address the business issue (see my comments earlier about the appropriate level of aggregation for doing the first round of analysis).
“The first step is to take a big step back and identify the business issue at the foundation of the question.”
If you follow these steps you may very well end up back at the original question, but you will have the necessary context to put it in perspective.
The reason these steps are needed is because the question is never as simple as it seems. Improving retention is not simply about reducing turnover. The foundational question that’s most relevant for the business is ensuring the right people are in role doing the right things at the right time.
If turnover is high, one way to improve the talent supply problem is to improve retention. But that’s not the only way, and some approaches that improve retention could end up hurting performance if the issue is viewed only narrowly as a turnover problem.
You have to look at the business process in which the high-turnover role is embedded, and analyze all the possible ways performance could be improved, with reducing turnover only one of potentially many. The same kind of reasoning applies for questions about employee engagement, HR program effectiveness, and most other questions that will come your way as a people analytics professional.
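As a rough, hypothetical illustration of that reasoning, the sketch below treats turnover as only one of several candidate drivers of a process-level outcome and compares their simple correlations as a first pass. Every file and column name here (process_by_site.csv, order_fulfillment_days, open_roles_pct, and so on) is made up for the example.

```python
# Illustrative sketch: treat turnover as one of several possible drivers of a
# process-level outcome, rather than as the question itself.
# All file and column names are hypothetical placeholders.
import pandas as pd

process = pd.read_csv("process_by_site.csv")  # one row per site per month

outcome = "order_fulfillment_days"
candidate_drivers = ["turnover_rate", "open_roles_pct", "training_hours", "error_rate"]

# First pass: how strongly does each candidate driver move with the
# process outcome the business actually cares about?
correlations = (
    process[candidate_drivers + [outcome]]
    .corr()[outcome]
    .drop(outcome)
    .sort_values(key=abs, ascending=False)
)
print(correlations)
```

A correlation table like this is only a starting point; the deeper diagnostic work described above is what turns it into an answer about where to intervene.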
Q: Can you give an example of an organization you’ve worked with that does really well? Can you quantify the results in terms of, for instance, cost savings, extra revenue, or customer satisfaction?
No, because I have not seen any organization that systematically takes a systems perspective (no pun intended).
I have seen a number of cases of specific analyses that are deep and systematic, which follow the path that Alexis and I are talking about when we say the focus needs to be on the operating model. But those cases are the exceptions that prove the rule: even in those companies, the vast majority of people analytics are not systematic enough to meet the standard of providing real strategic insights.
Quantifying the results of a specific analytics project is always a key outcome, and if benefits can be shown in terms of cost savings, extra revenue, or customer satisfaction, then at face value that would appear to be enough to prove the worth of the insights.
However, having done this work for two decades, I always worry when I see quantified outcomes that some part of the bigger picture is missing. This gets back to my earlier point about the real business issue at hand. If your company’s strategy is to be the low-cost provider, or if the part of the organization you are analyzing is not a source of competitive advantage, then cost savings may be a perfectly appropriate outcome measure.
But cost savings can be achieved by inappropriately cutting back where spending should actually be maintained or even increased. For example, recruiting costs can be reduced by lowering the hiring standard, which leads to accepting more applicants even though they are not as qualified; as a result, productivity and profits can ultimately suffer.
There are similar potential issues with the other two measures. Increased revenue isn’t always good if the extra sales are inconsistent with the strategy. This can happen when salespeople push extra product out the door at the wrong price simply to make quarterly quotas, or convince customers to buy something they will later regret, which hurts the company’s long-term relationship with them.
Increased customer satisfaction for one subset of customers can be attained by giving away too many things for free, by cutting prices too far, or by overusing internal resources that need to be dedicated to other customers, which sets the company up for subsequent failure with that other subset of customers.
This is why a larger systems perspective is so critical. I can’t really evaluate the worth of a particular outcome such as cost savings, increased revenue, or better customer satisfaction in a vacuum. There is a larger organizational context for any metric, and you have to make sure what you’re measuring is consistent with the organization’s strategic, financial and operational objectives – which can only happen if you take a systems view when doing the analytics.
Alec will give a masterclass on next-generation analytics that uses systems diagnostics to improve work design and performance in London on April 10th, prior to People Analytics World.