Performance Data Based Analytics

How can you do big HR data analytics when you eliminate performance ratings?

A well-known people analytics expert posted this question on LinkedIn a week ago.  The discussion that followed was fascinating.  The question took my breath away for a couple of reasons: (1) performance ratings are one of the most bias-ridden data points on any employee, so it makes me queasy to think such data would be relied upon heavily in an analytics tool, and (2) so much more data exists that can provide better insights.

Let’s take the biased data first.  While I am no Marcus Buckingham devotee, his piece for Harvard Business Review back in February 2015 encapsulates the idea that HR data – particularly performance data – is bad data.  By and large, managers rate employee performance on how they themselves would perform the job, not on the actual performance of the individual employee.  If, as Mr. Buckingham points out, 61% of a rating is really a rating of the rater, then ratings aren’t that useful in making people decisions.  (In a post for another time: performance management doesn’t even need ratings.)  Bad data makes for bad analytics results.

This is furthered by oodles of studies showing that unconscious bias is baked into performance data.  There is no escaping it.  The unconscious bias of managers – and this is not to shame managers; we all have bias – is active in performance reviews and the data they produce.  If women and minorities are rated more harshly because of that bias, then an analytics tool that relies on performance data inherits the bias in those harsher critiques and could perpetuate or exacerbate discrimination.  This can occur even when we remove demographic data, because other fields act as proxies for it.  Who wants biased data in their analytics tool?

Next, the idea that HR analytics must rely on performance data misses the plethora of other data that we could use to make people decisions.  Here are but a few examples:

  • Great leaders have connections throughout companies. We can find out who has developed a network within a company by reviewing email connections, social media connections, and other network analysis.
  • Insider threats sometimes begin with employees emailing company information to personal email accounts. Monitoring access (authorized and unauthorized) along with other retention analysis can help identify who could be stealing trade secrets or confidential information to take to a competitor.
  • We can better predict how to scale hiring and what skill sets are needed based upon productivity and sales projections.
  • Things like weather, productivity, date, and time can all be factors in safety incidents. If we analyzed these items, we could develop a work schedule that reduces worker injuries.
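As a toy illustration of the first bullet, here is a minimal sketch of what network analysis over email connections might look like. The email log, names, and the use of simple degree centrality are all my own assumptions for illustration; a real tool would pull from mail-server metadata and use richer graph measures.

```python
# Hypothetical email metadata: (sender, recipient) pairs.
# In practice this would come from mail-server logs.
email_log = [
    ("ana", "ben"), ("ana", "chen"), ("ana", "dev"),
    ("ben", "ana"), ("chen", "dev"), ("dev", "ana"),
    ("ana", "eli"), ("eli", "ana"),
]

# Count distinct contacts per employee (undirected degree centrality).
contacts = {}
for sender, recipient in email_log:
    contacts.setdefault(sender, set()).add(recipient)
    contacts.setdefault(recipient, set()).add(sender)

degree = {person: len(peers) for person, peers in contacts.items()}

# The employee with the most distinct connections surfaces as a
# candidate "well-networked" leader.
most_connected = max(degree, key=degree.get)
```

With the invented log above, "ana" surfaces with four distinct contacts. The point is only that connection data, unlike performance ratings, is observed behavior rather than a manager's judgment.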

These examples show employers could do better without touching the bad and biased performance data.  If we didn’t include it, would that be the end of people analytics tools?  I emphatically answer “no.”  We could do even better without it.

 

Image by Markus Spiske available at unsplash.com


Those Pesky Correlations

Last week, the ABA held its Section of Labor and Employment Law National Symposium on Technology in Labor and Employment Law.  This is one of my favorite conferences, because I get to geek out with some of the most forward-thinking employment lawyers in the country.  This year was no different.

This year, we had two separate sessions on people analytics.  Analytics has been a hot topic for HR for the last five years (at least), and many employment attorneys are trying to play catch-up.  Using algorithms, artificial intelligence, and at times machine learning, analytics crunches data (employer, employee, social media, and/or public data) to find correlations that assist employers in making decisions.  These decisions can range from finding the best candidate to unmasking the employee stealing trade secrets.  Analytics are really, really cool!  But beware, problems exist.

A big one is discrimination.  We’ve known for a while now that analytics can be discriminatory.  We’ve seen discriminatory results from analytics in the criminal justice system, in advertising, and in many other areas.  Because we’ve seen discrimination elsewhere, it could happen in people analytics too.  If it does, how will the law handle it?  Will a judge review an analytics case the way she would a facially neutral policy that had a discriminatory result when applied – a disparate impact analysis?  Or will she review the employment decision on an individual basis, as disparate treatment?  As people analytics develops, employment lawyers find themselves divided on how the law will deal with analytics that result in discrimination.

On one side are data scientists and a few management-side attorneys.  They reason that when the analytics draws a correlation, that correlation is statistically strong, meaning it has a strong relationship to the job or job duties.  For example, suppose coders who visit certain manga sites are better coders than those who don’t.  The statistics show this, so the logic goes that you should only hire coders who visit manga sites.  But what if black coders don’t visit manga sites, and your analytics tool is now weeding them out?  That certainly looks like discrimination.  The data scientists’ response is that the statistics support the link between good coding and manga sites, making the criterion job-related and a business necessity under Title VII.  People analytics vendors love this.
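To make the screening-out concern concrete: one common rule of thumb for spotting adverse impact is the EEOC’s four-fifths rule, under which a group’s selection rate below 80% of the highest group’s rate is a red flag. Here is a minimal sketch with invented numbers for a manga-site screen; the group labels and counts are hypothetical.

```python
# Hypothetical outcomes of a hiring screen that favors coders who
# visit manga sites.  Applicant and selection counts are invented
# purely for illustration.
groups = {
    "group_a": {"applicants": 100, "selected": 60},
    "group_b": {"applicants": 100, "selected": 30},
}

# Selection rate per group.
rates = {g: d["selected"] / d["applicants"] for g, d in groups.items()}
highest = max(rates.values())

# Four-fifths (80%) rule of thumb: flag any group whose selection
# rate falls below 80% of the highest group's rate.
impact_ratio = {g: r / highest for g, r in rates.items()}
flagged = [g for g, ratio in impact_ratio.items() if ratio < 0.8]
```

With these invented numbers, group_b is selected at half the rate of group_a and gets flagged. The statistics may be strong, but the impact on a protected group is exactly what the legal fight is about.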

On the other side are some attorneys (and some industrial psychologists) who believe that statistics alone will not be sufficient to prevail under the law – an employer must show more than just the statistics to satisfy Title VII’s job-related and business-necessity requirements. Professor Pauline Kim of the Washington University in St. Louis School of Law argues that the correlations need to be both statistically valid and “substantively meaningful.”  She argues (and I agree) that there needs to be some connection to the job that’s more than just math.  If a coder is coding for a manga site, then a criterion that the coder visit manga sites makes sense – it’s substantively meaningful.  If the coder is coding for a workplace software company, it wouldn’t.  This adds a “smell test” to the statistics that a jury can understand and hold on to.  People analytics vendors don’t love this as much, because it means they would have to validate their tools using more than just statistics.

The debate at the conference was lively.  We just don’t know what will happen and which theory will prevail.  The EEOC is certainly paying close attention to people analytics.  Last October, the agency held a public meeting on the subject and heard from many different stakeholders.  Acting Chair Victoria Lipnic is very interested in where analytics is headed.  So am I.