When folks hear about what pymetrics does*, there are generally two reactions:


  1. “Oh! That is so cool! What an awesome concept, I can’t wait to try it!” These people are obviously highly intelligent, enlightened individuals who probably all have great taste in movies and spend their free time saving puppies and volunteering.
  2. “Uh...that sounds really scary. Are robots going to take over ALL OUR JOBS??” While it is quite flattering that people immediately assume pymetrics will take over the world, it’s a rather paranoid view of how data can be used to solve very real problems.

There is a constant struggle to find the balance between data-driven decisions and human oversight and involvement.

Google, a company famous for its data analytics prowess, approached people management and hiring in a very Google-like way: with algorithms. The article "Google came up with a formula for deciding who gets promoted—here’s what happened" (a title worthy of Buzzfeed click-bait) is intriguing, and its conclusion is something we all need to take to heart:

“Setty’s takeaway is that people need to make people decisions. The People Analytics group exists to arm decision makers with better information, not to replace them with algorithms.”

Data is awesome and fun when it can be applied to problems that flummox humans: producing product recommendations for customers, honing marketing and advertising strategies, even aiding medical diagnoses. But if algorithms are so capable, how much should humans stay involved? Algorithms may be more accurate and effective on average, but what happens when data is used to make decisions with lasting impacts on someone’s life, like bank lending practices or insurance policy approvals? Then things get a little scarier. Even an algorithm that is more accurate overall can, with a single misstep, cause huge problems for an individual. People are understandably unhappy about being denied a loan or a policy because an algorithm deemed them high risk.

Solutions do exist: human oversight can temper an algorithm’s conclusions, and adjustments can be made in favor of the people affected in these situations.

But the bottom line is this: technology is agnostic. It’s neither wicked nor altruistic; it is the way that technology is implemented that makes a difference.


*JUST IN CASE you don't know what pymetrics does: we are a neuroscience-based self-assessment and career discovery platform. We use neuroscience games to determine your trait profile, which is then used to recommend careers that are compatible with your cognitive and emotional traits. Cool, right? Check us out at www.pymetrics.com!