Recruiting
January 28, 2020

Talent Curation 101: Selecting Candidates for Fit, Not Familiarity

As John Finn pointed out in his piece for Screen Rant, most people don’t understand search engines: “To really highlight the problem, a recent survey found only 37% of people had a clear understanding of how Google Search works. In addition, 54% said they placed more trust in the website ranked first in the results than the others.” The problem is that those results may have been purchased or engineered. The Internet is a bit of a popularity contest involving SEO, links, paid ad placements, and monitored activity levels. But instead of presenting content objectively, is the Web simply promoting the prevailing opinions of the moment? In the staffing industry, where “talent curation” has become one of the latest catchphrases, it’s critical that we understand how to apply the practice properly, so that we’re curating talent, not biases.

Scrolling, Trolling, Searching, or Researching?

Have you ever scoured the Internet to conduct research on something you don’t like? Two years ago, I did just that. There’s a particular band I can’t stomach. And I share this opinion freely online. But here’s where the irony of life seeps into the digital world. Google’s algorithms, astonishing as they are, ostensibly deem me one of the band’s biggest fans, merely because I post critiques about them or search for unflattering parodies of their biggest hit. Why? Well, it’s a problem with content curation and the individualization of information. It’s a problem with how search engines actually rank for authority and relevance. Searching for something shouldn’t presume adoration, but the Internet hasn’t quite mastered that lesson yet.

Whenever I hear a radio DJ cueing up this song I loathe, a surge of that fight-or-flight instinct consumes me. You know, the one that protects us from danger or the spirit-crushing torments of dark, uneasy things whispered only in purgatory. I’m not saying this band is a terrible, awful, stupefyingly dismal tribe of musicians. It just is to me. And I’m not saying their inexplicably beloved hit is a soulless, harrowing, agonizing assassin of joy. It just is to me. One person’s trash, they say, is another person’s treasure.

But to Google, this band is my jam. The more it appears in my search queries, albeit derisively, the more Google’s automated Assistant promotes the group in my “personalized” results. 

The concept of developing catered, curated content for a specific target audience is nothing new. Corporate-owned media have played in this space for decades, pushing the stories that most appeal to a publication’s core demographic, based on data. During the early history of the Web, search engines like Google began aggregating news from a variety of sources. Then, as the process evolved, they started delivering more tailored articles to individual users.

For people who want to validate deeply held beliefs or firmly entrenched opinions, this practice is likely a welcome boon. However, for those who want a 360-degree view of events to formulate a more objective perspective, personalization becomes problematic.

Floating Within the Filter Bubble

Back in 2011, Eli Pariser, CEO of Upworthy and former executive director of MoveOn.org, published “The Filter Bubble: What the Internet Is Hiding from You.” The book exposed how search engines feed users content suited to their preferences. Daniel Terdiman summed up the situation in his review of the work for CNET:

Pariser explains the dynamic we all face online today: that no two people’s Web searches, even on the same topics, return the same results. That’s because search engines and other sites are basing what they send back on our previous searches, the sites we visit, ads we click on, preferences we indicate, and much more. Not to mention the fact that we are more and more shielded from viewpoints counter to our own.
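
To make that dynamic concrete, here is a deliberately oversimplified sketch, in Python, of how engagement signals can skew a ranking. The function, data, and scoring formula are illustrative assumptions, not how Google or any real search engine works; the point is only that past behavior boosts similar content, whether the engagement was admiring or hostile.

```python
# A hypothetical, minimal model of personalized re-ranking.
# Assumption: each result has a topic and a base relevance score,
# and the user's history is just a list of topics they engaged with.
from collections import Counter

def personalize(results, user_history):
    """Re-rank results by base relevance plus past engagement with the topic."""
    interest = Counter(user_history)  # topic -> number of past searches/clicks

    def score(result):
        _title, topic, base_relevance = result
        # Engagement boosts the score regardless of sentiment:
        # a sarcastic search counts the same as a fan's search.
        return base_relevance + interest[topic]

    return sorted(results, key=score, reverse=True)

results = [
    ("Critical review of the band", "band_x", 2.0),
    ("Unrelated science story", "science", 3.0),
    ("Band announces new tour", "band_x", 1.5),
]
history = ["band_x", "band_x", "band_x"]  # three derisive searches later...
for title, _, _ in personalize(results, history):
    print(title)
```

After enough mocking queries, the band outranks everything else, which is exactly the feedback loop described above.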

Another concern involves the so-called democratization of information: trending search results automatically rank higher and appear more prominently. However, this approach opens the door to fake news and bogus content when enough publishers produce it. In 2016, Guardian journalist Carole Cadwalladr typed “are Jews” into Google’s search bar. Its predictive system returned disturbing recommendations.

It offered me a choice of potential questions it thought I might want to ask: “are jews a race?”, “are jews white?”, “are jews christians?”, and finally, “are jews evil?”
Are Jews evil? It’s not a question I’ve ever thought of asking. I hadn’t gone looking for it. But there it was. I press enter. A page of results appears. This was Google’s question. And this was Google’s answer: Jews are evil. Because there, on my screen, was the proof: an entire page of results, nine out of 10 of which “confirm” this. The top result, from a site called Listovative, has the headline: “Top 10 Major Reasons Why People Hate Jews.” I click on it: “Jews today have taken over marketing, militia, medicinal, technological, media, industrial, cinema challenges etc and continue to face the worlds [sic] envy through unexplained success stories given their inglorious past and vermin like repression all over Europe.”

Of course, we can’t ignore the human element. Facebook’s Trending Topics platform and a few web content filtering solutions have used people to bolster the capabilities of digital algorithms. In the latter case, the ACLU discovered that personal bias negatively affected the way content was parsed. Through its “Don’t Filter Me” campaign, the civil rights group found that certain software providers were blocking all references to the LGBTQ community as offensive content, even when the sites provided positive and educational information.

Facebook confronted backlash for similar reasons, as the New York Times reported in 2016:

The Silicon Valley company faces allegations of intentionally suppressing conservative news from appearing on Trending Topics. In a rough-and-tumble presidential election year in which social media is playing an increasingly large role, some Republican leaders say they have lost trust in Facebook’s ability to maintain impartiality as a communication and news platform.

This type of curation is, to Eli Pariser’s point, excluding content that may run contrary to our preferred worldview. But in the absence of a well-rounded and comprehensive library of information, we can’t genuinely develop a robust worldview.

Effective Curation Requires Sweeping, Differing Inputs

Diversity is the solution to these issues. We need differing perspectives to make sound decisions. Too often in this industry, we see that talent curation is little more than attempting to play into a hiring manager’s particular preferences. Recruiters may be delivering the candidates that a client wants, but not necessarily the people they need. Here are some ways to better curate the resumes you’re receiving.

Blind Resumes

Why not follow in the footsteps of industry leaders such as Deloitte, Household Bank, KPMG, and many government agencies? How does it work? All contact and personal information is removed. These details, especially when they hint at age, gender, culture, hobbies, and other attributes, can trigger unconscious biases in the minds of reviewers. A blind resume includes only skills, objectives, work experience, and education. Truly blind resumes even edit the education section to display only academic data, such as degrees achieved and honors awarded. Removing the name of the university or institution can go far in preventing bias.
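
As a rough illustration, here is a minimal sketch, in Python, of what blinding a candidate record might look like. The field names and schema are hypothetical assumptions for the example; a real applicant tracking system would have its own data model and far more careful handling.

```python
# Hypothetical sketch of resume "blinding": drop fields that can cue bias,
# keep only job-relevant data, and strip institution names from education.
BLIND_FIELDS = {"name", "email", "phone", "address", "photo",
                "date_of_birth", "hobbies"}  # assumed identifying fields

def blind_resume(resume: dict) -> dict:
    """Return a copy of the resume with identifying details removed."""
    blinded = {k: v for k, v in resume.items() if k not in BLIND_FIELDS}
    # For a truly blind resume, keep degrees and honors but not the school.
    blinded["education"] = [
        {"degree": entry["degree"], "honors": entry.get("honors")}
        for entry in resume.get("education", [])
    ]
    return blinded

candidate = {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "skills": ["Python", "SQL"],
    "work_experience": ["5 years as a data analyst"],
    "education": [{"degree": "BSc Statistics", "honors": "cum laude",
                   "institution": "Example University"}],
}
print(blind_resume(candidate))  # skills, experience, degree -- no name, no school
```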

Marketplace Models

An open marketplace encourages anyone to apply and helps remove intrinsic bias. Rather than scrutinizing a worker’s background, this model gets to the heart of what matters most: finding talent who perform and produce results at the highest levels. Oftentimes, we discover that what a worker may lack in terms of established skills or longevity, he or she makes up for through motivation, a willingness to learn, a desire to succeed, and a drive to overachieve.

Redefine Cultural Fit

Placing workers in environments that complement their values, support needs, work-life goals, and ongoing development leads to success. When focusing on fit, emphasize characteristics that demonstrate alignment: how a worker’s aspirations and potential contributions mesh with the prevailing mission and values of the company.

Interview the Interviewers

Consult with clients to work out a series of questions that various hiring managers would ask candidates. This gives you the opportunity to identify and weed out biases within the group, standardize the questions and create relevant evaluation criteria. This process ensures that interviewers are on the same page in determining what an ideal candidate looks like. More importantly, this strategy helps you formalize a set of checks and balances against bias.
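
To show what that standardization can look like, here is a small illustrative sketch of a shared question set and scoring rubric in Python. The questions, criteria, and 1-5 scale are assumptions made for the example, not prescribed values.

```python
# Hypothetical standardized interview kit: every interviewer asks the same
# questions and scores the same criteria, so candidates are compared on
# consistent evidence rather than individual impressions.
STANDARD_QUESTIONS = [
    "Describe a project where you had to learn a new skill quickly.",
    "Walk us through a decision you made with incomplete information.",
]

CRITERIA = ["problem_solving", "communication", "alignment_with_mission"]

def score_candidate(ratings_by_interviewer):
    """Average each criterion across interviewers (ratings on a 1-5 scale)."""
    return {
        criterion: sum(r[criterion] for r in ratings_by_interviewer)
                   / len(ratings_by_interviewer)
        for criterion in CRITERIA
    }

ratings = [
    {"problem_solving": 4, "communication": 5, "alignment_with_mission": 3},
    {"problem_solving": 5, "communication": 4, "alignment_with_mission": 4},
]
print(score_candidate(ratings))
```

Comparing averaged scores against the same criteria also makes it easier to spot an interviewer whose ratings consistently diverge from the group, which is where hidden bias tends to show up.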

Curation Takes a Village

We naturally gravitate toward like-minded individuals. We love the feeling of having our viewpoints validated. But that should never come at the cost of objectivity, truth, and diverse opinions. Otherwise, we’re only having our biases validated. Matching candidates to business cultures is an essential technique in modern hiring models. And it requires thoughtful introspection, examination, and formulating questions that will enlighten candidates, hiring managers, and recruiters in the process. Yet there exists a dark side to curation: feeding into desires that could lead to ethnically, ideologically, racially, and sexually similar cultures. An environment that propels a status quo ultimately stagnates within these limitations. Fortunately, we have the know-how and resources to curate an equitable hiring process that will expose us to new ideas and innovations.


Image by PIRO4D from Pixabay 

