Rachel Feintzeig and Vanessa Fuhrmans report in the Wall Street Journal:
More companies are scouring job candidates’ online personas for racist and other red-flag comments. That hasn’t kept social-media trails from morphing into hiring minefields.
The New York Times has become the latest employer to grapple with a public furor after announcing last week it hired journalist Sarah Jeong as a technology writer for its editorial board. Soon after, tweets she had posted between 2013 and 2015 disparaging white people—in one instance, using the hashtag #cancelwhitepeople—resurfaced and a social-media outcry ensued.
Defending its hire, the Times said in a written statement that it knew about Ms. Jeong’s tweets before hiring her and that “she understands that this type of rhetoric is not acceptable at The Times.” On Twitter, Ms. Jeong said she regretted the posts, which she said had been aimed at online harassers, not a general audience.
Last month, Walt Disney Co. cut ties with “Guardians of the Galaxy” director James Gunn after years-old, inflammatory tweets of his resurfaced. Mr. Gunn said that the comments were “wildly insensitive” and “don’t reflect the person I am today.” In recent weeks, three Major League Baseball players apologized for unearthed racist and antigay tweets written during their high-school days.
With job recruits’ social-media histories readily available, more employers are trying to head off or prepare for such controversies, especially with high-profile hires. In a 2017 survey of more than 2,300 hiring managers and human-resources executives by jobs website CareerBuilder, 70% said they screened candidates’ social-media histories—up from 60% the previous year. One-third said they had found discriminatory comments that caused them not to hire someone.
Yet social-media screening remains one of the murkiest aspects of the hiring process, according to experts in employment law and human resources. Both too little and too much scouring present legal and reputational pitfalls, they say. And though many employers have firm policies on whether to test for drug use or conduct criminal-record checks, fewer have consistent guidelines on how they vet and assess prospective employees’ online histories.
“It’s really all across the board,” said Jason Hanold, whose executive-search firm Hanold Associates specializes in recruiting human-resources executives. “And it’s often determined by the proclivities of the individual” in charge.
Whereas the Times said it had discussed Ms. Jeong’s social-media history with her during the hiring process, the newspaper said it hadn’t been aware of some old, inflammatory tweets posted by journalist and essayist Quinn Norton before hiring her to its editorial board in February. They included the use of racial slurs and referred to her friendship with a neo-Nazi. Hours after a social-media storm erupted over her hiring announcement, the Times and Ms. Norton said she would no longer join the company. After the episode with Ms. Norton, the Times stepped up its efforts to review the social-media histories of its hires, a person familiar with the matter said.
In an emailed response to The Wall Street Journal on Sunday, Ms. Norton said that in stripping the tweets of their context, online critics had wrongly cast what had been intended as antiracist remarks as the opposite. She said that screening people’s social-media histories wouldn’t necessarily catch remarks that could be distorted and inflamed by crowds on the internet.
Companies hiring talent abroad run the risk of violating digital privacy laws, such as the European Union’s new General Data Protection Regulation, which employers are still trying to suss out, said Laurie Ruettimann, a human-resources consultant. And she said that hiring managers poring over applicants’ Facebook pages and tweets could easily learn other details—such as a prospect’s religion, disability or pregnancy—that could bias hiring decisions and that, by law, can’t be taken into account.
“I am incredibly hesitant…to recommend that anyone go on Google and judge anyone for anything because there’s no consistent standard,” said Ms. Ruettimann, who suggests employers stick to traditional third-party background checks that don’t include social-media searches.
She recommends that individuals who have posted offensive content online not bring up the issue with a potential employer. Sharing more positive content on sites that are likely to get traction on search engines can help, she said. “Start contributing in a way that’s healthy.”
Minnesota-based employment lawyer Kate Bischoff recommends job seekers delete offensive comments in the hopes they don’t come up.
Still, once something is online it can live forever, not least because other people may have saved and reposted it. That also suggests it could be better for employees and job applicants to be upfront about the past.
Ms. Bischoff advises her corporate clients to direct human-resources employees not involved in the hiring decision to screen for inflammatory or polarizing social-media comments. That way, the direct hiring managers aren’t exposed to other information that could bias them. If they do find a troubling tweet or other material, human-resources staffers usually ask the applicant to explain the matter.
“I’m more concerned about those issues being a problem if we didn’t look at it,” said Ms. Bischoff. In some states, she added, ignoring a public history of, say, racist tweets could legally expose an employer if that new hire, in turn, discriminated against minorities.
“If you bring that risk into organizations, you could be liable for it,” said Ms. Bischoff, who added that she has helped clients fire five people for racist tweets over the past year.
A more common issue to come up in social-media screenings these days is highly politicized rants that risk alienating fellow employees or clients, Mr. Hanold said. “Employers tend to avoid that like the plague,” he said.
Some companies are turning to software companies such as Fama Technologies Inc., which uses an algorithm to sift through applicants’ or employees’ public social-media posts. So far this year, Fama says, it has screened more than 10 million pieces of online content for corporate clients. Of the people screened, 10% had content that raised flags for bigotry, racism or hate speech, while 14% had flags for potential misogyny or sexism.
“Companies are starting to wake up to the fact that this risk is real,” said Fama’s chief executive and co-founder, Ben Mones. “There isn’t a question like, ‘Are you racist?’ on a job application. Most people who are racist don’t think they’re racist.”