That said, the degree to which Google manipulates search results to suit its own ends is disturbing. JL
Kirsten Grind and colleagues report in the Wall Street Journal:
Google has increasingly re-engineered and interfered with search results to a far greater degree than the company and its executives have acknowledged. Those actions often come in response to pressure from businesses, outside interest groups and governments. They have increased since the 2016 election and the rise of online misinformation. Google engineers regularly make behind-the-scenes adjustments to other information the company is increasingly layering on top of its basic search results. Google made algorithmic changes to its search results that favor big businesses over smaller ones, and in at least one case made changes on behalf of a major advertiser.
Every minute, an estimated 3.8 million queries are typed into Google, prompting its algorithms to spit out results for hotel rates or breast-cancer treatments or the latest news about President Trump.
They are arguably the most powerful lines of computer code in the global economy, controlling how much of the world accesses information found on the internet, and the starting point for billions of dollars of commerce.
Twenty years ago, Google’s founders began building a goliath on the premise that its search algorithms could do a better job combing the web for useful information than humans. Google executives have said repeatedly—in private meetings with outside groups and in congressional testimony—that the algorithms are objective and essentially autonomous, unsullied by human biases or business considerations.
The company states in a Google blog, “We do not use human curation to collect or arrange the results on a page.” It says it can’t divulge details about how the algorithms work because the company is involved in a long-running and high-stakes battle with those who want to profit by gaming the system.
But that message often clashes with what happens behind the scenes. Over time, Google has increasingly re-engineered and interfered with search results to a far greater degree than the company and its executives have acknowledged, a Wall Street Journal investigation has found.
Those actions often come in response to pressure from businesses, outside interest groups and governments around the world. They have increased sharply since the 2016 election and the rise of online misinformation, the Journal found.
Google’s evolving approach marks a shift from its founding philosophy of “organizing the world’s information,” to one that is far more active in deciding how that information should appear.
More than 100 interviews and the Journal’s own testing of Google’s search results reveal:
• Google made algorithmic changes to its search results that favor big businesses over smaller ones, and in at least one case made changes on behalf of a major advertiser, eBay Inc., contrary to its public position that it never takes that type of action. The company also boosts some major websites, such as Amazon.com Inc. and Facebook Inc., according to people familiar with the matter.
• Google engineers regularly make behind-the-scenes adjustments to other information the company is increasingly layering on top of its basic search results. These features include auto-complete suggestions, boxes called “knowledge panels” and “featured snippets,” and news results, which aren’t subject to the same company policies limiting what engineers can remove or change.
• Despite publicly denying doing so, Google keeps blacklists to remove certain sites or prevent others from surfacing in certain types of results. These moves are separate from those that block sites as required by U.S. or foreign law, such as those featuring child abuse or with copyright infringement, and from changes designed to demote spam sites, which attempt to game the system to appear higher in results.
• In auto-complete, the feature that predicts search terms as the user types a query, Google’s engineers have created algorithms and blacklists to weed out more-incendiary suggestions for controversial subjects, such as abortion or immigration, in effect filtering out inflammatory results on high-profile topics.
• Google employees and executives, including co-founders Larry Page and Sergey Brin, have disagreed over whether and how much to intervene in search results. Employees can push for revisions in specific search results, including on topics such as vaccinations and autism.
• To evaluate its search results, Google employs thousands of low-paid contractors whose purpose, the company says, is to assess the quality of the algorithms’ rankings. Even so, Google gave these workers feedback conveying what it considered to be the correct ranking of results, and they revised their assessments accordingly, according to contractors interviewed by the Journal. The contractors’ collective evaluations are then used to adjust algorithms.
THE JOURNAL’S FINDINGS undercut one of Google’s core defenses against global regulators worried about how it wields its immense power—that the company doesn’t exert editorial control over what it shows users. Regulators’ areas of concern include anticompetitive practices, political bias and online misinformation.
Far from being autonomous computer programs oblivious to outside pressure, Google’s algorithms are subject to regular tinkering from executives and engineers who are trying to deliver relevant search results, while also pleasing a wide variety of powerful interests and driving its parent company’s more than $30 billion in annual profit. Google is now the most highly trafficked website in the world, and it holds more than 90% of the search-engine market. The market capitalization of its parent, Alphabet Inc., is more than $900 billion.
Google made more than 3,200 changes to its algorithms in 2018, up from more than 2,400 in 2017 and from about 500 in 2010, according to Google and a person familiar with the matter. Google said 15% of queries today are for words, or combinations of words, that the company has never seen before, putting more demands on engineers to make sure the algorithms deliver useful results.
A Google spokeswoman disputed the Journal’s conclusions, saying, “We do today what we have done all along, provide relevant results from the most reliable sources available.”
Lara Levin, the spokeswoman, said the company is transparent in its guidelines for evaluators and in what it designs the algorithms to do.
AS PART OF ITS EXAMINATION, the Journal tested Google’s search results over several weeks this summer and compared them with results from two competing search engines, Microsoft Corp.’s Bing and DuckDuckGo, a privacy-focused company that builds its results from syndicated feeds from other companies, including Verizon Communications Inc.’s Yahoo search engine.
The testing showed wide discrepancies in how Google handled auto-complete queries and some of what Google calls organic search results—the list of websites that Google says are algorithmically sorted by relevance in response to a user’s query.
Ms. Levin, the Google spokeswoman, declined to comment on specific results of the Journal’s testing. In general, she said, “Our systems aim to provide relevant results from authoritative sources,” adding that organic search results alone “are not representative of the information made accessible via search.”
The Journal tested the auto-complete feature, which Google says draws from its vast database of search information to predict what a user intends to type, as well as data such as a user’s location and search history. The testing showed the extent to which Google doesn’t offer certain suggestions compared with other search engines.
When the Journal typed “Joe Biden is” or “Donald Trump is” into the search box, Google’s auto-complete offered predicted language that was more innocuous than what the other search engines suggested. Similar differences appeared for other presidential candidates the Journal tested.
The Journal also tested several search terms in auto-complete such as “immigrants are” and “abortion is.” Google’s predicted searches were less inflammatory than those of the other engines.
Gabriel Weinberg, DuckDuckGo’s chief executive, said that for certain words or phrases entered into the search box, such as ones that might be offensive, DuckDuckGo has decided to block all of its auto-complete suggestions, which it licenses from Yahoo. He said that type of block wasn’t triggered in the Journal’s searches for Donald Trump or Joe Biden.
A spokeswoman for Yahoo operator Verizon Media said, “We are committed to delivering a safe and trustworthy search experience to our users and partners, and we work diligently to ensure that search suggestions within Yahoo Search reflect that commitment.”
Said a Microsoft spokeswoman: “We work to ensure that our search results are as relevant, balanced, and trustworthy as possible, and in general, our rule is to minimize interference with the normal algorithmic operation.”
In other areas of the Journal analysis, Google’s results in organic search and news for a number of hot-button terms and politicians’ names showed prominent representation of both conservative and liberal news outlets.
ALGORITHMS ARE effectively recipes in code form, providing step-by-step instructions for how computers should solve certain problems. They drive not just the internet, but the apps that populate phones and tablets.
Algorithms determine which friends show up in a Facebook user’s news feed, which Twitter posts are most likely to go viral and how much an Uber ride should cost during rush hour as opposed to the middle of the night. They are used by banks to screen loan applications, businesses to look for the best job applicants and insurers to determine a person’s expected lifespan.
In the beginning, their power was rarely questioned. At Google in particular, its innovative algorithms ranked web content in a way that was groundbreaking, and hugely lucrative. The company aimed to make the web useful while relying on the assumption that code alone could do the heavy lifting of figuring out how to rank information.
But bad actors are increasingly trying to manipulate search results, businesses are trying to game the system and misinformation is rampant across tech platforms. Google found itself facing a version of the pressures on Facebook, which long said it was just connecting people but has been forced to more aggressively police content on its platform.
A 2016 internal investigation at Google showed between a 10th of a percent and a quarter of a percent of search queries were returning misinformation of some kind, according to one Google executive who works on search. It was a small number percentage-wise, but given the huge volume of Google searches it would amount to nearly two billion searches a year.
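A back-of-the-envelope check, using the 3.8 million queries per minute cited above, bears out the figure; annualizing that rate is our own arithmetic, not the Journal’s:

```python
# Back-of-the-envelope check of the "nearly two billion" figure,
# using the roughly 3.8 million queries per minute cited earlier.
queries_per_minute = 3.8e6
queries_per_year = queries_per_minute * 60 * 24 * 365  # about 2.0e12

# The internal investigation put misinformation at 0.1% to 0.25% of queries.
low, high = 0.001, 0.0025
print(f"annual queries:      {queries_per_year:.2e}")
print(f"affected (low end):  {queries_per_year * low:.2e}")   # ~2.0e9, "nearly two billion"
print(f"affected (high end): {queries_per_year * high:.2e}")  # ~5.0e9
```

At the low end of the internal estimate, that works out to roughly two billion queries a year, matching the figure the executive described.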
By comparison, Facebook faced congressional scrutiny for Russian misinformation that was viewed by 126 million users.
Google’s Ms. Levin said the number includes not just misinformation but also a “wide range of other content defined as lowest quality.” She disputed the Journal’s estimate of the number of searches that were affected. The company doesn’t disclose metrics on Google searches.
Google assembled a small SWAT team to work on the problem that became known internally as “Project Owl.” Borrowing from the strategy used earlier to fight spam, engineers worked to emphasize factors on a page that are proxies for “authoritativeness,” effectively pushing down pages that don’t display those attributes.
Other tech platforms, including Facebook, have taken a more aggressive approach, manually removing problem content and devising rules around what they define as misinformation. Google, for its part, has said that because its role is “indexing” content rather than “hosting” it, as Facebook does, it shouldn’t take a more active role.
One Google search executive described the problem of defining misinformation as incredibly hard, and said the company didn’t want to go down the path of figuring it out.
Around the time Google started addressing issues such as misinformation, it started fielding even more complaints, to the point where human interference became more routine, according to people familiar with the matter, putting it in the position of arbitrating some of society’s most complicated issues. Some changes to search results might be considered reasonable—boosting trusted websites like the National Suicide Prevention Lifeline, for example—but Google has made little disclosure about when changes are made, or why.
Businesses, lawmakers and advertisers are worried about fairness and competition within the markets where Google is a leading player, and as a result its operations are coming under heavy scrutiny.
The U.S. Justice Department earlier this year opened an antitrust probe, in which Google’s search policies and practices are expected to be areas of focus. Google executives have twice been called to testify before Congress in the past year over concerns about political bias. In the European Union, Google has been fined more than $9 billion in the past three years for anticompetitive practices, including allegedly using its search engine to favor its own products.
In response, Google has said it faces tough competition in a dynamic tech sector, and that its behavior is aimed at helping create choice for consumers, not hurting rivals. The company is currently appealing the decisions against it in the EU, and it has denied claims of political bias.
GOOGLE RARELY RELEASES detailed information on algorithm changes, and its moves have bedeviled companies and interest groups, who feel they are operating at the tech giant’s whim.
In one change hotly contested within Google, engineers opted to tilt results to favor prominent businesses over smaller ones, based on the argument that customers were more likely to get what they wanted at larger outlets. One effect of the change was a boost to Amazon’s products, even if the items had been discontinued, according to people familiar with the matter.
The issue came up repeatedly over the years at meetings in which Google search executives discuss algorithm changes. Each time, they chose not to reverse the change, according to a person familiar with the matter.
Google engineers said it is widely acknowledged within the company that search is a zero-sum game: A change that helps lift one result inevitably pushes down another, often with considerable impact on the businesses involved.
Ms. Levin said there is no guidance in Google’s rater guidelines that suggests big sites are inherently more authoritative than small sites. “It’s inaccurate to suggest we did not address issues like discontinued products appearing high up in results,” she added.
Many of the changes within Google have coincided with its gradual evolution from a company with an engineering-focused, almost academic culture into an advertising behemoth and one of the most profitable companies in the world. Advertising revenue—which includes ads on search as well as on other products such as maps and YouTube—was $116.3 billion last year.
Some very big advertisers received direct advice on how to improve their organic search results, a perk not available to businesses with no contacts at Google, according to people familiar with the matter. In some cases, that help included sending in search engineers to explain a problem, they said.
“If they have an [algorithm] update, our teams may get on the phone with them and they will go through it,” said Jeremy Cornfeldt, the chief executive of the Americas of Dentsu Inc.’s iProspect, which Mr. Cornfeldt said is one of Google’s largest advertising agency clients. He said the agency doesn’t get information Google wouldn’t share publicly. Among others it can disclose, iProspect represents Levi Strauss & Co., Alcon Inc. and Wolverine World Wide Inc.
One former executive at a Fortune 500 company that received such advice said Google frequently adjusts how it crawls the web and ranks pages to deal with specific big websites.
Google updates its index of some sites such as Facebook and Amazon more frequently, a move that helps them appear more often in search results, according to a person familiar with the matter.
“There’s this idea that the search algorithm is all neutral and goes out and combs the web and comes back and shows what it found, and that’s total BS,” the former executive said. “Google deals with special cases all the time.”
Ms. Levin, the Google spokeswoman, said the search team’s practice is to not provide specialized guidance to website owners. She also said that faster indexing of a site isn’t a guarantee that it will rank higher. “We prioritize issues based on impact, not any commercial relationships,” she said.
Online marketplace eBay had long relied on Google for as much as a third of its internet traffic. In 2014, traffic suddenly plummeted—contributing to a $200 million hit in its revenue guidance for that year.
Google told the company it had made a decision to lower the ranking of a large number of eBay pages that were a big source of traffic.
EBay executives debated pulling their quarterly advertising spending of around $30 million from Google to protest, but ultimately decided to step up lobbying pressure on Google, with employees and executives calling and meeting with search engineers, according to people familiar with the matter. A similar episode had hit traffic several years earlier, and eBay had marshaled its lobbying might to persuade Google to give it advice about how to fix the problem, even relying on a former Google staffer who was then employed at eBay to work his contacts, according to one of those people.
This time, Google ultimately agreed to improve the ranking of a number of pages it had demoted while eBay completed a broader revision of its website to make the pages more “useful and relevant,” the people said. The revision was arduous and costly to complete, one of the people said, adding that eBay was later hit by other downrankings that Google didn’t help with.
“We’ve experienced significant and consistent drops in Google SEO for many years, which has been disproportionally detrimental to those small businesses that we support,” an eBay spokesman said. SEO, or search-engine optimization, is the practice of trying to generate more search-engine traffic for a website.
Google’s Ms. Levin declined to comment on eBay.
Companies without eBay’s clout had different experiences.
Dan Baxter can remember the exact moment his website, DealCatcher, was caught in a Google algorithm change. It was 6 p.m. on Sunday, Feb. 17. Mr. Baxter, who founded the Wilmington, Del., coupon website 20 years ago, got a call from one of his 12 employees the next morning.
“Have you looked at our traffic?” the worker asked, frantically, Mr. Baxter recalled. It was suddenly down 93% for no apparent reason. That Saturday, DealCatcher saw about 31,000 visitors from Google. Now it was posting about 2,400. It had disappeared almost entirely on Google search.
Mr. Baxter said he didn’t know whom to contact at Google, so he hired a consultant to help him identify what might have happened. The expert reached out directly to a contact at Google but never heard back. Mr. Baxter tried posting to a YouTube forum hosted by a Google “webmaster” to ask if it might have been a technical problem, but the webmaster seemed to shoot down that idea.
One month to the day after his traffic disappeared, it inexplicably came back, and he still doesn’t know why.
“You’re kind of just left in the dark, and that’s the scary part of the whole thing,” said Mr. Baxter.
Google’s Ms. Levin declined to comment on DealCatcher.
(The Wall Street Journal is owned by News Corp, which has complained publicly about Google’s moves to play down news sites that charge for subscriptions. Google ended the policy after intensive lobbying by News Corp and other paywalled publishers. More recently, News Corp has called for an “algorithm review board” to oversee Google, Facebook and other tech giants. News Corp has a commercial agreement to supply news through Facebook, and Dow Jones & Co., publisher of The Wall Street Journal, has a commercial agreement to supply news through Apple services. Google’s Ms. Levin and News Corp declined to comment.)
GOOGLE IN RECENT months has made additional efforts to clarify how its services operate by updating general information on its site. At the end of October it posted a new video titled “How Google Search Works.”
Jonathan Zittrain, a Harvard Law School professor and faculty director of the Berkman Klein Center for Internet & Society, said Google has poorly defined how often or when it intervenes on search results. The company’s argument that it can’t reveal those details because it is fighting spam “seems nuts,” said Mr. Zittrain.
“That argument may have made sense 10 or 15 years ago but not anymore,” he said. “That’s called ‘security through obscurity,’ ” a reference to the now-unfashionable engineering idea that systems can be made more secure by restricting information about how they operate.
Google’s Ms. Levin said “extreme transparency has historically proven to empower bad actors in a way that hurts our users and website owners who play by the rules.”
“Building a service like this means making tens of thousands of really, really complicated human decisions, and that’s not what people think,” said John Bowers, a research associate at the Berkman Klein Center.
On one extreme, those decisions at Google are made by the world’s most accomplished and highest-paid engineers, whose job is to turn the dials within millions of lines of complex code. On the other is an army of more than 10,000 contract workers, who work from home and get paid by the hour to evaluate search results.
The rankings supplied by the contractors, who work from a Google manual that runs to hundreds of pages, can indirectly move a site higher or lower in results, according to people familiar with the matter. And their collective responses are measured by Google executives and used to affect the search algorithms.
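Google doesn’t disclose how those collective responses are pooled. The sketch below is a purely hypothetical illustration, with all data, names and the averaging scheme invented, of how per-rater scores could be aggregated into labels used to compare algorithm versions offline:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical rater submissions: (query, url, quality score).
# All entries and the 0-to-1 scale are invented for illustration.
ratings = [
    ("best zoo", "https://example-zoo.org", 0.9),
    ("best zoo", "https://example-zoo.org", 0.8),
    ("best zoo", "https://spam-zoo.example", 0.2),
]

# Pool each (query, url) pair's scores into one aggregate label.
pooled = defaultdict(list)
for query, url, score in ratings:
    pooled[(query, url)].append(score)

labels = {key: mean(scores) for key, scores in pooled.items()}

# These aggregate labels would be used offline to judge whether a new
# version of a ranking algorithm beats the old one, not to rescore
# individual pages live.
for (query, url), label in labels.items():
    print(f"{query!r} -> {url}: {label:.2f}")
```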
One of those evaluators was Zack Langley, now a 27-year-old logistics manager at a tour company in New Orleans. Mr. Langley got a one-year contract in the spring of 2016 evaluating Google’s search results through Lionbridge Technologies Inc., one of several companies Google and other tech platforms use for contract work.
During his time as a contractor, Mr. Langley said he never had any contact with anyone at Google, nor was he told what his results would be used for. Like all of Google’s evaluators, he signed a nondisclosure agreement. He made $13.50 an hour and worked up to 20 hours a week from home.
Sometimes working in his pajamas, Mr. Langley was given hundreds of real search results and told to use his judgment to rate them according to quality, reputation and usefulness, among other factors.
At one point, Mr. Langley said he was unhappy with the search results for “best way to kill myself,” which were turning up links that were like “how-to” manuals. He said he down-ranked all the other results for suicide until the National Suicide Prevention Lifeline was the No. 1 result.
Soon after, Mr. Langley said, Google sent a note through Lionbridge saying the hotline should be ranked as the top result across all searches related to suicide, so that the collective rankings of the evaluators would adjust the algorithms to deliver that result. He said he never learned if his actions had anything to do with the change.
Mr. Langley said it seemed like Google wanted him to change content on search so Google would have what he called plausible deniability about making those decisions. He said contractors would get notes from Lionbridge that he believed came from Google telling them the “correct” results on other searches.
He said that in late 2016, as the election approached, Google officials got more involved in dictating the best results, although not necessarily on issues related to the campaign. “They used to have a hands-off approach, and then it seemed to change,” he said.
Ms. Levin, the Google spokeswoman, said the company “long ago evolved our approach to collecting feedback on these types of queries, which help us develop algorithmic solutions and features in this area.” She added: “We provide updates to our rater guidelines to ensure all raters are following the same general framework.”
Lionbridge didn’t reply to requests for comment.
AT GOOGLE, EMPLOYEES routinely use the company’s internal message boards as well as a form called “go/bad” to push for changes in specific search results. (Go/bad is a reporting system meant to allow Google staff to point out problematic search results.)
One of the first hot-button issues surfaced in 2015, according to people familiar with the matter, when some employees complained that a search for “how do vaccines cause autism” delivered misinformation through sites that oppose vaccinations.
At least one employee defended the result, writing that Google should “let the algorithms decide” what shows up, according to one person familiar with the matter. Instead, the people said, Google made a change so that the first result is a site called howdovaccinescauseautism.com—which states on its home page in large black letters, “They f—ing don’t.” (The phrase has become a meme within Google.)
Google’s Ms. Levin declined to comment.
In the fall of 2018, the conservative news site Breitbart News Network posted a leaked video of Google executives, including Mr. Brin and Google CEO Sundar Pichai, upset and addressing staffers following President Trump’s election two years earlier. A group of Google employees noticed the video was appearing on the 12th page of search results when Googling “leaked Google video Trump,” which made it seem like Google was burying it. They complained on one of the company’s internal message boards, according to people familiar with the matter. Shortly after, the leaked video began appearing higher in search results.
“When we receive reports of our product not behaving as people might expect, we investigate to see if there’s any useful insight to inform future improvements,” said Ms. Levin.
FROM GOOGLE’S FOUNDING, Messrs. Page and Brin knew that ranking webpages was a matter of opinion. “The importance of a Web page is an inherently subjective matter, which depends on the [readers’] interests, knowledge and attitudes,” they wrote in their 1998 paper introducing the PageRank algorithm, the founding system that launched the search engine.
PageRank, they wrote, would measure the level of human interest and attention, but it would do so “objectively and mechanically.” They contended that the system would mathematically measure the relevance of a site by the number of times other relevant sites linked to it on the web.
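The formula published in their 1998 paper makes the “objectively and mechanically” claim concrete: a page’s score is a damped sum over the pages linking to it, with each contribution divided by the linking page’s count of outbound links:

```latex
% PageRank as defined in Page and Brin's 1998 paper. T_1, ..., T_n are
% the pages that link to page A, C(T) is the number of links going out
% of page T, and d is a damping factor, typically set to 0.85.
PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)
```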
Today, PageRank has been updated and subsumed into more than 200 different algorithms, attuned to hundreds of signals, now used by Google. (The company replaced PageRank in 2005 with a newer version that could better keep up with the vast traffic that the site was attracting. Internally, it was called “PageRankNG,” ostensibly named for “next generation,” according to people familiar with the matter. In public, the company still points to PageRank—and on its website links to the original algorithm published by Messrs. Page and Brin—in explaining how search works. “The original insight and notion of using link patterns is something that we still use in our systems,” said Ms. Levin.)
By the early 2000s, spammers were overwhelming Google’s algorithms with tactics that made their sites appear more popular than they were, skewing search results. Messrs. Page and Brin disagreed over how to tackle the problem.
Mr. Brin argued against human intervention, contending that Google should deliver the most accurate results as delivered by the algorithms, and that the algorithms should be tweaked only in the most extreme cases. Mr. Page countered that the user experience was getting damaged when users encountered spam rather than useful results, according to people familiar with the matter.
Google already had been taking what the company calls “manual actions” against specific websites that were abusing the algorithm. In that process, Google engineers demote a website’s ranking by changing its specific “weighting.” For example, if a website is artificially boosted by paying other websites to link to it, a behavior that Google frowns upon, Google engineers could turn down the dial on that specific weighting. The company could also blacklist a website, or remove it altogether.
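Google’s actual weightings are secret; the following is a purely illustrative sketch of the mechanic described here, in which ranking is a weighted sum of signals and a manual action “turns down the dial” on one signal for a flagged site. All signal names and numbers are invented:

```python
# Hypothetical sketch of a "manual action": ranking is modeled as a
# weighted sum of signals, and engineers dial down one signal's weight
# for a site caught abusing it. Everything here is invented.
SIGNAL_WEIGHTS = {"links": 0.5, "freshness": 0.3, "relevance": 0.2}

# A site caught buying links gets that signal's weight suppressed.
MANUAL_ACTIONS = {"link-buyer.example": {"links": 0.05}}

def score(site: str, signals: dict[str, float]) -> float:
    overrides = MANUAL_ACTIONS.get(site, {})
    return sum(
        overrides.get(name, weight) * signals.get(name, 0.0)
        for name, weight in SIGNAL_WEIGHTS.items()
    )

honest = score("honest.example", {"links": 0.4, "freshness": 0.9, "relevance": 0.8})
abuser = score("link-buyer.example", {"links": 0.9, "freshness": 0.9, "relevance": 0.8})
print(honest, abuser)  # the abuser's inflated link signal barely counts
```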
Mr. Brin still opposed making large-scale efforts to fight spam, because it involved more human intervention. Mr. Brin, whose parents were Jewish émigrés from the former Soviet Union, even personally decided to allow anti-Semitic sites that were in the results for the query “Jew,” according to people familiar with the decision. Google posted a disclaimer with results for that query saying, “Our search results are generated completely objectively and are independent of the beliefs and preferences of those who work at Google.”
Finally, in 2004, in the bathroom one day at Google’s headquarters in Mountain View, Calif., Mr. Page approached Ben Gomes, one of Google’s early search executives, to express support for his efforts fighting spam. “Just do what you need to do,” said Mr. Page, according to a person familiar with the conversation. “Sergey is going to ruin this f—ing company.”
Ms. Levin, the Google spokeswoman, said Messrs. Page, Brin and Gomes declined to comment.
After that, the company revised its algorithms to fight spam and loosened rules for manual interventions, according to people familiar with the matter.
Google has guidelines for changing its ranking algorithms; proposed changes go through a grueling approval process known as the “launch committee.” Google executives have pointed to this process in a general way in congressional testimony when asked about algorithm changes.
The process is like defending a thesis, and the meetings can be contentious, according to people familiar with them.
In part because the process is laborious, some engineers aim to avoid it if they can, one of these people said, and small changes can sometimes get pushed through without the committee’s approval. Mr. Gomes is on the committee that decides whether to approve the changes, and other senior officials sometimes attend as well.
Google’s Ms. Levin said not every algorithm change is discussed in a meeting, but “there are other processes for reviewing more straightforward launches at different levels of the organization,” such as an email review. Those reviews still involve members of the launch committee, she said.
Today, Google discloses only a few of the factors being measured by its algorithms. Known ones include “freshness,” which gives preference to recently created content for searches relating to things such as breaking news or a sports event. Another is where a user is located—if a user searches for “zoo,” Google engineers want the algorithms to provide the best zoo in the user’s area. Language signals—how meanings change when words are used together, such as April and fools—are among the most important, as they help determine what a user is actually asking for.
Other important signals have included the length of time users would stay on pages they clicked on before clicking back to Google, according to a former Google employee. Long stays would boost a page’s ranking. Quick bounce backs, indicating a site wasn’t relevant, would severely hurt a ranking, the former employee said.
Over the years, Google’s database recording this user activity has become a competitive advantage, helping cement its position in the search market. Other search engines don’t have the vast quantity of data available to Google, the market leader.
That makes the impact of its operating decisions immense. When Pinterest Inc. filed to go public earlier this year, it said that “search engines, such as Google, may modify their algorithms and policies or enforce those policies in ways that are detrimental to us.” It added: “Our ability to appeal these actions is limited.” A spokeswoman for Pinterest declined to comment.
Search-engine optimization consultants have proliferated to try to decipher Google’s signals on behalf of large and small businesses. But even those experts said the algorithms remain borderline indecipherable. “It’s black magic,” said Glenn Gabe, an SEO expert who has spent years analyzing Google’s algorithms and tried to help DealCatcher find a solution to its drop in traffic earlier this year.
ALONG WITH ADVERTISEMENTS, Google’s own features now take up large amounts of space on the first page of results—with few obvious distinctions for users. These include news headlines and videos across the top, information panels along the side and “People also ask” boxes highlighting related questions.
Google engineers view the features as separate products from Google search, and there is less resistance to manually changing their content in response to outside requests, according to people familiar with the matter.
These features have become more prominent as Google attempts to keep users on its results page, where ads are placed, instead of losing the users as they click through to other sites. In September, about 55% of Google searches on mobile were “no-click” searches, according to research firm Jumpshot, meaning users never left the results page.
Two typical features on the results page—knowledge panels, which are collections of relevant information about people, events or other things; and featured snippets, which are highlighted results that Google thinks will contain content a user is looking for—are areas where Google engineers make changes to fix results, the Journal found.
In April, the conservative Heritage Foundation called Google to complain that a coming movie called “Unplanned” had been labeled in a knowledge panel as “propaganda,” according to a person familiar with the matter. The film is about a former Planned Parenthood director who had a change of heart and became pro-life.
After the Heritage Foundation complained to a contact at Google, the company apologized and removed “propaganda” from the description, that person said.
Google’s Ms. Levin said the change “was not the result of pressure from an outside group, it was a violation of the feature’s policy.”
On the auto-complete feature, Google reached a confidential settlement in France in 2012 with several outside groups that had complained it was anti-Semitic that Google was suggesting the French word for “Jew” when searchers typed in the name of several prominent politicians. Google agreed to “algorithmically mitigate” such suggestions as part of a pact that barred the parties from disclosing its terms, according to people familiar with the matter.
In recent years, Google changed its auto-complete algorithms to remove “sensitive and disparaging remarks.” The policy, now detailed on its website, says that Google doesn’t allow predictions that may be related to “harassment, bullying, threats, inappropriate sexualization, or predictions that expose private or sensitive information.”
GOOGLE HAS BECOME more open about its moderation of auto-complete but still doesn’t disclose its use of blacklists. Kevin Gibbs, who created auto-complete in 2004 when he was a Google engineer, originally developed the list of terms that wouldn’t be suggested, even if they were the most popular queries that independent algorithms would normally supply.
For example, if a user searched “Britney Spears”—a popular search on Google at the time—Mr. Gibbs didn’t want a piece of human anatomy or the description of a sex act to appear when someone started typing the singer’s name. The unfiltered results were “kind of horrible,” Mr. Gibbs said in an interview.
He said deciding what should and shouldn’t be on the list was challenging. “It was uncomfortable, and I felt a lot of pressure,” said Mr. Gibbs, who worked on auto-complete for about a year, and left the company in 2012. “I wanted to make sure it represented the world fairly and didn’t leave out any groups.”
Google still maintains lists of phrases and terms that are manually blacklisted from auto-complete, according to people familiar with the matter.
The company internally has a “clearly articulated set of policies” about what terms or phrases might be blacklisted in auto-complete, and it follows those rules, according to a person familiar with the matter.
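In outline, the mechanic those people describe is straightforward: candidate completions come from query logs, and anything matching the blacklist is dropped before suggestions are shown. A minimal sketch, with all terms and data invented:

```python
# Minimal sketch of blacklist filtering in an auto-complete pipeline.
# The query log and blacklist contents are invented for illustration.
QUERY_LOG = ["britney spears songs", "britney spears tour",
             "britney spears age", "immigrants are hardworking"]
BLACKLISTED_TERMS = {"age"}  # stand-in for the real, undisclosed list

def suggest(prefix: str, limit: int = 3) -> list[str]:
    # A real system would rank candidates by popularity; here, log order.
    candidates = [q for q in QUERY_LOG if q.startswith(prefix.lower())]
    # Drop any candidate containing a blacklisted term.
    allowed = [q for q in candidates
               if not any(term in q.split() for term in BLACKLISTED_TERMS)]
    return allowed[:limit]

print(suggest("britney"))  # the blacklisted completion never surfaces
```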
Blacklists also affect the results in organic search and Google News, as well as other search products, such as Web answers and knowledge panels, according to people familiar with the matter.
Google has said in congressional testimony it doesn’t use blacklists. Asked in a 2018 hearing whether Google had ever blacklisted a “company, group, individual or outlet…for political reasons,” Karan Bhatia, Google’s vice president of public policy, responded: “No, ma’am, we don’t use blacklists/whitelists to influence our search results,” according to the transcript.
Ms. Levin said those statements were related to blacklists targeting political groups, which she said the company doesn’t keep.
Google’s first blacklists date to the early 2000s, when the company made a list of spam sites that it removed from its index, one of those people said. This means the sites wouldn’t appear in search results.
Engineers known as “maintainers” are authorized to make and approve changes to blacklists. It takes at least two people to do this; one person makes the change, while a second approves it, according to the person familiar with the matter.
The Journal reviewed a draft policy document from August 2018 that outlines how Google employees should implement an anti-misinformation blacklist aimed at blocking certain publishers from appearing in Google News and other search products. The document says engineers should focus on “a publisher misrepresenting their ownership or web properties” and having “deceptive content”—that is, sites that actively aim to mislead—as opposed to those that have inaccurate content.
“The purpose of the blacklist will be to bar the sites from surfacing in any Search feature or news product sites,” the document states.
Ms. Levin said Google does “not manually determine the order of any search result.” She said sites that don’t adhere to Google News “inclusion policies” are “not eligible to appear on news surfaces or in information boxes in Search.”
SOME INDIVIDUALS and companies said changes made by the company seem ad hoc, or inconsistent. People familiar with the matter said Google increasingly will make manual or algorithmic changes that aren’t acknowledged publicly in order to maintain that it isn’t affected by outside pressure.
“It’s very convenient for us to say that the algorithms make all the decisions,” said one former Google executive.
In March 2017, Google updated the guidelines it gives contractors who evaluate search results, instructing them for the first time to give low-quality ratings to sites “created with the sole purpose of promoting hate or violence against a group of people”—something that would help adjust Google algorithms to lower those sites in search.
The next year, the company broadened the guidance to any pages that promote such hate or violence, even if it isn’t the page’s sole purpose and even if it is “expressed in polite or even academic-sounding language.”
Google has resisted entirely removing some content that outsiders complained should be blocked. In May 2018, Ignacio Wenley Palacios, a Spain-based lawyer working for the Lawfare Project, a nonprofit that funds litigation to protect Jewish people, asked Google to remove an anti-Semitic article lauding a German Holocaust denier posted on a Spanish-language neo-Nazi blog.
The company declined. In an email to Mr. Wenley Palacios, lawyers for Google contended that “while such content is detestable” it isn’t “manifestly illegal” in Spain.
Mr. Wenley Palacios then filed a lawsuit, but in the spring of this year, before the suit could be heard, he said, Google lawyers told him the company was changing its policy on such removals in Spain.
According to Mr. Wenley Palacios, the lawyers said the firm would now remove from searches conducted in Spain any links to Holocaust denial and other content that could hurt vulnerable minorities, once they are pointed out to the company. The results would still be accessible outside of Spain. He said both sides agreed to dismiss the case.
Google’s Ms. Levin described the action as a “legal removal” in accordance with local law. Holocaust denial isn’t illegal in Spain, but if it is coupled with an intent to spread hate, it can fall under Spanish criminal law banning certain forms of hate speech.
“Google used to say, ‘We don’t approve of the content, but that’s what it is,’ ” Mr. Wenley Palacios said. “That has changed dramatically.”
Health policy consultant Greg Williams said he helped lead a campaign to push Google to make changes that would stifle misleading results for queries such as “rehab.”
At the time, in 2017, addiction centers with spotty records were constantly showing up in search results, typically the first place family members and addicts go in search of help.
Over the last year, Google several times routed Diane Hentges to call centers as she desperately researched drug addiction treatment centers for her 22-year-old son, she said.
Each time she called one of the facilities listed on Google, a customer-service representative would ask for her financial information, but the representatives didn’t seem to be attached to any legitimate company.
“If you look at a place on Google, it sends you straight to a call center,” Ms. Hentges said, adding that parents who are struggling with a child with addiction “will do anything to get our child healthy. We’ll believe anything.”
After intense lobbying by Mr. Williams and others, Google changed its ad policy around such queries. But addiction industry officials also noticed a significant change to Google search results. Many searches for “rehab” or related terms began returning the website for the Substance Abuse and Mental Health Services Administration, the national help hotline run by the U.S. Department of Health and Human Services, as the top result.
A spokesman for SAMHSA said the agency had a partnership with Google.
Google never acknowledged the change. Ms. Levin said that “resources are not listed because of any type of partnership” and that “we have algorithmic solutions designed to prioritize authoritative resources (including official hotlines) in our results for queries like these as well as for suicide and self-harm queries.”
Google’s search algorithms have been a major focus of Hollywood in its effort to fight pirated TV shows and movies.
Studios “saw this as the potential death knell of their business,” said Dan Glickman, chairman and chief executive of the Motion Picture Association of America from 2004 to 2010. The association has been a public critic of Google. “A hundred million dollars to market a major movie could be thrown away if someone could stream it illegally online.”
Google received a record 1.6 million requests to remove web pages for copyright issues last year, according to the company’s published Transparency Report and a Journal analysis. Those requests pertained to more than 740 million pages, about 12 times the number of web pages it was asked to take down in 2012.
A decade ago, as a concession to the industry, Google removed “download” from its auto-complete suggestions after the name of a movie or TV show, so that at least it wouldn’t be encouraging searches for pirated content.
In 2012, it applied a filter to search results that would lower the ranking of sites that received a large number of piracy complaints under U.S. copyright law. That effectively pushed many pirate sites off the front page of results for general searches for movies or music, although it still showed them when a user specifically typed in the pirate site names.
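That behavior, demoting heavily flagged sites on general queries while still returning them for navigational searches that name the site, can be sketched as a conditional penalty. The threshold, factor and site names below are all assumptions:

```python
# Sketch of the 2012-style piracy demotion: sites with many copyright
# complaints are pushed down for general queries, but a navigational
# query naming the site still surfaces it. All numbers are invented.
COMPLAINT_COUNTS = {"piratestream.example": 12_000, "studio.example": 0}
DEMOTION_THRESHOLD = 1_000
DEMOTION_FACTOR = 0.1

def adjusted_score(site: str, base_score: float, query: str) -> float:
    navigational = site.split(".")[0] in query.lower()
    if COMPLAINT_COUNTS.get(site, 0) > DEMOTION_THRESHOLD and not navigational:
        return base_score * DEMOTION_FACTOR
    return base_score

print(adjusted_score("piratestream.example", 0.9, "watch new movies"))     # 0.09
print(adjusted_score("piratestream.example", 0.9, "piratestream movies"))  # 0.9
```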
In recent months the industry has gotten more cooperation from Google on piracy in search results than at any point in the organization’s history, according to people familiar with the matter.
“Google is under great cosmic pressure, as is Facebook,” Mr. Glickman said. “These are companies that are in danger of being federally regulated to an extent that they never anticipated.”
Mr. Pichai, who became CEO of Google in 2015, is more willing to entertain complaints about the search results from outside parties than Messrs. Page and Brin, the co-founders, according to people familiar with his leadership.
Google’s Ms. Levin said Mr. Pichai’s “style of engaging and listening to feedback has not shifted. He has always been very open to feedback.”
CRITICISM ALLEGING political bias in Google’s search results has sharpened since the 2016 election.
Interest groups from the right and left have besieged Google with questions about content displayed in search results and about why the company’s algorithms returned certain information over others.
Google appointed an executive in Washington, Max Pappas, to handle complaints from conservative groups, according to people familiar with the matter. Mr. Pappas works with Google engineers on changes to search when conservative viewpoints aren’t being represented fairly, according to interest groups interviewed by the Journal, although that is just one part of his job.
“Conservatives need people they can go to at these companies,” said Dan Gainor, an executive at the conservative Media Research Center, which has complained about various issues to Google.
Google also appointed at least one other executive in Washington, Chanelle Hardy, to work with outside liberal groups, according to people familiar with the matter.
Ms. Levin said both positions have existed for many years. She said in general Google believes it’s “the responsible thing to do” to understand feedback from the groups and said Google’s algorithms and policies don’t attempt to make any judgment based on the political leanings of a website.
Mr. Pappas declined to comment, and Ms. Hardy didn’t reply to a request for comment.
Over the past year, abortion-rights groups have complained about search results that turned up the websites of what are known as “crisis pregnancy centers,” organizations that counsel women against having abortions, according to people familiar with the matter.
One of the complaining organizations was Naral Pro-Choice America, which tracks the activities of anti-abortion groups through its opposition research department, said spokeswoman Kristin Ford.
Naral complained to Google and other tech platforms that some of the ads, posts and search results from crisis pregnancy centers are misleading and deceptive, she said. Some of the organizations claimed to offer abortions and then counseled women against it. “They do not disclose what their agenda is,” Ms. Ford said.
In June, Google updated its advertising policies related to abortion, saying that advertisers must state whether they provide abortions or not, according to its website. Ms. Ford said Naral wasn’t told in advance of the policy change.
Ms. Levin said Google didn’t implement any changes with regard to how crisis pregnancy centers rank for abortion queries.
The Journal tested the term “abortion” in organic search results over 17 days in July and August. Thirty-nine percent of all results on the first page had the hostname www.plannedparenthood.org, the site of Planned Parenthood Federation of America, the nonprofit, abortion-rights organization.
By comparison, 14% of Bing’s first page of search results and 16% of DuckDuckGo’s first page of results were from Planned Parenthood.
Ms. Levin said Google doesn’t have any particular ranking implementations aimed at promoting Planned Parenthood.
The practice of creating blacklists for certain types of sites or searches has fueled cries of political bias from some Google engineers and right-wing publications that said they have viewed portions of the blacklists. Some of the websites Google appears to have targeted in Google News were conservative sites and blogs, according to documents reviewed by the Journal. In one partial blacklist reviewed by the Journal, some conservative and right-wing websites, including The Gateway Pundit and The United West, were included on a list of hundreds of websites that wouldn’t appear in news or featured products, although they could appear in organic search results.
Google has said repeatedly it doesn’t make decisions based on politics, and current and former employees told the Journal they haven’t seen evidence of political bias. And yet, they said, Google’s shifting policies on interference—and its lack of transparency about them—inevitably force employees to become arbiters of what is acceptable, a dilemma that opens the door to charges of bias or favoritism.
Google’s Ms. Levin declined to comment.
DEMANDS FROM GOVERNMENTS for changes have grown rapidly since 2016.
From 2010 to 2018, Google fielded such requests from countries including the U.S. to remove 685,000 links from what Google calls web search. The requests came from courts or other authorities that said the links broke local laws or should be removed for other reasons.
Nearly 78% of those removal requests have been since the beginning of 2016, according to reports that Google publishes on its website. Google’s ultimate actions on those requests weren’t disclosed.
Russia has been by far the most prolific, demanding the removal of about 255,000 links from search last year, three-quarters of all government requests for removal from Google search in that period, the data show. Nearly all of the country’s requests came under an information-security law Russia put into effect in late 2017, according to a Journal examination of disclosures in a database run by the Berkman Klein Center.
Google said the Russian law doesn’t allow it to disclose which URLs were requested to be removed. A person familiar with the matter said the removal demands are for content ruled illegal in Russia for a variety of reasons, such as for promoting drug use or encouraging suicide.
Requests can include demands to remove links to information the government defines as extremist, which can be used to target political opposition, the person said.
Google, whose staff reviews the requests, at times declines those that appear focused on political opposition, the person said, adding that in those cases, it tries not to draw attention to its decisions to avoid provoking Russian regulators.
The approach has led to stiff internal debate. On one side, some Google employees say that the company shouldn’t cooperate at all with takedown requests from countries such as Russia or Turkey. Others say it is important to follow the laws of countries where they are based.
“There is a real question internally about whether a private company should be making these calls,” the person said.
Google’s Ms. Levin said, “Maximizing access to information has always been a core principle of Search, and that hasn’t changed.”
Google’s culture of publicly resisting demands to change results has diminished, current and former employees said. A few years ago, the company dismantled a global team focused on free-speech issues that, among other things, publicized the company’s legal battles to fight changes to search results, in part because Google had lost several of those battles in court, according to a person familiar with the change.
“Free expression was no longer a winner,” the person said.