Friday, 23 August 2019

Dr. Epstein, Political Bias, & Google Search Results

I’m a little confused by Dr. Robert Epstein’s assertion, based upon a single study of 95 participants, that Google somehow intentionally biased the results shown before the 2016 U.S. presidential election, and that this likely affected the outcome of the election itself.

That’s a huge assertion to make. One would hope that an esteemed researcher such as Dr. Epstein would have the scientific data to back it up. Unfortunately, I don’t see it.

Science is only objective to the extent that a scientist acknowledges and accounts for her or his own biases. Science is not based on a preset agenda, or an attempt to settle a score. I’m not certain Dr. Epstein has kept his own biases in check in his apparent witch hunt to take down Google for offering “biased” search results.

Search Engines Have Always Been Biased

Google has always offered biased search results. If you don’t understand that this has to be the case with any search engine, then you might need a quick refresher course on how search engines work.

There is no such thing as unbiased search results. All search engines use proprietary trade-secret algorithms to ensure you see what the search engine company believes makes for the “best” results. “Best” has — since the beginning of search engines online back in the early 1990s — always been a subjective term. There is no single objective ranking of websites that says, “Always show this website first for this search engine because it is clearly the best result.”

And guess what — people love that! That’s why Google is on top of the search engine pile, because it does indeed offer the results that are apparently the most relevant to most people. The minute Google stops offering such relevant results, a new search engine can and will take its place. (Anyone remember Alta Vista, Excite, or even Yahoo? [And no, Yahoo doesn’t do search anymore — its results are provided by Bing.])

What Does Bias in Search Engine Results Look Like?

Unbeknownst to many, search engines don’t show the exact same results for the same query asked by two different people. Most search engines, including Google, use personalization factors and a psychographic profile of each user to further sort and present the results they think are most relevant to you.

In practice, this means that my search for “depression symptoms” may return a different result set than your search on the exact same terms. If you don’t carefully control for this in your methodology, your results will be meaningless and tainted.
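To make that concrete, here is a small, purely illustrative sketch in Python of how a personalization layer can re-order the same underlying results for two different people. The scores, topics, and profile weights are invented for the example; this is not Google’s actual algorithm, just the general shape of the problem a researcher has to control for.

    # Illustrative only: a toy re-ranking pass, not any real search engine's algorithm.
    # Two users issue the identical query; a personalization "boost" derived from
    # their (hypothetical) interest profiles re-orders the same base results.

    base_results = [
        {"url": "site-a.example/depression-symptoms", "base_score": 0.92, "topic": "medical"},
        {"url": "site-b.example/mood-self-test",      "base_score": 0.90, "topic": "self-help"},
        {"url": "site-c.example/forum-thread",        "base_score": 0.88, "topic": "community"},
    ]

    user_profiles = {
        "user_1": {"self-help": 0.05},   # hypothetical interest weights
        "user_2": {"community": 0.06},
    }

    def personalized_ranking(results, profile):
        """Sort results by base relevance plus a per-topic personalization boost."""
        return sorted(
            results,
            key=lambda r: r["base_score"] + profile.get(r["topic"], 0.0),
            reverse=True,
        )

    for user, profile in user_profiles.items():
        order = [r["url"] for r in personalized_ranking(base_results, profile)]
        print(user, "->", order)  # same query, two different orderings

Run it and the two users get the same three pages in different orders, which is exactly why pooling raw result lists across participants without accounting for personalization is a problem.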

Epstein & Robertson (2015) found, in a series of laboratory (not real-world) experiments, that when they artificially manipulated search engine results pages, they could influence subjects’ voting preferences over a short period of time. The study did not examine any actual search engine results pages, and it ignored the layout and makeup of modern search engine results pages. Real result pages feature multiple advertisements (which anyone can purchase) at the top of the page, before any organic results.

These researchers’ results are not surprising in that they echo what any search engine optimization (SEO) expert would tell you — position matters on a search engine results page. Websites get tons more traffic if they are #1, #2, or #3 versus #9 — or worse yet, on the second page of results.
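The arithmetic behind that is simple. The click-through rates below are hypothetical round numbers, not measured data, but they show why dropping a few positions translates into a large loss of traffic:

    # Hypothetical click-through rates (CTR) by result position -- illustrative
    # round numbers, not measured data.
    hypothetical_ctr = {1: 0.30, 2: 0.15, 3: 0.10, 9: 0.02, 11: 0.01}  # 11 ~ top of page two
    monthly_searches = 100_000  # assumed query volume, also hypothetical

    for position, ctr in hypothetical_ctr.items():
        print(f"Position {position:>2}: about {int(monthly_searches * ctr):,} visits per month")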

In a second laboratory experiment, the same researchers demonstrated methods (again, using a completely fake search engine — not Google) by which the effect they coined — the Search Engine Manipulation Effect (SEME) — could be suppressed (through timely alerts shown to users).

Google Helped Hillary Win?

In 2017, Epstein & Robertson weren’t content to demonstrate the obvious any longer — that ranking positions matter on search engine results pages. They took it a step further and examined the 2016 search habits of 95 Americans (only 21 of whom identified as “undecided” in the upcoming presidential election).

In a white paper published only to their own website, Epstein & Robertson make the extraordinary claim:

[…W]e have found that between May and November 2016, search results displayed in response to a wide range of election-related search terms were, on average, biased in Mrs. Clinton’s favor in all 10 search-result positions.

Published as a “white paper” rather than as a peer-reviewed journal study, the claim raised a bunch of red flags.[1]

There was little in the way of methodology explained in the study. There is no information about what was done to limit the personalization of search results (a confounding variable you would want to control for), nor about what search terms were actually used. In fact, reading the two previous studies these researchers published, it’s not even clear they’re aware of how search engines work in terms of their monetization strategies, their constant weekly algorithm changes, and their personalization of search results.
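For what it’s worth, controlling for personalization doesn’t require anything exotic. Here is a minimal sketch — entirely my own, not the authors’ method — of one sanity check a researcher could run before pooling participants’ data: measure how much two participants’ result lists for the same query actually overlap.

    # My sketch, not the authors' method: before pooling participants' data,
    # check how similar two people's result lists actually are for the same query.
    # Low overlap means personalization is in play and must be accounted for.

    def overlap_at_k(list_a, list_b, k=10):
        """Fraction of the top-k URLs that appear in both result lists."""
        top_a, top_b = set(list_a[:k]), set(list_b[:k])
        return len(top_a & top_b) / k

    user_1_results = ["a.example", "b.example", "c.example", "d.example"]  # hypothetical
    user_2_results = ["b.example", "a.example", "e.example", "f.example"]  # hypothetical

    print(overlap_at_k(user_1_results, user_2_results, k=4))  # 0.5 -> heavy personalization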

There is also some apparent sloppiness in the researchers’ efforts, in my opinion. No rationale is given for the specific 25-day period they chose to examine in the study, versus any other period of time. And in fact, they acknowledge they didn’t look all that closely at the majority of the data points they had gathered: the researchers ignored 7 months’ worth of research data to focus only on the 3 weeks before the election.[2]

They also made the decision, post hoc, to discard all Gmail.com-based data because of anomalies in that data. Those anomalies happened to show no such bias, which they attributed to either a set of “bots” or — wait for it — intentional sabotage on Google’s part.

Since there’s a significant minority of legitimate users who use Gmail, these rationales to throw out all Gmail.com-derived data seem questionable at best. It is, in my opinion, a horrible research decision to have made, but one that coincidentally also ensured that the researchers found significance in their data.

But here’s the real kicker:

Extrapolating from the mathematics introduced in this report, in articles published in February 2016 and thereafter, the lead author of the PNAS study predicted that a pro-Clinton bias in Google’s search results would, over time, shift at least 2.6 million votes to Clinton.

There is zero mathematics in their white paper. There are a bunch of descriptive statistics, but those statistics barely speak to what procedures or modeling the researchers actually used to arrive at the conclusions that they did.

The researchers’ “evidence of systematic bias” in the 2016 presidential election? A small sampling of modeling data based upon 95 Americans (minus the Gmail.com users whose data they tossed post hoc).

In short, in my opinion this is exactly the kind of shoddy, shady, horribly designed research that passes for “proof” in this day and age. Why would researchers conduct such a seemingly politically biased study, and draw conclusions that they have no actual direct proof of?[3]

Perhaps There’s an Axe to Grind?

Researchers are human. And humans sometimes have an axe to grind. You don’t have to look far to find one of Epstein’s possible axes.

Prior to 2012, Epstein showed little interest in search engines or how they worked. He published on a wide variety of psychological, relationship, and mental health topics and wrote about them for mainstream websites.

Then in early 2012, users who tried to reach Epstein’s personal website through Google began seeing a malware warning. Google displays these alerts to steer users away from potentially malicious websites.

But this incident apparently got under Epstein’s skin in some way, because by the fall of 2012 he was suddenly writing multiple articles about the need to regulate Google. This from a researcher who had never before written a single word about search engines. I find the timing interesting.

In short, Epstein has been advocating for the federal government’s regulation of Google for the past seven years. It wouldn’t be too hard to imagine a hypothetical researcher designing studies to support her or his beliefs.

The Upshot of Search Engine Bias

Search engines have always been biased, and always will be because they are subjective tools meant to help get users to information or entertainment. The minute big government wants to start overseeing my search results is the minute I turn to a search engine where such government filtering isn’t done.

It also helps to keep in mind the difference between hypothetical meddling and real meddling in U.S. politics. While Epstein insinuates that Google manipulated its political search results to favor candidates it wanted elected to office, we have actual proof of real meddling in the 2016 presidential election on Facebook, where Russian-sponsored organizations purchased millions of dollars’ worth of false advertising on the platform.

Interestingly, Epstein doesn’t seem to have much interest in that. Maybe that’s because Facebook has never wronged him as Google once did.

 

For further information

Politifact: Donald Trump wrong on Google manipulating election results

References

Epstein & Robertson. (2017). Suppressing the Search Engine Manipulation Effect (SEME). Proc. ACM Hum.-Comput. Interact., 1(2), 42.

Epstein & Robertson. (2017). A Method for Detecting Bias in Search Rankings, with Evidence of Systematic Bias Related to the 2016 Presidential Election. White paper published by AIBRT, Epstein’s organization.

Epstein & Robertson. (2015). The search engine manipulation effect (SEME) and its possible impact on the outcomes of elections. PNAS. doi:10.1073/pnas.1419828112

Footnotes:

  1. When asked about the lack of peer-reviewed studies, Epstein replied to me, “I also have problems of both urgency and quantity: I’ve completed or have in progress so many different studies of new forms of online influence (I’m studying seven different types of influence at the moment — SEME and six others) that I’ve decided to summarize my findings in conference papers, white papers and, at some point, in book form, rather than spend what little time remains to me on the painfully slow academic publications process. When I stumble onto another new form of online influence, it takes me a year or two, at least, to understand and quantify it. (I haven’t even gotten around to beginning experiments on a half dozen new forms of influence I know about.) Adding another year or two onto that process to publish in a journal seems imprudent given my age and given how potentially important these discoveries are for humanity.”
  2. The researchers claimed this was due to what they said were recruiting issues and the refining of their procedures. Which raises the question: shouldn’t their procedures have been refined in a pilot study first, as most researchers would have done?
  3. Or, if you want to be pedantic, have minimal proof of based upon a tiny sample of just 95 users’ searches — minus some number of Gmail.com subjects — over the course of 25 days.


