In recent years, there has been an explosion of research trying to understand misinformation: what it is, how it operates, and what impacts it has on the world. On the surface, this roiling field seems to produce as many paradoxes and conflicting results as it does potential insights. For example, some studies suggest that bots (internet robots) play a limited role, whereas other studies suggest that bots drive the diffusion of misinformation. It is ironic that the field of research on misinformation has come to resemble the very thing it studies. What is true? What is actually known about misinformation and its impacts on society? A single research paper may interrogate only one aspect of what is a complex misinformation machine, making it tempting to see other papers as providing competing views, when they are, in fact, often entirely complementary windows into a much larger process. Grinberg et al. illustrate the necessity of thinking of misinformation as a process.
Grinberg et al. show that online, mostly political misinformation is shared and seen by only a very small fraction of Twitter users, largely confined to small communities that engage with questionable media sources. To do this, they used a clever method to find humans (as opposed to bots) on Twitter: They matched U.S. voter registration records against Twitter accounts. Each Twitter user's political orientation was then estimated using the celebrity and news accounts they followed.
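To make the orientation-estimation step concrete, here is a minimal Python sketch of the general idea: score a user by the known leanings of the elite accounts they follow. The account names and scores are invented for illustration, and the real paper uses a more sophisticated statistical model, not this simple average.

```python
# Hypothetical sketch: infer a user's political leaning from the elite
# (celebrity/news) accounts they follow. Not the authors' actual model.

# Assumed ideology scores for a few elite accounts, on a -1 (left) to
# +1 (right) scale. These names and numbers are made up for illustration.
ELITE_SCORES = {
    "news_outlet_a": -0.6,
    "news_outlet_b": 0.7,
    "pundit_c": -0.3,
    "pundit_d": 0.9,
}

def estimate_leaning(followed_accounts):
    """Average the known ideology scores of the elite accounts a user follows."""
    scores = [ELITE_SCORES[a] for a in followed_accounts if a in ELITE_SCORES]
    if not scores:
        return None  # no signal: user follows no scored elite accounts
    return sum(scores) / len(scores)

# Example: a user following one left-leaning outlet and one right-leaning pundit
print(estimate_leaning(["news_outlet_a", "pundit_d"]))  # 0.15 -> slightly right of center
```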
There is a key blind spot in the current research: rumors. Although there has been work on the broad phenomenon of rumoring online and its connection to misinformation, there is a serious need for a better understanding of how fake news stories transform into rumors and to what extent these rumors can amplify beliefs and infiltrate other communities.
Progress here might help explain one of the most curious and unexplained findings of the Grinberg et al. paper: that conservatives are significantly more inclined to share and see fake news than liberals. Perhaps this is the whole story: Conservatives have a weakness for fake news. More likely, though, is that liberals embed misinformation in different ways and spread it through means that we, as of now, do not have reliable ways of measuring. When we begin to uncover these mechanisms, it will be important to place them within the context of the much larger misinformation system within which they operate.
Here is the Grinberg et al. abstract:
The spread of fake news on social media became a public concern in the United States after the 2016 presidential election. We examined exposure to and sharing of fake news by registered voters on Twitter and found that engagement with fake news sources was extremely concentrated. Only 1% of individuals accounted for 80% of fake news source exposures, and 0.1% accounted for nearly 80% of fake news sources shared. Individuals most likely to engage with fake news sources were conservative leaning, older, and highly engaged with political news. A cluster of fake news sources shared overlapping audiences on the extreme right, but for people across the political spectrum, most political news exposure still came from mainstream media outlets.
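The abstract's headline numbers describe an extreme concentration of exposure. As a rough illustration of what such a statistic means, the sketch below computes the share of all exposures attributable to the most heavily exposed x% of users from a list of per-user exposure counts; the counts are fabricated for the example and this is not the paper's code.

```python
# Illustrative sketch (not the paper's code): given per-user counts of
# fake-news exposures, compute what share of all exposures the most
# heavily exposed fraction of users account for.

def top_share(exposures, top_fraction=0.01):
    """Fraction of total exposures attributable to the top `top_fraction` of users."""
    ranked = sorted(exposures, reverse=True)
    k = max(1, int(len(ranked) * top_fraction))
    total = sum(ranked)
    return sum(ranked[:k]) / total if total else 0.0

# Made-up example: 1,000 users, a handful of whom saw far more fake news than the rest
counts = [500] * 10 + [1] * 990          # 10 heavy consumers, 990 light ones
print(top_share(counts, 0.01))           # top 1% (10 users) -> ~0.83 of all exposures
```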
from Deric's MindBlog http://bit.ly/2RmyBqR