
Topic

Social media


Displaying 31 - 45 of 147 news clips related to this topic.

Fast Company

Fast Company reporter Mark Wilson writes that a new study by researchers from MIT and Google finds that simple user experience interventions can help stop people from sharing misinformation on Covid-19. “Researchers introduced several different prompts through a simple popup window, all with a single goal: to get people to think about the accuracy of what they’re about to share,” writes Wilson. “When primed to consider a story’s accuracy, people were up to 20% less likely to share a piece of fake news.”

Fast Company

Fast Company reporter Arianne Cohen writes that a new study by MIT researchers explores how polite corrections to online misinformation can lead to further sharing of incorrect information. The researchers found that after being politely corrected for sharing inaccurate information, “tweeters’ accuracy declined further—and even more so when they were corrected by someone matching their political leanings.”

Boston Globe

A new study by MIT researchers finds that attempting to correct misinformation on social media can lead to users sharing even less accurate information, reports Hiawatha Bray for The Boston Globe. “Being publicly corrected by another person makes them less attentive to what they retweet,” explains Prof. David Rand, “because it shifts their attention not to accuracy but toward social things like being embarrassed.”

Motherboard

A new study by MIT researchers finds that correcting people who were spreading misinformation on Twitter led to people retweeting and sharing even more misinformation, reports Matthew Gault for Motherboard. Prof. David Rand explains that the research is aimed at identifying “what kinds of interventions increase versus decrease the quality of news people share. There is no question that social media has changed the way people interact. But understanding how exactly it's changed things is really difficult.” 

Slate

Graduate student Crystal Lee speaks with Slate reporter Rebecca Onion about a new study that illustrates how social media users have used data visualizations to argue against public health measures during the Covid-19 pandemic. “The biggest point of diversion is the focus on different metrics—on deaths, rather than cases,” says Lee. “They focus on a very small slice of the data. And even then, they contest metrics in ways I think are fundamentally misleading.”

The New Yorker

New Yorker reporter Benjamin Wallace-Wells spotlights new research from the MIT Initiative on the Digital Economy, which shows “just telling people the accurate immunization rates in their country increased, by five per cent, the number who said that they would get the vaccine.”

Fox News

A new study by MIT researchers finds that political beliefs can help bring people together on social media networks, reports Brooke Crothers for Fox News. On both sides, users were roughly three times more likely to form social ties with strangers who identify with the same party, compared to “counter-partisans.”

The Washington Post

In an opinion piece for The Washington Post, Prof. Sinan Aral discusses the rise of GameStop stock and the real-world impact of social media. “The past week’s events exposed several potential sources of economic instability,” writes Aral. “If the social media crowd’s opinion alone drives market value, the market goes where the herd takes it, without the constraints of economic reality.”

TechCrunch

TechCrunch reporter Devin Coldewey writes that a new study co-authored by MIT researchers finds that debunking misinformation is the most effective method of addressing false news on social media platforms. “The team speculated as to the cause of this, suggesting that it fits with other indications that people are more likely to incorporate feedback into a preexisting judgment rather than alter that judgment as it’s being formed,” writes Coldewey. 

Forbes

Forbes contributor Wayne Rush spotlights Prof. David Rand’s research examining how to most effectively combat the spread of misinformation. “They forget to think about whether it’s true, but rather how many likes they’ll get,” says Rand of why people share misinformation on social media. “Another feature of social media is that people are more likely to be friends with people who share common ideas.”

Fortune

Prof. Sinan Aral speaks with Fortune reporter Danielle Abril about how social media companies can more effectively respond to misinformation and hate speech, following the attack on the U.S. Capitol. “This has been a steady momentum build of reaction by social media platforms,” says Aral. “This is a culmination of an understanding of social media companies that they need to do more [and] that the laissez-faire attitude isn’t going to cut it.”

Yahoo! News

Prof. Sinan Aral discusses the role of social media during the attack on the U.S. Capitol. Aral notes that social media companies “have a responsibility to make sure that any information that is advocating violence, supporting violence, advocating the violent overthrow of the government, and so on, be stemmed. This is a content moderation decision.”

Wired

Prof. Sinan Aral’s new book, “The Hype Machine,” has been selected as one of the best books of the year about AI by Wired. Gilad Edelman notes that Aral’s book is “an engagingly written shortcut to expertise on what the likes of Facebook and Twitter are doing to our brains and our society.”

TechCrunch

Prof. Sinan Aral speaks with Danny Crichton of TechCrunch about his new book, “The Hype Machine,” which explores the future of social media. Aral notes that he believes a starting point “for solving the social media crisis is creating competition in the social media economy.” 

New York Times

Prof. Sinan Aral speaks with New York Times editorial board member Greg Bensinger about how social media platforms can reduce the spread of misinformation. “Human-in-the-loop moderation is the right solution,” says Aral. “It’s not a simple silver bullet, but it would give accountability where these companies have in the past blamed software.”