
Faked videos support false beliefs about Biden's mental health


The Trump campaign and its surrogates have seized on the age of Democratic nominee Joe Biden and have consistently portrayed him as mentally unfit for the presidency.


This article is republished here with permission from The Conversation. This content is shared here because the topic may interest Snopes readers; it does not, however, represent the work of Snopes fact-checkers or editors.


From Ronald Reagan in 1984 to Bob Dole in 1996 and even Hillary Clinton in 2016, candidate health has become a common theme across recent U.S. presidential campaigns.

The issue is poised to take on added significance this fall. No matter who wins, the U.S. is set to inaugurate its oldest president by a wide margin.

But disinformation is unlikely to reach everyone equally. Research from 2016 found that people were most likely to engage with disinformation when it supported their preferred candidate, an observation especially true for Trump supporters. If this pattern extends to 2020, these videos might serve mostly to reinforce Trump voters' beliefs about Biden's cognitive decline rather than create new doubts within the wider electorate.

Disinformation can also affect campaigns beyond swaying voters. It can influence the agendas of news outlets. If manipulated videos succeed in bringing questions about Biden’s cognitive capabilities into the spotlight, they could detract from the Biden campaign’s core message by pressing the campaign to reassure voters about his mental health. The campaign has had to respond to these questions even before the recent circulation of the manipulated videos.

Altered video arms race

Deepfakes and cheapfakes have the potential to affect how people see and understand the world. The threats, whether to election integrity or international security, are real and have caught the attention of Congress and the Pentagon.

There are several technological efforts aimed at spotting and ultimately blocking altered videos. There has been some progress, but it’s a difficult problem. The technology is evolving into an arms race between the fakers and the detectors. For example, after researchers developed a way to identify deepfakes by looking at eye-blinking patterns, the technology adapted.

There are also efforts by the news media to come to grips with altered video in the fact-checking process. The Washington Post has developed a fact-checkers' guide called Seeing Isn't Believing, and Duke University's Reporters' Lab is developing MediaReview, a system for fact-checkers to tag manipulated videos and alert search engines and social media platforms.

If the fakers pull ahead of the detectors in this altered video arms race, the 2020 election could come to be seen as the start of an era when people can no longer be certain that what they see is what they can believe.


Dustin Carnahan, Assistant Professor of Communication, Michigan State University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
