In the era of disinformation, social media users are forced to view their feeds with an increasingly skeptical eye in order to separate fact from fiction.

But the latest trend in online manipulation tests the fundamental belief that "seeing is believing."

Deepfakes -- videos that have been manipulated using artificial intelligence (AI) -- could become the next big threat to truth, according to experts.

The technology is used to generate falsified videos either by swapping one face for another, or by mapping someone's motions and voice onto another person. In other words, it allows people to fabricate videos of people -- or, more problematically, politicians -- saying things they never said, or doing things they never did.

Experts say these videos could pose a very real risk when placed in the hands of those who aim to sway public opinion -- especially in the midst of a federal election.

"The architects of these sorts of online manipulated media often play upon the fears and biases of a certain subset of the electorate that they are trying to impact," John Villasenor, professor of engineering at UCLA, told CTVNews.ca by phone from Los Angeles Monday.

"If you happen to believe that a particular candidate is terrible, then you will be more willing to give credence to a video that confirms that bias."

Deepfakes have already made their way into the political sphere in the U.S., the most notable example being a video of House Speaker Nancy Pelosi attacking President Donald Trump that was doctored to make it appear she was slurring her words.

Though Villasenor referred to this video as more of a "cheapfake" than a "deepfake" because it was only altered to affect Pelosi's speech, it was still viewed more than two million times, sparking outrage and suggestions that Pelosi was drunk.

Canadian politicians have also been targeted by deepfakes.

In June, multiple deepfakes featuring Conservative Leader Andrew Scheer and Ontario Premier Doug Ford were posted online. One of the videos featured Scheer's face dubbed over a 1980s public service announcement about drug use, presented by Pee-wee Herman.

Though the videos err on the side of humour rather than deception, the Communications Security Establishment (CSE), Canada's cybersecurity agency, warns that deepfakes are a threat to political parties and candidates.

The agency included deepfakes in the 2019 update to its Cyber Threats to Canada's Democratic Process report, noting: "Foreign adversaries can use this new technology to try to discredit candidates, and influence voters by, for example, creating forged footage of a candidate delivering a controversial speech or showing the candidate in embarrassing situations."

While Villasenor says that the public needs to be more discerning of the content they view on social media, he warns that an increase in fake videos could lead more people to call authentic videos into question.

"We're moving into an era where we all need to keep in mind that just because we see something on video doesn't mean it actually happened in how it appears in the video," said Villasenor.

"[But] another side effect of this is that things that actually happened can be called into question by someone who might want to deny the reality of the event. It not only offers opportunity for people who might want to fabricate things; it undermines our understanding of truth from both directions."

While the risk of deepfakes popping up throughout the federal election campaign is real, not all experts agree the tech poses the risk of swinging Canada鈥檚 election results.

"Deepfake technology has the ability to create chaos, especially if it was targeted at a high-profile politician in a tight race. However, the technology is still not quite sophisticated enough to justify the cost," Claire Wardle, whose organization works to address the challenges of disinformation, told CTVNews.ca by email Tuesday.

"There are many cheaper (read: free) ways to cause harm."

Wardle noted that manipulated audio clips also present a significant risk moving forward.

"We're much more trusting of our ears and it's easier to fool people than video, where the quality has to be incredibly high," Wardle said.
