Videos deemed to have been edited using machine-learning algorithms, and that are realistic enough to fool people into believing a fictitious situation, will be removed from the antisocial platform, we're assured.
As the US presidential election looms, the social network announced this week that it will enforce the removal of manipulated material that could be used to trick the average user, though it stressed that the policy will not extend to content intended as parody or satire.
One example of the distinction between content that contravenes the guidelines and content that does not is the video, shared last year, of House Speaker Nancy Pelosi appearing to slur her speech. The company noted that all posts, including videos, that violate its Community Guidelines will still be removed.
Democratic Florida Rep. Darren Soto also pressed Bickert on why Facebook wouldn't take down the Pelosi video.
Content labelled "false" is not always removed from newsfeeds but is downgraded so fewer people see it - alongside a warning explaining why the post is misleading.
Experts said the crudely edited clip was more of a "cheap fake" than a deepfake.
Deepfakes are now banned on Facebook and Instagram, the company announced in a blog post Monday night. Google, which owns YouTube, is also researching how to better detect deepfakes and other manipulated media.
Explaining the new policy, Facebook's Monika Bickert, veep of global policy management, remarked: "Videos that don't meet these standards for removal are still eligible for review by one of our independent third-party fact-checkers".
Under Facebook's former policy, the Pelosi video was flagged as "false" but was not taken down.
The Pelosi video wasn't technically a deepfake, which would mean it was completely fabricated, but it still gave Facebook a taste of the kinds of misinformation it'll face heading into the 2020 election.
These so-called deepfakes make people appear to say or do things they never actually said or did.
Twitter, which has been another hotbed for misinformation and altered videos, said it is in the process of creating a policy for "synthetic and manipulated media", which would include deepfakes and other doctored videos. Not all deepfakes involve words: some fabricate actions instead, such as deepfake sex videos.
A Facebook spokesperson initially told CNN Business that a politician would be allowed to use a deepfake video in a paid ad, but corrected themselves after this article was first published, saying that no manipulated media, including deepfakes, would be allowed in a politician's ad under Facebook policy. Even so, the ban won't apply across the board: only manipulated media created to deceive will be removed.