Microsoft vs. Deepfakes: The Company Announces New Software to Detect Fakes


With fewer than 100 days before the U.S. presidential election, Microsoft announced it has developed a new way to combat disinformation on the internet, including a new system of detecting deepfakes — synthetic audio or video that mimics a real recording.

Microsoft said Tuesday it is launching the “Microsoft Video Authenticator,” which it says can analyze photos and videos and provide a confidence score indicating whether the media has been manipulated. The authenticator will either alert people when an image is likely fake or reassure them when it’s authentic, Microsoft said.
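The workflow described, a manipulation-confidence score that is turned into a user-facing alert, can be sketched roughly as follows. The function name and thresholds here are illustrative assumptions for this article, not part of Microsoft’s actual tool or API.

```python
# Hypothetical sketch of score-to-alert logic like that described above.
# The thresholds (0.7 / 0.3) are arbitrary illustrative values.

def classify_media(confidence: float,
                   fake_threshold: float = 0.7,
                   authentic_threshold: float = 0.3) -> str:
    """Map a 0-1 manipulation-confidence score to a user-facing label."""
    if confidence >= fake_threshold:
        return "likely manipulated"
    if confidence <= authentic_threshold:
        return "likely authentic"
    return "inconclusive"

print(classify_media(0.92))  # likely manipulated
print(classify_media(0.10))  # likely authentic
print(classify_media(0.50))  # inconclusive
```

A real detector would produce the confidence score itself from a trained model; this sketch only shows how such a score might be surfaced to a viewer.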

“The fact that they [the deepfakes] are generated by A.I. that can continue to learn makes it inevitable that they will beat conventional detection technology,” the company said in a statement. “However, in the short run, such as the upcoming U.S. election, advanced detection technologies can be a useful tool to help discerning users identify deepfakes.”

Microsoft said the new software was built in partnership with the Defending Democracy Program, which fights disinformation, protects voting, and secures campaigns.

Tech and privacy advocates have been sounding the alarm about deepfakes and their political implications for several years, as the fakes have become noticeably harder to detect. Some companies have even started developing deepfake services, ostensibly for entertainment purposes.

In February, Twitter announced it would ban media that was “synthetic or fake,” and Facebook made a similar move in January.
