The US Department of Defense is investing in deepfake detection
“This work represents an important step forward in strengthening our information advantage as we combat sophisticated disinformation campaigns and synthetic media threats,” Bustamante said. Hive was selected from a field of 36 companies to test its deepfake detection and attribution technology with the Department of Defense. The contract could enable the department to detect and counter AI deception at scale.
Kevin Guo, CEO of Hive AI, said that defending against deepfakes is a matter of “life and death.” “This is how cyber warfare has evolved,” he said.
Hive’s technology has been trained on a vast amount of content, some of it generated by artificial intelligence and some of it not. It picks up signals and patterns in AI-generated content that are invisible to the human eye but detectable by AI models.
“It turns out that every image produced by these generators has this pattern if you know where to look,” Guo said. The Hive team is constantly tracking new models and updating its technology accordingly.
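Hive has not disclosed which signals its detector relies on, but the general approach, training a classifier to separate real images from generated ones using statistical artifacts that humans cannot see, can be sketched briefly. The example below is purely illustrative and is not Hive’s method: the frequency-domain features, the folder layout (“images/real” and “images/generated”), and the use of scikit-learn are all assumptions made for the sketch.

```python
# Illustrative sketch only, NOT Hive's proprietary detector.
# Assumes a hypothetical folder layout: images/real/*.png and images/generated/*.png
import numpy as np
from pathlib import Path
from PIL import Image
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def spectral_features(path, size=256, bins=64):
    """Radially averaged power spectrum of a grayscale image.
    Some generators leave periodic upsampling artifacts that appear as
    bumps in the high-frequency end of this curve."""
    img = Image.open(path).convert("L").resize((size, size))
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(np.asarray(img, dtype=float))))
    cy, cx = size // 2, size // 2
    y, x = np.indices(spectrum.shape)
    r = np.hypot(y - cy, x - cx).astype(int)
    counts = np.bincount(r.ravel())
    radial = np.bincount(r.ravel(), weights=spectrum.ravel()) / np.maximum(counts, 1)
    profile = np.log1p(radial[: size // 2])
    # Resample the profile into a fixed-length feature vector.
    return np.interp(np.linspace(0, len(profile) - 1, bins),
                     np.arange(len(profile)), profile)

def load_dataset(root):
    X, y = [], []
    for label, sub in enumerate(["real", "generated"]):  # hypothetical labels
        for p in Path(root, sub).glob("*.png"):
            X.append(spectral_features(p))
            y.append(label)
    return np.array(X), np.array(y)

if __name__ == "__main__":
    X, y = load_dataset("images")  # hypothetical dataset location
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
```

A production system would use far richer features and much larger, continually refreshed training sets, which is why Hive says it tracks new generators and retrains its models as they appear.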
The US Department of Defense said in a statement that the tools and methods developed through this initiative have the potential not only to solve defense-specific challenges but also to protect civilian agencies from disinformation, fraud, and deception.
Siwei Lyu, a professor of computer science and engineering at the University at Buffalo, said Hive’s technology provides state-of-the-art performance in detecting content generated by artificial intelligence. He was not involved in Hive’s work but tested its detection tools.
Ben Zhao, a professor at the University of Chicago who has also independently evaluated Hive AI’s deepfake detection technology, agrees, but points out that it is far from foolproof.
“Hive is certainly better than most commercial entities and some of the research techniques we’ve tried, but we’ve also shown that it’s not difficult to circumvent,” Zhao said. His team found that attackers could tamper with images in ways that bypass Hive’s detection.
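The article does not describe the exact manipulations Zhao’s team used, but the general class of attack, adding a small, nearly invisible perturbation that pushes a detector toward a “real” verdict, can be shown with a toy example. Everything below is a hypothetical stand-in: the “TinyDetector” model, its random weights, and the FGSM-style gradient step illustrate the idea of adversarial evasion in general, not the actual attack or Hive’s actual detector.

```python
# Illustrative sketch only: a generic gradient-based evasion against a
# stand-in detector. Not the attack used against Hive.
import torch
import torch.nn as nn

class TinyDetector(nn.Module):
    """Hypothetical binary classifier: logit > 0 means 'AI-generated'."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 1),
        )

    def forward(self, x):
        return self.net(x)

def fgsm_evade(model, image, eps=0.03):
    """One gradient step that nudges the image toward a 'real' prediction
    while keeping the perturbation visually negligible (|delta| <= eps)."""
    image = image.clone().requires_grad_(True)
    logit = model(image).squeeze()  # attacker wants this logit to drop
    logit.backward()
    perturbed = image - eps * image.grad.sign()
    return perturbed.clamp(0, 1).detach()

if __name__ == "__main__":
    torch.manual_seed(0)
    detector = TinyDetector().eval()
    fake_image = torch.rand(1, 3, 64, 64)  # stand-in for a generated image
    before = detector(fake_image).item()
    after = detector(fgsm_evade(detector, fake_image)).item()
    print(f"detector logit before: {before:.3f}, after perturbation: {after:.3f}")
```

Attacks of this kind are one reason researchers caution that detection tools, however strong, should be treated as one layer of defense rather than a guarantee.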