The DoD has developed its first forensic tools for detecting deepfakes. The new tools use AI to spot deepfakes, which are themselves usually created with AI or machine-learning tools. One key discovery that makes detection easier: most deepfakes are generated from still images of a person's face, and because people are rarely photographed with their eyes closed, the synthesized faces blink abnormally seldom, if at all. This is just one tell, and DARPA will continue to hunt for more through its Media Forensics program; other methods will look for strange head movements, odd eye color, and similar flaws. The arrival of these forensic tools may simply signal the beginning of an AI-powered arms race between video forgers and digital sleuths. A key problem, says digital-forensics researcher Hany Farid, is that machine-learning systems can be trained to outmaneuver forensic tools.
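To make the blink-based tell concrete, here is a minimal sketch of how such a check might work. It assumes a per-frame "eye aspect ratio" (EAR) series has already been extracted from the video by a facial-landmark detector (a common approach, e.g. with dlib or OpenCV); the EAR threshold, frame rate, and blink-rate cutoff below are illustrative assumptions, not values from DARPA's tools.

```python
# Hedged sketch: flag a video whose blink rate is implausibly low.
# ear_series: one eye-aspect-ratio value per frame, as produced by a
# landmark detector (assumed preprocessing, not shown here).

def count_blinks(ear_series, threshold=0.2, min_frames=2):
    """Count blinks: runs of >= min_frames consecutive frames with EAR below threshold."""
    blinks = 0
    run = 0
    for ear in ear_series:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:  # a blink that runs to the end of the clip
        blinks += 1
    return blinks

def looks_like_deepfake(ear_series, fps=30.0, min_blinks_per_min=6.0):
    """Humans at rest blink roughly 15-20 times per minute; far fewer is suspicious."""
    minutes = len(ear_series) / fps / 60.0
    if minutes == 0:
        return False
    rate = count_blinks(ear_series) / minutes
    return rate < min_blinks_per_min
```

A real detector would combine this with the other cues mentioned above (head movement, eye color), since a forger aware of the test can simply add blinks, which is exactly the arms-race dynamic Farid describes.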