Why it matters: There are many consequences associated with the advancement of artificial intelligence, both good and bad. "Deepfakes" – AI-manipulated videos that superimpose one person's face onto another's – seem to fall into the latter category. There's good news now, however: researchers from the State University of New York have trained an AI to detect these fake videos.

There have always been ethical concerns associated with the advancement of AI and machine learning technology, but they've only become more pronounced in recent years.

The use of AI for military purposes is one obvious concern, but as we've seen recently with the rise of "Deepfakes," neural networks can also be put to far subtler ends whose harm is less immediate, but no less real.

The term Deepfake, for the unaware, usually refers to AI-generated fake pornography, in which a well-known actor or actress' face is superimposed onto a porn performer's body.

Communities focused on the creation and distribution of this sort of content have cropped up around the internet. As a result, services like Twitter, Discord, and Pornhub have deemed this pornography "nonconsensual," opting to remove it from their platforms.

Pornography isn't the only kind of video Deepfake tech could spread to, however. With enough time and technological advancement, political speeches could be falsified, and "fake news" could become a genuine threat.

That said, researchers from the State University of New York may have found a way to fight fire with fire. They taught an AI to detect Deepfakes by closely monitoring eye blinking – something Deepfake clips don't reliably replicate, because the models behind them are typically trained on still images (in which the subject's eyes are almost always open) rather than on video.
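
The report doesn't detail the researchers' actual model, but to give a rough sense of the signal involved, here is a minimal, hypothetical sketch using the common eye-aspect-ratio (EAR) heuristic: given per-frame eye landmarks (extracted elsewhere by a facial-landmark detector), it counts blinks and flags clips in which the subject blinks implausibly rarely. The function names, the EAR threshold, and the blinks-per-minute cutoff are illustrative assumptions, not values from the research.

```python
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """Eye aspect ratio (EAR) from six (x, y) landmarks around one eye.

    EAR drops sharply when the eyelid closes, so dips in the per-frame
    EAR signal over time correspond to blinks.
    """
    # vertical distances between upper and lower eyelid landmarks
    v1 = np.linalg.norm(eye[1] - eye[5])
    v2 = np.linalg.norm(eye[2] - eye[4])
    # horizontal distance between the eye corners
    h = np.linalg.norm(eye[0] - eye[3])
    return (v1 + v2) / (2.0 * h)

def looks_synthetic(ear_series, fps=30.0, ear_threshold=0.2,
                    min_blinks_per_minute=2.0):
    """Flag a clip whose subject blinks implausibly rarely.

    `ear_series` is the per-frame EAR for one eye; a blink is counted
    each time the EAR crosses below `ear_threshold`. Thresholds here
    are illustrative, not tuned values.
    """
    ears = np.asarray(ear_series, dtype=float)
    closed = ears < ear_threshold
    # count falling edges: frames where the eye goes from open to closed
    blinks = int(np.count_nonzero(closed[1:] & ~closed[:-1]))
    minutes = len(ears) / fps / 60.0
    return blinks / max(minutes, 1e-9) < min_blinks_per_minute
```

A real detector would work on raw video frames and learn far richer temporal cues than a single hand-set threshold, but the intuition is the same: people blink regularly, and face-swapped footage generated from still photos often doesn't.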

This is certainly a positive development, but it remains to be seen whether researchers will be able to keep pace with the rapid advancement of Deepfake technology.

Image courtesy IndieWire