Center for Strategic Assessment and Forecasts

Autonomous non-profit organization

Deepfakes called the biggest threat from the use of artificial intelligence
Publication date: 11-08-2020
Experts have recognized deepfakes as the most dangerous application of artificial intelligence in terms of the crimes of varying severity it could enable. The researchers identified the 20 most harmful ways AI could be used over the next 15 years.

University College London ranked AI applications by degree of potential harm, benefit to criminals, ease of implementation, and difficulty of prevention, according to the institution's website. Deepfakes are hard to detect and hard to stop from spreading, and they can serve goals ranging from publicly discrediting a person to stealing money through deception.

Fake content can undermine people's trust in audio and visual evidence, which is in itself detrimental to society. Other criminal uses of AI on the list include driverless vehicles used as weapons, spear phishing, AI-generated fake news, gathering information for blackmail, and disruption of AI-controlled systems.

The researchers compiled a list of 20 AI-enabled crimes drawn from scientific articles, news reports, popular culture, and literature. They then assembled an expert group of 31 people with experience in AI, including scientists, entrepreneurs, police officers, government officials, and employees of state security bodies.

"People now spend large parts of their lives online, and their online activity can make and break reputations," says lead author Dr. Matthew Caldwell (UCL Computer Science). "Unlike many traditional crimes, crimes in the digital realm can be easily shared, repeated, and even sold, allowing criminal techniques to be marketed as a service."

In September, a scammer called the CEO of a UK energy company, imitating the voice of the head of another company with a deepfake, and demanded the transfer of $243 thousand to a third party. The fraudster reproduced the other person's intonation and accent.
