Center for Strategic Assessment and Forecasts

Autonomous non-profit organization

The militarization of information
Publication date: 19-07-2017
Dmitry Kiselev, Director General of the Russian government-controlled media agency Rossiya Segodnya ("Russia Today"), has said: "Objectivity is a myth which is proposed and imposed on us." Today, thanks to the Internet and social media, the manipulation of our perception of the world happens at previously unimaginable speed and scale, across time, space and borders. This is the source of one of the greatest vulnerabilities that we, as individual citizens and as a society, must learn to cope with. Many actors exploit these vulnerabilities today. The situation is complicated by the ever-faster development of technologies for producing and disseminating information. For example, over the past year we have seen dominance shift from text and images to recorded video, which is now itself being displaced by live video. Vulnerabilities evolve with the technology, and the falling cost of that technology enables a larger number of actors to participate.

A COMMON THREAT.

"Information operations, also known as influence operations, include the collection of tactical information about an adversary and the dissemination of propaganda in pursuit of a competitive advantage over an opponent." This definition applies in both peacetime and wartime contexts. Traditional tools such as the press, radio, television and film are now supplemented by the Internet and social media.

These technologies have created a qualitatively new environment for influence operations, persuasion and, more generally, mass manipulation. First, the ability to influence is now effectively democratized, since any individual or group can communicate with and influence large numbers of others online. Second, this environment is now far more quantifiable: data can be used to measure the responses of individuals and crowds to attempts to influence them. Finally, influence operates much faster: users can be affected by information provided by anonymous actors, or even by the design of the interfaces they use. The Internet and social networks give actors, the masses and the media new ways of constructing reality, challenging the traditional functions of the press.

Interaction in the information environment is evolving rapidly; old patterns become irrelevant faster than we can develop new ones. The result is uncertainty that leaves us exposed to dangerous conditions without adequate protection.

The information environment can be broadly described along two dimensions: technical and psychosocial. Today, security of the information environment (often referred to as cybersecurity) is concerned mainly with purely technical aspects: protection against denial-of-service attacks, botnets, massive theft of intellectual property and other attacks that typically exploit software vulnerabilities. This approach overlooks other aspects. Little attention has been paid to protection against incidents such as the hacking of the Associated Press Twitter account in April 2013, in which a group of hackers gained access to the news agency's account and published the message "Two explosions in the White House, Barack Obama injured." The message, backed by the authority of the Associated Press, caused a stock market decline and recovery of roughly $136 billion within a few minutes. The attack exploited both technical (account takeover) and psychosocial (understanding of market reactions) features of the information environment.

Another attack, exploiting purely psychosocial characteristics, occurred in India in September 2013. The incident began when a young Hindu girl complained to her family that she had been insulted by a Muslim boy. According to reports, her brother and a cousin killed him, which triggered clashes between the Hindu and Muslim communities. To incite further violence, someone posted a gruesome video showing two men, identified as Hindus, being beaten to death. Rumors spread through social networks that the mob had killed the girl's brother and cousin. It took 13,000 Indian soldiers to stop the violence. It later emerged that the claims made in the video were false; the video had not even been shot in India. The attack required no special skills, only a correct understanding of when and where to place the video to achieve the desired effect.

These last two actions are examples of cognitive hacking. The key to their success was the unprecedented speed and reach with which the misinformation spread. Another key element was a correct assessment of the cognitive vulnerabilities of the intended audience: the premise that the audience was already predisposed to accept the message because of existing fears or anxieties.

Another particularly instructive incident occurred during Operation Valhalla in Iraq in March 2006. A US special forces battalion engaged a Jaish al-Mahdi death squad, killing 16 or 17 fighters, capturing 17 and freeing a badly beaten hostage.

In the time it took the soldiers to return to their base, less than one hour, Jaish al-Mahdi fighters returned to the scene and rearranged the bodies of their fallen comrades to make it look as if they had been killed while praying. They then published photographs and press releases in Arabic and English alleging atrocities committed by American soldiers.

It took almost three days before the US military attempted to tell its version of events to the media. The army was forced to launch an investigation, which lasted 30 days, during which the battalion remained idle.

The Jaish al-Mahdi operation is a striking example of how social media and the Internet can defeat physical strength. The incident was one of the first clear demonstrations that adversaries can openly monitor, in real time and from thousands of kilometres away, how American audiences react to social media messages, and fine-tune their actions accordingly. The media and the Internet give adversaries virtually unlimited global access to their intended audiences, while the US government is paralyzed by legal and political constraints.

THE RUSSIAN THREAT.

In February 2017, Russian Defense Minister Sergei Shoigu openly announced the formation of information troops within the Russian army: "Information operations forces have been created, which are expected to be far more effective than anything we have previously used for counter-propaganda purposes." The current chief of the General Staff, General Valery Gerasimov, has said that modern war is conducted roughly 80 percent by non-military means. In Russia's view, these non-military measures of war include economic sanctions, severance of diplomatic relations, and political and diplomatic pressure. Russia regards information operations (IO) as an important component of these non-military measures.

The Russian view of IO does not match that of the West. For example, the Glossary of Key Information Security Terms, produced by the Military Academy of the General Staff of the Russian Federation, contrasts the fundamental Russian and Western concepts of IO, explaining that for Russians IO is a continuous activity carried out regardless of the state of relations with any government, while the West sees IO as a limited, tactical activity appropriate only in wartime. In other words, Russia considers itself to be in a constant state of information warfare, while the West does not.

Government propaganda and disinformation have existed for as long as the state itself. The key difference in the twenty-first century lies in the simplicity, effectiveness and low cost of such operations. As audiences around the world come to rely on the Internet and social networks as their main sources of news and information, these have become a popular vector of information attack. Russian IO techniques, tactics and procedures evolve constantly and quickly, because they are very cheap and, over the long term, more effective than other types of weapons.

At present, Russian operators use relatively unsophisticated methods systematically and at large scale. This relative lack of sophistication makes them detectable: existing technology can, for example, identify paid troll and bot operations. Another key element of Russian IO strategy is to focus on groups with mutual grievances in order to create an atmosphere of distrust toward EU and NATO governments; these efforts, too, can be detected. The apparent lack of technical sophistication in current Russian IO methods may be explained by the fact that, until now, Russian IO has met minimal resistance. If and when the targets begin to resist these efforts and/or expose them at scale, Russia is likely to accelerate the improvement of its methods, leading to a cycle of measures and countermeasures. In other words, an arms race in the information space is likely.
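As one illustration of why unsophisticated, large-scale operations are detectable, the sketch below flags a single crude coordination signal: several accounts posting identical text within a short time window. The function name, thresholds and sample posts are all invented for illustration; real detection systems combine many such signals.

```python
from collections import defaultdict

def flag_coordinated(posts, window_secs=300, min_accounts=3):
    """posts: list of (timestamp, account, text) tuples.
    Returns the set of texts posted by >= min_accounts distinct
    accounts inside any window of window_secs seconds."""
    by_text = defaultdict(list)
    for ts, account, text in posts:
        by_text[text].append((ts, account))
    flagged = set()
    for text, hits in by_text.items():
        hits.sort()  # order by timestamp
        for i in range(len(hits)):
            # distinct accounts posting this text within the window
            burst = {acc for ts, acc in hits
                     if 0 <= ts - hits[i][0] <= window_secs}
            if len(burst) >= min_accounts:
                flagged.add(text)
                break
    return flagged

posts = [
    (0,   "acct_a", "NATO is collapsing"),
    (60,  "acct_b", "NATO is collapsing"),
    (120, "acct_c", "NATO is collapsing"),
    (50,  "acct_d", "nice weather today"),
]
print(flag_coordinated(posts))
# expected: {'NATO is collapsing'}
```

The same burst-counting idea extends to near-duplicate texts, shared links, or synchronized posting times; the point is only that cheap, repetitive operations leave statistically visible traces.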

STRATEGY TO COUNTER THE RUSSIAN THREAT.

Because each country's culture and history are unique, any successful communications strategy must be adapted to local conditions.

An information security strategy to counter Russian offensive IO should be a "whole-of-nation strategy": a coordinated effort among NGOs, the military, the intelligence community, industry, the media, research organizations, academia and organized citizen groups.

As in the physical world, good maps are critical to any strategy. In the case of IO, the maps show flows of information. An information map should display relationships within the information environment and help navigate it. Such maps exist as software and databases.

Information mapping for IO is the art of creating, maintaining and using such maps. An important feature of information maps is that they change constantly to reflect the dynamic nature of the information environment. Because they are computer programs built on artificial intelligence techniques, they can answer questions, summarize the situation, and help plan, monitor and modify operations. Information maps are technically feasible today, and they already exist in forms that can be adapted to support the development and implementation of an IO strategy.
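A minimal sketch of such an information map, assuming a simple directed-graph representation, might look like the following. All node names are hypothetical examples, not real data, and a deployed system would of course be far richer (typed edges, timestamps, confidence scores).

```python
from collections import defaultdict, deque

class InfoMap:
    """Toy "information map": a directed graph whose edges represent
    observed information flows (source -> amplifier -> audience)."""

    def __init__(self):
        self.edges = defaultdict(set)

    def add_flow(self, source, target):
        """Record that content has been observed flowing source -> target."""
        self.edges[source].add(target)

    def reachable_audiences(self, source):
        """Return every node a message from `source` can reach (BFS)."""
        seen, queue = set(), deque([source])
        while queue:
            node = queue.popleft()
            for nxt in self.edges[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

m = InfoMap()
m.add_flow("state_tv", "troll_network")
m.add_flow("troll_network", "fringe_forum")
m.add_flow("fringe_forum", "mainstream_social")
print(sorted(m.reachable_audiences("state_tv")))
# expected: ['fringe_forum', 'mainstream_social', 'troll_network']
```

Even this toy version supports the kind of question the text describes: given a known source, which audiences can it ultimately reach, and through which intermediaries.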

For example, most member states of the North Atlantic Treaty Organization (NATO), as well as multiple non-NATO partners, have already been targeted by numerous Russian information operations, which illustrate current Russian IO methods. Using information mapping, one can map the key Russian sources involved in Russian information operations. These sources may include:

  • Russian analytical centers (think tanks)
  • foundations (for example, "Russkiy Mir" / "Russian World")
  • government agencies (e.g., Rossotrudnichestvo)
  • TV channels (e.g., RT)
  • pseudo-news agencies and multimedia services (e.g., Sputnik)
  • cross-border social and religious groups
  • social media and Internet trolls attacking democratic values
  • organizations controlled by the Russian Federation
  • Russia-linked political parties and other organizations within the EU that undermine political cohesion
  • Russian propaganda aimed directly at journalists, politicians and individuals in particular target countries and in the EU as a whole.

Similarly, mapping the targets of Russian information operations may include:

  • national government organizations
  • the military
  • intelligence services
  • industry
  • the media
  • independent analytical centers
  • academia
  • organized citizen groups.

An effective defensive information strategy will be based on coordinated countering of the information flows identified on the information maps. It will include building trust between the defending institutions and the public. It will also include mechanisms for identifying the ever-changing nature of the Russian IO threat and rapidly adapting all elements of the defense.

Christopher Paul and Miriam Matthews of the RAND Corporation note: "Experimental research in psychology suggests that the modern Russian propaganda model can be very effective." They present a thorough and succinct analysis of the relevant psychological research.

For example, they describe which features of propaganda can be exploited to distort the perception of reality:

  • People are poor judges of whether information is true or false, and they do not necessarily remember that a specific piece of information was false.
  • Information overload pushes people to take shortcuts when judging the authenticity of messages.
  • Familiar topics or messages can be persuasive even if they are false.
  • Statements are more likely to be accepted if they are backed by evidence, even if that evidence is false.
  • Peripheral cues, such as an appearance of objectivity, can increase the credibility of propaganda.

Here is what a typical offensive strategy directed against a target population might look like. It consists of several stages:

  1. Divide the population into communities based on any number of criteria (for example, hobbies, interests, politics, needs, concerns, etc.).
  2. Determine who in each community is most susceptible to particular types of messages.
  3. Determine the social dynamics of communication and the flow of ideas within each community.
  4. Determine which archetypes dominate the conversation in each community.
  5. Use all of the above to develop and launch an ideological campaign.
  6. Monitor and interact continuously to gauge the success of your efforts and make adjustments in real time.

Technologies already exist that allow each of these steps to be performed continuously and at large scale.
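To illustrate how readily such targeting can be automated, the fragment below sketches only the first two stages: segmenting a population by declared interests and ranking each segment's members by a crude "susceptibility" proxy. The profile data, field names and scoring are all invented; this is a defensive illustration of the mechanics, not an actual system.

```python
from collections import defaultdict

def segment(profiles):
    """profiles: {user: {"interests": set, "reshare_rate": float}}.
    Groups users into interest communities and orders each community
    with the highest reshare_rate (a toy susceptibility proxy) first."""
    communities = defaultdict(list)
    for user, p in profiles.items():
        for interest in p["interests"]:
            communities[interest].append((p["reshare_rate"], user))
    # within each community, most susceptible users first
    return {topic: [u for _, u in sorted(members, reverse=True)]
            for topic, members in communities.items()}

profiles = {
    "u1": {"interests": {"politics"},           "reshare_rate": 0.9},
    "u2": {"interests": {"politics", "sports"}, "reshare_rate": 0.2},
    "u3": {"interests": {"sports"},             "reshare_rate": 0.5},
}
print(segment(profiles)["politics"])
# expected: ['u1', 'u2']
```

Stages 3-6 would layer network analysis, archetype classification and continuous A/B-style monitoring on top of the same data, all of which are routine in commercial marketing platforms today.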

PROSPECTS.

Cognitive security (COGSEC) is an emerging field that focuses on this expanding frontier. It anticipates that researchers, governments, social platforms and private actors will engage in an ongoing "arms race" to influence, and to protect from influence, large groups of users online. Although cognitive security grows out of social engineering and the study of manipulation in computer security, it differs in several important respects.

First, while computer security focuses on the deception of individuals or small numbers of people, COGSEC focuses on the exploitation of cognitive biases in large social groups.

Second, while computer security focuses on deception as a means of compromising computer systems, COGSEC focuses on social influence itself.

A Center for Cognitive Security is needed to create and apply the tools required to discover and maintain fundamental models of our ever-changing information environment, and to protect us in that environment both as individuals and collectively. The center would bring together experts working in fields such as cognitive science, computer science, engineering, social science, security, marketing, political propaganda and psychology to develop a theoretical and applied engineering methodology for managing the full spectrum of information-environment security.

The center should be non-profit and housed in a non-governmental organization with an international reputation and close ties to government, industry, academia, think tanks and community groups around the world. It should perform the following functions:

  1. Bring together experts to develop policies, strategies and approaches to cognitive security.
  2. Set clear and practical technological goals in support of those strategies.
  3. Identify and assess relevant commercial technologies.
  4. Identify and assess relevant research results, and develop and implement strategies for transitioning them into practical use.
  5. Work with end users from all communities to develop techniques, tactics and procedures for applying the technologies identified and developed in support of the policies and strategies.
  6. Create a research program to support the development of policy and strategy and the implementation and maintenance of technology.
  7. Develop training and educational materials, and hold seminars and conferences.
  8. Support a group that coordinates with all communities to identify influence campaigns and disseminate warnings and alerts.

The center should be fully funded for its first five years by the US government, until it is able to establish additional sources of funding. It should also have the authority and funding to issue grants and contracts. Beyond a core staff employed directly by the center, many participants would work remotely.

Rand Waltzman

