CHAPTER
2
What Is Deception?
 
Deception is an old tactic, used for millennia by forces around the world. In this chapter, you will learn about some of the traditional techniques and methods commonly used by military and corporate organizations to counter threats and adversaries. This chapter shows how deception can be used as a tool to lure or push your threats into areas of your enterprise that you have prepared for proactive responses or countermeasures. It makes heavy use of military deception techniques, concepts, and vernacular, as most forms of formal deception were derived from military operations and constructs over the years.
As you read through this chapter, you will see how deception has been used traditionally, and how the basic concepts and best practices can easily be applied to the cyber realm of advanced, organized, and persistent threats across your enterprise.
How Does Deception Fit in Countering Cyber Threats?
From the moment I picked your book up until I laid it down, I was convulsed with laughter. Someday I intend reading it.
—Groucho Marx
There is nothing more deceptive than an obvious fact.
—Arthur Conan Doyle
Simply put, deception is a technique whereby we mislead people into believing information that prompts them to behave in a way that is favorable to us, while at the same time protecting our true intentions and posture. Truth can be lies as easily as lies can be truth.
Deceiving people and computers requires interaction with the sensory components. Sensory components can be considered any avenue by which information can be detected or received. In humans, this typically includes the auditory, visual, olfactory, and electronic avenues. Other factors that should be taken into consideration include reason, consciousness, skill level, experience, and free choice. All of these avenues can be exploited when it comes to evading detection by a human analyst or an autonomous security system.
Resources (such as time, equipment, devices, personnel, and material) are always a consideration in crafting deception, as is the need to selectively hide the real and portray false information. Traditional military deception includes operational (manual/physical) techniques known as feints, demonstrations, ruses, displays, simulations, disguises, and portrayals.
Six Principles of Deception
Military Deception (MILDEC) is one of the foundations of Information Operations (aka Information Warfare). Six primary principles make up what we know as MILDEC today (from Joint Publication 3-13.4, Military Deception, “Executive Summary”):
Focus   The deception must target the adversary decision maker capable of taking the desired action(s).
Objective   The deception must cause an adversary to take (or not to take) specific actions, not just to believe certain things.
Centralized planning   MILDEC operations should be centrally planned and directed in order to achieve unity of effort.
Security   Friendly forces must deny knowledge of a force’s intent to deceive and the execution of that intent to adversaries.
Timeliness   A deception operation requires careful timing and action.
Integration   Fully integrate each military deception with the operation that it is supporting.
Let’s take a closer look at each of these principles.
Focus
It is all about the adversary decision maker. In a deception, the person who makes the decisions, allocates resources, and approves strategic plans is the person the deception should be tailored to affect. Targeting anyone else falls short, because the ultimate purpose of the deception is to have the adversary allocate, waste, or improperly spend resources in a way more favorable to your efforts.
Focus can be used against an individual or group. When the focus is on the individual, it’s tailored to deceive that individual. When the focus is on a group (such as organized crime or a foreign government), it’s about the leadership of the group that is infiltrating your network—the frontline attackers report all of their findings up through a chain of command, and someone in that chain makes decisions based on the intelligence collected.
The importance of focus is directing the decision maker into making the wrong decisions or decisions of your design.
Objective
The goal is to get the adversaries to act or not act; you don’t want to just tell them a nice story.
Perhaps we project a story that there is an unopened bottle of high-end Johnnie Walker sitting on the bar and that it is available for free—first come, first served. Say that we know the adversary decision maker is a connoisseur of Scotch whisky and loves high-end products. That is a nice story, but is it enough to get the adversary to go to the bar himself?
When relating the principle of objective to the cyber world, think about developing a project or system that may be of interest to a threat. You need to design a deception operation that will interest your adversaries and lead them to act and fall into your deception.
Centralized Planning and Control
Each deception should be coordinated and synchronized with all other deception plans to present a seamless story across the organization. Overlooking the smallest detail could prove fatal.
Perceptual consistency is one of the most important goals of deception, especially when dealing with a highly skilled and motivated threat. Consistency can be built into many areas, from personnel and logistics to financial and technical resources and assets.
The adversaries must see a seamless story that is compelling enough based on all the intelligence they have collected from your enterprise. Essentially, you want to make the threat feel comfortable enough to take an action against your deception. The slightest overlooked detail can ruin an entire deception operation. For example, if John is listed as a member of a team that is associated with the deception, and John is transferred to another location but is still listed in the deception as being at his original location, the adversary will more than likely come across the discrepancy and not act, which defeats all of the effort and resources you put into building the deception.
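To make that kind of consistency check concrete, here is a minimal sketch, not drawn from the original text, of an automated cross-check: it compares the personas referenced in decoy content against the current personnel roster and flags discrepancies such as the transferred “John” above. The file names and fields are hypothetical.

```
import json

def find_inconsistencies(decoy_story_path, roster_path):
    """Compare personas referenced in a decoy story against the real roster.

    Both files are assumed to be JSON: the decoy story maps persona names to
    the location the deception claims they work from; the roster maps employee
    names to their actual current location.
    """
    with open(decoy_story_path) as f:
        decoy_personas = json.load(f)   # e.g., {"John Doe": "Atlanta lab"}
    with open(roster_path) as f:
        roster = json.load(f)           # e.g., {"John Doe": "Denver office"}

    issues = []
    for name, claimed_location in decoy_personas.items():
        if name not in roster:
            issues.append(f"{name}: referenced in deception but no longer on the roster")
        elif roster[name] != claimed_location:
            issues.append(
                f"{name}: deception says '{claimed_location}', roster says '{roster[name]}'"
            )
    return issues

if __name__ == "__main__":
    for issue in find_inconsistencies("decoy_story.json", "personnel_roster.json"):
        print("WARNING:", issue)
```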
Security
One seller of high-end Johnnie Walker will not tell you that next door there is a sale on the exact same product. In the same vein, why would you want to deliberately give away information regarding the true deception story? The mere fact that there is a deception should foster a heightened level of security.
Operations security (OPSEC) of your deception is critical to ensuring success and the ability to continue toward your end goals. Securing your deception is of the utmost importance. A slight error or oversight can breach the security of your deception against a threat.
Timeliness
If we cannot get across the message to our adversary that the high-end Johnnie Walker is all teed up and ready to go, which would prompt him to take action, our efforts are lost in the blink of an eye. Having him show up the following day or week may do us no good at all, and may actually be detrimental to ongoing efforts.
Always have the right message delivered at the right time to the right person through the right conduit. When building a deception plan, you need to ensure each portion of the initiative is released on time and is consistent with every other component or operation of your organization.
For example, if you allow a threat to steal information surrounding a deception regarding a new system that is being built, and specific details on the network location of this system are embedded in the content of the stolen data, the threat may move quickly and act on the latest intelligence. So if your bait systems are not set up properly (or not at all), you have failed in your objectives.
The following shows an example of a simple format planners can use to track and schedule the actions required to make the deception successful.
[Figure: example format for tracking and scheduling deception events, from Joint Publication 3-13.4, Military Deception]
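The publication’s actual worksheet is not reproduced here. Purely as an illustration, a planner could track the same kind of information in a simple structure such as the following sketch; the field names, events, and dates are invented.

```
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DeceptionEvent:
    """One row of a simple deception event schedule (illustrative fields only)."""
    event: str            # what is released (e.g., staged e-mail, decoy system goes live)
    conduit: str          # how it reaches the adversary
    target: str           # the decision maker the event is meant to influence
    release_time: datetime
    executed: bool = False

schedule = [
    DeceptionEvent("CEO announcement of new initiative", "organization-wide e-mail",
                   "adversary analysts monitoring mail", datetime(2024, 6, 1, 9, 0)),
    DeceptionEvent("Decoy project server goes live", "network presence",
                   "adversary operators scanning the enterprise", datetime(2024, 6, 3, 8, 0)),
]

def events_due(schedule, now):
    """Return events whose release time has arrived but that are not yet executed."""
    return [e for e in schedule if not e.executed and e.release_time <= now]

for event in events_due(schedule, datetime(2024, 6, 2, 12, 0)):
    print(f"DUE: {event.event} via {event.conduit} (target: {event.target})")
```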
Integration
No effort should ever be a stand-alone effort; it should be fully synchronized and integrated into ongoing factual efforts, as well as other deceptive efforts.
When building a deception plan, you need to factor in production or operational resources that will need to be leveraged to ensure perceptual consistency to the threat. If you stage bait systems that simply sit in a corner of your network, without any context or activity, they will serve no purpose. Threats may understand you are watching and waiting, and could alter their patterns or behavior.
In order for deception to truly be effective, you should incorporate a coordinated effort and leave trails and traces of the deception across your organization and even partner or subcontractor networks. This builds on perceptual consistency to an attacker. A good example would be your chief executive officer (CEO) sending out an organization-wide e-mail stating that a new initiative is being kicked off and announcing the program and specific individuals who will be working on this project. This would require your finance, administration, operations, and human resources departments to incorporate portions of data within their groups in order to provide the perceptual consistency to the threat. If threats do not feel comfortable, they will not act.
Traditional Deception
This section presents several historical uses of deception as they apply to military combat or operations, as well as how each applies to the cyber realm. One of the most important things to realize is that these tactics do apply to cyber attacks. As the defending force of your enterprise, you should do the most you can to minimize the risk to your enterprise by leveraging these examples and applying them to your organization’s security policy.
Feints—Cowpens
For those of us who were around in 1781, there is a lesson here about feints that could involve a shot of scotch to prep the objective. General Nathanael Greene, commander of the Southern Department of the Continental Army, appointed Brigadier General (BG) Daniel Morgan to take up position near Catawba, South Carolina, between the Broad and Pacolet Rivers. BG Morgan knew his adversary, Lieutenant Colonel (LTC) Tarleton, well. Additionally, he knew the readiness level of his regulars and militia, as well as his adversary’s perceptions of his force’s condition. In a previous engagement, the militia had barely stood their ground in the face of the hardened British regulars, and BG Morgan knew he could use this perception to draw LTC Tarleton and the British Legion into a trap. BG Morgan placed his troops in three rows: the first was the sharpshooters, the second was the militia, and the final row was the Continental Army regulars.
The plan was to feint and lure LTC Tarleton in through a series of controlled engagements. Two bouts of engagement and withdrawal would deceive the British Legion members into believing they were in control of the battlefield. Sadly for LTC Tarleton, there was no recovery from the rout, and the Continental Army won a great victory at the pivotal Battle of Cowpens.
Applying to Cyber
When it comes to boundary, perimeter, and internal systems and applications, you can feint almost anywhere. You can control the terrain better than your threat, as you should always have control of your physical systems (even if they are located at a remote site). A feint in the cyber realm would be placing weak systems at various high-value target nodes or points of entry into your enterprise. This could provide threats with a false sense of confidence and security, which could lead to mistakes on their end.
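As a minimal sketch of such a cyber feint (the port, banner, and log file are illustrative assumptions, not a hardened honeypot design), the following listener advertises a weak-looking legacy service and records every connection attempt so defenders can see who is probing it.

```
import socket
from datetime import datetime

DECOY_PORT = 2323            # hypothetical "weak" telnet-like decoy port
BANNER = b"Welcome to legacy-mgmt-01 (firmware 1.0.2)\r\nlogin: "

def run_decoy(port=DECOY_PORT):
    """Accept connections, present a weak-looking banner, and log all activity."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("0.0.0.0", port))
    server.listen(5)
    print(f"Decoy listening on port {port}")
    while True:
        conn, addr = server.accept()
        timestamp = datetime.utcnow().isoformat()
        conn.sendall(BANNER)
        conn.settimeout(5)
        try:
            data = conn.recv(1024)   # capture whatever the intruder types or sends
        except socket.timeout:
            data = b""
        finally:
            conn.close()
        with open("decoy_contacts.log", "a") as log:
            log.write(f"{timestamp} {addr[0]}:{addr[1]} sent {data!r}\n")

if __name__ == "__main__":
    run_decoy()
```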
Demonstrations—Dorchester Heights
Now let’s take a look at the Siege of Boston in 1775 and how a demonstration secured yet another victory for the colonials.
In April 1775, militiamen surrounded the British garrison in Boston and kept the soldiers at bay for 11 months. It was the opening stage of the Revolutionary War, and General Washington knew the artillery recently captured by Ethan Allen and Benedict Arnold at Fort Ticonderoga would be just what he needed if it was placed in the right location: Dorchester Heights. This area was recognized by military commanders as key terrain because of its commanding position over Boston. From there, the captured artillery could reach not only the city, but as far as the harbor. Strategically located, this position could not be reached and was never threatened by British artillery.
General Washington sent a young Colonel Henry Knox to Ticonderoga to collect all the captured British artillery and bring it back to Boston. In March 1776, Colonel Knox returned and placed the artillery in plain view of the British forces. The Continentals worked throughout the night, and in the morning the British forces had quite a surprise. Just the presence of the guns led the British commander, General Howe, to take action. In a short time, General Howe evacuated his forces, never to return to Boston.
General Washington didn’t need to do any more than show the guns in a demonstration to gain the tactical advantage. He actually didn’t have the munitions to fire most of the guns—a minor technicality, one would say.
Applying to Cyber
Demonstrations of superiority are common, as governments around the world show off their latest and greatest weapons and technologies. Every year, reports are leaked of some intrusion that could be state-sponsored or organized by a foreign government. The leak itself signals that the events are known and being monitored. Whether or not action is taken against the threat operating abroad (and it seldom is), that public signal is a demonstration of force to protect assets.
Ruses—Operation Mincemeat (the Unlikely Story of Glyndwr Michael)
Glyndwr Michael was a vagabond and a Welsh laborer who perished after ingesting rat poison in a London warehouse. His life was quite unremarkable until after his death. That was when he assumed the persona of a soldier named “Major Martin.” The remaking of Glyndwr Michael was to become part of one of the most elaborate ruses in military history.
It was 1943, and the Allies were looking to make their way into Sicily, but with strong German and Italian fortifications and troop density, the task seemed a bit overwhelming. British Naval Intelligence was up to the task of creating a deception to help the troops in the field. They were favored with the insight of a brilliant staff officer, Ian Fleming, who devised the plan after remembering an old spy novel he had read years before.
Major Martin was equipped with false plans and papers, and left with a life preserver off the shore of Spain. As hoped by the Allies, Major Martin and his papers made their way to the German high command. After verifying the plans were authentic, the Germans were certain that the Allies were going to assault through Greece and not Sicily, as previously thought. Hitler moved a division out of Italy, and the Allies attacked with little to no resistance.
Applying to Cyber
In your deception, if you proclaim to your personnel across your organization that you are moving critical systems to a specific location, this could be considered a ruse to lure the threat to that location. This type of deception can also be seen in content staging, which is the generation of false information, documentation, or actual operational information that has been modified and duplicated across your enterprise. This makes it difficult for the ones who are stealing your information to identify which data source has the actual information they are seeking.
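A hedged sketch of content staging might look like the following: it writes a plausible decoy document to several staging locations, embedding a unique marker in each copy so that, if a copy later surfaces, you can tell which data source was taken. The paths, file name, and marker scheme are assumptions for illustration only.

```
import os
import uuid

TEMPLATE = """PROJECT BRIEF - Initiative Redwood (DRAFT)
Target platform build will be hosted at the location described in section 4.
Reference ID: {marker}
"""

STAGING_DIRS = ["/srv/shares/engineering", "/srv/shares/finance", "/srv/shares/contracts"]

def stage_decoy_documents(template=TEMPLATE, staging_dirs=STAGING_DIRS):
    """Write one uniquely marked copy of the decoy document to each staging location.

    Returns a mapping of marker -> path so that a recovered copy can be traced
    back to the exact location it was stolen from.
    """
    marker_map = {}
    for directory in staging_dirs:
        os.makedirs(directory, exist_ok=True)
        marker = uuid.uuid4().hex
        path = os.path.join(directory, "initiative_redwood_brief.txt")
        with open(path, "w") as f:
            f.write(template.format(marker=marker))
        marker_map[marker] = path
    return marker_map

if __name__ == "__main__":
    for marker, path in stage_decoy_documents().items():
        print(f"{marker} -> {path}")
```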
Displays—A Big Hack Attack
As the dawn breaks across the western Atlantic Ocean, two F-25 Virtual Attack Fighters (VAF) take off from Seymour Johnson Air Force Base. Captains Bjork Williams and Robert Oehlke find themselves flying another standard combat air patrol along the East Coast. Recently, terrorist groups have acquired late 20th century U.S. Navy Aegis cruisers and have been conducting raids upon the new Border states of the Virgin Islands and Bermuda. As the two aircraft make their way out to sea, clouds begin to roll in and the ocean surface is quickly obscured. Approximately twenty minutes into the mission, a surface vessel is picked up on the multi-spectral imaging and sensor system aboard the F-25s. Even though the target is identified as a fifty-foot catamaran, the two pilots decide to buzz by and take a look. As they break through the clouds the pilots realize something is drastically wrong. Three Aegis cruisers appear before their eyes, while their computers still show only a small watercraft. Hackers aboard the cruisers tapped into the F-25 imaging system and altered the information processed within the systems. Before the pilots can react to the trap, their aircraft are shot by a short-range electro-magnetic pulse weapon, and fall powerless into the sea. Captains Williams and Oehlke have just become victims of Virtual Deception.
—Lt York W. Pasanen, “The Implications of Virtual Deception,” Air & Space Power Chronicles (April 1999)
In the everyday walk of life, deception is one of those taboo things that people usually frown upon. People do not like to be deceived and usually resent the deceiver. “Honesty is the best policy,” so we are told in our youthful years by parents and role models alike. What is it like when we are the ones who deceive? How do we feel about it when we deceive? Is turnabout truly fair play? Do we justify it by saying, “He doesn’t need to know” or “She will find out about it later”? Perhaps we withhold important information in a passive deceptive manner for a number of reasons, but is it okay when we do it to others after they do it to us (with the same justification that it is not right)? Don’t we all do it from time to time?
Applying to Cyber
What your threat knows about you is one of the most important things your organization must protect. That is why most security teams talk about OPSEC, which is the active security of your organization’s operations and information. What you display to your threat is critical, especially when it comes to deception planning. You do not want your threats to identify your deception, or your ability to implement a deception that could hinder their objectives.
Another example is the exploitation of the output of your threats’ tools. If you can display incorrect information to their reconnaissance tools, network scanners, listeners, and so on, you could lead them in the direction of your choosing.
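For instance, a decoy web front end can answer banner-grabbing scanners with a false identity. The sketch below (the advertised product and version strings are invented) uses Python’s built-in HTTP server and overrides the attributes that populate the Server header, while logging every probe.

```
from http.server import BaseHTTPRequestHandler, HTTPServer

class MisleadingHandler(BaseHTTPRequestHandler):
    # These class attributes control the Server: header that scanners grab.
    server_version = "Apache/1.3.27"   # deliberately false, weak-looking identity
    sys_version = "(Unix)"

    def do_GET(self):
        body = b"<html><body>It works!</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        # Record every probe; reconnaissance against the decoy is intelligence for us.
        with open("scanner_probes.log", "a") as log:
            log.write(f"{self.client_address[0]} {fmt % args}\n")

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), MisleadingHandler).serve_forever()
```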
Deception is a powerful tool, especially when you enter the cyber world, where deception is much easier to pull off because, generally, your threats are entering your network from a remote location and without physical access to your organization.
How many used car salesmen get a bad rap because they intentionally misrepresent the bottom line? But wait, aren’t they just attempting to offer a starting point for negotiations? Should they be vilified for proven marketing techniques, and aren’t deceptions just advertising on steroids? A good used car salesman will be a student of marketing and human nature.
What is the difference between unethical and ethical advertising? Unethical advertising uses falsehoods to deceive the public; ethical advertising uses truth to deceive the public.
—Vilhjalmur Stefansson
Mankind was my business.
—Ghost of Jacob Marley in A Christmas Carol
The art of persuasion is very important when dealing with an organized threat, as you must go out of your way to misrepresent your security posture in a way that will ensure your threat takes the bait. You must not only understand people, but also empathize with their situation and anticipate their next thoughts. Such insight is widely sought, yet it eludes all but the most astute.
You should take into consideration where your threat currently is within your enterprise and understand which systems might be targeted next. Being very familiar with standard operating procedures for a precision attack and exploitation methodology is important. You want to understand what your threats are after and know how they might move through your network.
Knowing an attacker’s methodology, as well as your own response methodology, may contribute to identifying future targets and objectives. Great pains go into decrypting and reverse engineering the thought process and decision points of the targeted decision maker. Regardless of whether this is a military or civilian activity, the more that is understood about the target, the better the picture is of what their intentions are.
The military deception staff and the advertising staff have both developed plans to present a message to their audiences. The military target is the center of gravity (COG), and the advertiser target is the decision maker who will purchase the car. Strangely enough, the COG in a military deception is the adversarial decision maker whom the deception staff is attempting to influence. Can the message get to the decision makers and will it be understood? Both groups spend much of the planning process ensuring that these goals are achieved.
When planning a deception against a skilled target, you should take into consideration the lengths the target will go to detect a deception. You will need to understand the strengths and weaknesses of your deception in order to make it as perceptually consistent as possible, especially when operating across a large enterprise or organization.
Some counterfeits reproduce so very well the truth that it would be a flaw of judgment not to be deceived by them.
—Francois de La Rochefoucauld
Finally, the payoff question: Will the message be acted upon? A lot of time, resources, and effort are put in motion to ensure that this happens. It is the payoff moment for the planners. Will the auto dealer receive revenue and move products? Will the military deception planners get the COG to act in a way that is more favorable to friendly forces? Will your threat take action and perform according to your COG planning?
What are we selling? Is the deception planner giving information to the adversary with malice in mind? Is the advertiser presenting something the focus (the decision maker capable of taking the desired action) really needs or wants? In deception, the planner usually displays something the COG expects according to his personal bias and beliefs.
Never attempt to persuade people until you have listened to them first.
—Morey Stettner, The Art of Winning Conversation
In order to know what people expect, you need to be a quiet, patient student of them. There are various means to gain a better understanding of the individual you seek to persuade, usually either observation or listening. Much can be said if you have gleaned enough information about someone to identify where he will go next or what car he will purchase. Such insight is the goal of everyone from advertising executives to intelligence professionals. Interpreting the nature of the focus, who is always a human, gives you a better understanding of what his objectives and motivations may be. Even simply monitoring events while you respond to the adversary or focus may help you better understand what he is after.
The people who can predict behavior will dominate whatever discipline they apply themselves to. In this way, advertisers and deception specialists also play on the personal biases and wants of an individual to get that individual to purchase their product. Is the car really necessary? Most people probably do not need a new car, but will purchase one for a variety of reasons, such as the lure of a new automobile, a desire to “keep up with the Joneses,” a feeling of personal obligation after befriending the salesman at the dealership, and the belief that they can afford it. In the same way, the personal bias of the focus looms as an overbearing emotional anchor, which helps sell the deception. It is important to understand the objective of the focus. Is it espionage, information, financial theft, or long-term persistent remote control of an enterprise network?
The target in deception is the single focus of the COG, while an advertising campaign tends to shotgun out a message to a much broader distribution. This technique might imply that advertising campaigns more closely resemble psychological operations (PSYOPS); however, there is one notable distinction. It’s true that the advertising campaign is widely broadcast, but the intended audience is the same as with a deception: the COG or decision maker in a specific organization. This is why advertising sometimes seems more closely aligned with deception theory than the PSYOPS technique.
Deception is sometimes difficult to pin down, which usually means that it was effective. If Alice told Bob something that was deceptive, and Bob acted on that information the way Alice wanted him to, then her deception was effective—game over.
Why Use Deception?
During the Zhou Dynasty in the Warring States Period (475–221 BC), there was a prominent general named Sun Tzu who served the government of China. These times were very trying and violent because China was divided. Sun Tzu’s strategic philosophy was succinct: “If you know your enemy and know yourself, you will not be defeated in a hundred battles.” Sun Tzu believed that knowledge of the truth, and the wisdom derived from it, was the bedrock of victory. In The Art of War Sun Tzu wrote that the most powerful weapon of warfare was information, and he knew that if he managed what information his adversary was able to obtain, he could manipulate his adversary’s actions.
Anything worth having is a thing worth cheating for.
—W. C. Fields
Sun Tzu surmised that, “All warfare is based on deception. Hence, when able to attack, we must seem unable; when using our forces, we must seem inactive; when we are near, we must make the enemy believe that we are away; when far away, we must make him believe we are near. Hold out baits to entice the enemy. Feign disorder and crush him.”
Sun Tzu knew that by establishing and managing the complete information environment early on, he would essentially control and subsequently own the decisions of his adversaries without actually engaging in armed conflict. “Thus it is that in war the victorious strategist only seeks battle after the victory has been won, whereas he who is destined to defeat first fights and afterwards looks for victory.” He continued, “Supreme excellence (in strategy) consists in breaking the enemy’s resistance without fighting.” There can be no argument that if he could get his adversaries to forfeit without one single armed engagement, he could save an immeasurable amount of resources, including warriors and equipment that could be used for his next campaign. Attrition is the enemy of success, and no (or limited) losses make an army that much stronger for its next engagement.
Historically, we can see the importance of deception to the Chinese, but what about contemporary practice? Mao Tse-tung concluded, “To achieve victory we must as far as possible make the enemy blind and deaf by sealing his eyes and ears, and drive his commanders to distraction by creating confusion in their minds.” This narrative can leave no doubt as to the Chinese Information Warfare doctrine. For many years, the Chinese people have lived Information Warfare, and the United States has struggled to grasp its importance, while all the time not understanding the Chinese expertise in this area.
Even more recently, Major General Wang Pufeng, a former Director of the Strategy Department, Academy of Military Science, Beijing, wrote (in “The Challenge of Information Warfare,” China Military Science), “Counter reconnaissance [is necessary] to prevent the opponent from obtaining information about the true situation. For example, secret falsification can be used to plant false intelligence and false targets in the place of true intelligence and true targets to confuse the real and the false and muddle the opponent’s perceptions and inspire false assessments. When conditions exist, active methods may be used to engage in interference to blind or even destroy the opponent’s reconnaissance instruments.”
His message here is clear: every path to every goal along the journey will have numerous paths of deceit embedded within. Truth will be protected by lies and lies by truth. The web must be tight, complete, and totally controlled.
A second level of complexity emerges here: each deception operates and exists in solitude, yet all are woven together by the master weaver who sits at a loom to create a blanket. Each deception is so simple it could stand on its own, but like a flower on the side of a mountain, it would be out of place if not for the vibrant community of flowers that cover the face of the mountain. So it is with the long history of Chinese deception strategy within their Information Operations doctrine.
It is discouraging how many people are shocked by honesty and how few by deceit.
—Noel Coward
Even more recently, there were reports of the People’s Liberation Army (PLA) advancing their cyber-deception capabilities through a coordinated computer network attack and electronic warfare integrated exercise. The Chinese strategy is to conduct an active offense to achieve electromagnetic dominance. In light of their ongoing research and development efforts, it is extremely likely that the Chinese will continue to bolster their Information Operations posture.
Just within the past few years, it has been noted that the PLA has organized and staffed cyber-attack units with civilian computer and IT experts, as well as militia and active military forces. Although there is debate as to the staffing level and resources dedicated to their efforts, there is no doubt as to the dedication and intentions outlined in the Chinese doctrine (as noted by LTC Timothy L. Thomas, in “China’s Electronic Strategy,” Military Review, May–June 2001).
Now we know why the Chinese have successfully used deception for more than 2,000 years and continue to use it today. Over the centuries, numerous countries have employed military deception in some form in order to achieve specific goals. You’ll notice that the Chinese are heavily referenced here, due to their extreme confidence and success in executing deception in both war and their society through the years.
One of the most important things to think about is why we continue to use deception. What is the benefit, if any? Deception is not the end-all solution to executing the perfect plan, but it may help you defend against an active persistent threat within your enterprise. Any edge in an engagement during combat operations could prove immeasurably valuable and ultimately swing a battle one way or another. Deception, although sometimes very complicated, can be employed for relatively little cost; it’s basically an economic decision.
In the following sections, we’ll take a look at how the use of deception has provided benefits in several situations.
The First US Army Group Deception
During World War II, both the Axis and Allies used deception extensively. Neither re-created deception doctrine; both drew on historical lessons from Sun Tzu and his polar opposite, Carl von Clausewitz, as well as other existing doctrine, to achieve their goals. Sun Tzu believed deliberate deception planning led to a commander’s success. Clausewitz believed the mere activities of battle led to sufficient confusion, thereby allowing for success. Whereas Sun Tzu believed that information superiority was the key to success, Clausewitz took a dialectical approach to military analysis, exploring numerous options before drawing a conclusion, which has led to many misinterpretations of Clausewitz’s magnum opus, On War. Clausewitz believed that in the “fog of war,” information (be it accurate, incomplete, or even dubious) was subject to the human factors of disillusionment, confusion, doubt, and even uncertainty; the essential unpredictability of war was preeminent. With either Sun Tzu’s or Clausewitz’s philosophy, there were numerous opportunities for the belligerents to inject strong and compelling deceptive stories.
A myriad of techniques and tactics were employed at different times in the war. Inflatable vessels and vehicles were some of the props used by the First US Army Group (FUSAG), a fictitious group that was activated in London in 1943. Although the inflatables were of little value, due to the limited ability of the Nazis to reconnoiter (observe and assess a specific area prior to military encampment) the staging area, the other aspects of the FUSAG deception played a big role in selling the activity to Erwin Rommel and Adolf Hitler.
Truth is often the favorite tool of those who deceive.
—Bryant H. McGill
FUSAG was an Allied invention designed to deceive Hitler about the details of the invasion of France. FUSAG was activated to participate in Operation Quicksilver, which was the deception plan attached to the D-Day invasion. The Allies needed to convince Hitler that the actual assault to establish the Western Front would come at the Pas-de-Calais, approximately 300 kilometers away. The Allies knew that if the plan was to be successful, the German decision makers had to be deceived into acting in a way that would be favorable to the Allies.
The need was so compelling that the deception planners petitioned for the placement of George S. Patton in command to display to the Germans that the highest level of interest was in the success of this unit. General Dwight Eisenhower agreed with the planners and assigned General Patton as the commander (much to Patton’s dismay, since he believed he should be leading the assault and not playing along as the commander of a notional unit). The size of FUSAG, both in manning and equipment, was at a deficit, which mandated creativity.
Hitler and Rommel both believed that the invasion of France was coming at the Pas-de-Calais, so they refrained from dedicating the reserve of Panzers to Normandy. This was critical in allowing the Allies to establish an initial beachhead (reach the beach and begin to defend that ground to advance from that initial landing point), and proceed with follow-on operations.
Applying to Cyber
You should assume that portions of your network are already compromised and being held by an external hostile entity. Whether the attack is targeted or opportunistic, establishing a safe portion of your network to serve as your beachhead can allow you to begin to engage your threat without your threat being able to reconnoiter that initial staging area of defense.
So, from a military standpoint, we see how deception can enable operations, but what about in other areas? Almost every human at some point has been involved in some level of deception. This natural ability can be applied to the cyber world as well.
Do you think you are an honest person? Do you play fair? You drive the speed limit, always tell the truth, and obey all stop signs. And if you’re caught stealing a base, you escort yourself off the field in the event of a close call. You are a good, fair person, but has anyone ever said that you have a “poker face”? You just engaged in a form of deception. Now is that fair? Is that nice? No one at the card table knows if you are telling the truth or lying. Lying can help you win. If the ends justify the means, you might be able to explain that this is the right thing to do. The bottom line is that you do not want anyone else at the poker table to know what you have because secrecy and deception give you the tactical advantage you need to win. Of course, everyone else at the table is doing it, too, because they want to win. You will need to figure out who is bluffing and who is not bluffing if you are going to be successful.
The systems within your enterprise are similar conceptually, as each has its own form of a poker face when infected by an active threat. You won’t know which was infected until it shows its hand by signaling or transmitting outside the enterprise. Keep in mind that deception is an important tool that is used by both sides: the attacker and the defender.
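One common way a compromised system “shows its hand” is by beaconing to an outside controller at regular intervals. The following sketch (the thresholds and input format are assumptions) groups outbound connections by destination and flags destinations contacted at suspiciously regular intervals.

```
from collections import defaultdict
from statistics import mean, pstdev

def find_beacons(connections, min_contacts=6, max_jitter_seconds=5.0):
    """Flag destinations contacted at near-constant intervals.

    `connections` is an iterable of (timestamp_seconds, destination) tuples,
    e.g., parsed from firewall or proxy logs.
    """
    by_destination = defaultdict(list)
    for timestamp, destination in connections:
        by_destination[destination].append(timestamp)

    suspects = []
    for destination, times in by_destination.items():
        if len(times) < min_contacts:
            continue
        times.sort()
        intervals = [b - a for a, b in zip(times, times[1:])]
        if pstdev(intervals) <= max_jitter_seconds:
            suspects.append((destination, mean(intervals)))
    return suspects

# Example: a host calling out every ~300 seconds stands out from ordinary browsing.
sample = [(i * 300 + offset, "203.0.113.50")
          for i, offset in enumerate([0, 2, 1, 3, 0, 2, 1])]
print(find_beacons(sample))
```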
Russian Maskirovka
The use of deception is not independent from other governmental and cultural norms. A government’s use of deception in operations reflects not just the climate of that government, but also goes much deeper and embodies its culture. Political philosophy and practice can also influence whether a nation employs deception.
We have taken a brief look at the Chinese use of deception, but countries all over the world employ some level of deception. Consider how the Russians use deception to achieve their goals. Historically, we know that every subject taught in Soviet schools was infused with the communist ideology of Lenin, Marx, and Engels. Since the fall of the Soviet Union in 1991, there are some indicators that things have changed with the new Russian Federation. However, it was the corruption that ensued from the failed (or incomplete) communist dictatorship that gave birth to a militarized, martial law state, which always has elements of control, deception, and corruption.
Maskirovka is the Russian term for a collection of deception techniques. Historically, it is translated to mean concealment or camouflage. The interesting point here is that in a battlefield scenario, concealment protects only from observation. It does nothing to protect against actual gunshots, tanks, mortar rounds, or rockets. If the camouflage does not work, the consequences could be quite disastrous. The same is true for nonkinetic activities, such as the war of words when countries or organizations knowingly deny involvement in a specific act or series of events. For example, consider Iran’s defense of its nuclear energy efforts, only to have a government site write about a soon-to-come nuclear bomb. There are examples of countries all over the world participating in cyber espionage and not acknowledging the acts in and of themselves. The words that come from the leaders are one form of deception, which can be easily identified as such when the cyber espionage is detected and attributed to its origin or source (the country that swore it had no part in cyber espionage).
I have discovered the art of deceiving diplomats. I tell them the truth and they never believe me.
—Camillo di Cavour
Deception Maxims
Contrary to popular theory, deception maxims are not derived by the military intelligence community, but are a joint development effort from the operational elements and intelligence organizations from both military and nonmilitary organizations. Maxims are conceived from psychology, game theory, social science, historical evidence, and decision analysis theory. There are ten deception maxims that are used by the military.
In the DoD context, it must be assumed that any enemy is well versed in DoD doctrine. This means that anything too far from normal operations will be suspected of being a deception even if it is not. This points to the need to vary normal operations, keep deceptions within the bounds of normal operations, and exploit enemy misconceptions about doctrine. Successful deceptions are planned from the perspective of the targets.
Field Manual 90-2, Battlefield Deception, 1988
Understanding that the adversary is expecting deception and knows the doctrine of how it is employed is of paramount importance in developing and executing a successful deception campaign. All deception planning must be developed with these parameters in mind so that the executions are not something that will be obviously out of place and a dead giveaway.
The following sections discuss the ten military maxims, as presented in Joint Publication 3-13.4, Military Deception, Appendix A.
“Magruder’s Principle”—Exploitation of a COG’s Perception or Bias
People believe what they believe. During Operation Desert Storm, Saddam Hussein believed that there would be an amphibious landing to start the invasion of Iraq. He believed the attempt to run cross-country in the blitzkrieg style was suicide because his defenses were well supplied and staggered throughout the desert, making for a solid wall of resistance. General Norman Schwarzkopf used his personal experience and knowledge of tactics to deceive his adversary, who had oriented his forces to defend against the amphibious invasion that never came. Basically, it is easier to persuade COGs to maintain their preexisting beliefs than to deceive them by changing those beliefs.
Germans saw Hitler as an Aryan leader, not as an Austrian, in the same way that many in Iran see Mahmoud Ahmadinejad as a Muslim leader and not, as he was recently revealed to be, of Jewish heritage (per an article in the UK Telegraph by Damien McElroy and Ahmad Vahdat, published in 2009).
“Limitations to Human Information Processing”
Cognitive psychology is the study of internal mental processes, and is a key factor in understanding and explaining the two limitations to human information processing that are exploitable by deceptive techniques. Many in the deception business refer to these techniques as conditioning of the COG.
The “law of small numbers” is rather self-explanatory. It is very effective and very simple. The premise is that people readily draw conclusions from a small set of data, even though a single incident or two is no basis for building conclusions; a decision should be delayed if at all possible, because a statistically significant conclusion requires a larger sample. A deceiver can exploit this tendency by feeding the target a handful of carefully chosen observations.
The second limitation of human information processing is the susceptibility to conditioning (the cumulative effect of incremental small changes). Small changes over time are less noticeable than an immediate, large-scale change. A man who lives on the side of a hill doesn’t notice how his land erodes from wash-off year to year, but take a snapshot at 25-year intervals and he will notice something amazing: the land has receded, and many cubic yards have been removed from his yard!
We all know the story of the boy who cried wolf. In short, he called out false alarms to the villagers so many times that they turned a deaf ear to him. The one time he really needed backup, they ignored him because he had abused their trust so many times, and let’s just say things did not work out so well for the boy. His constant beckoning (stimuli) with no threat changed the status quo and adjusted the baseline of the data the villagers received. He created a whole new paradigm because his cries were perceived to be innocuous, and the alarm of “wolf!” was invalidated. What would he yell if he encountered an actual wolf? How would they know? They did not think that through, but they knew that when this boy cried “wolf,” it was no cause for alarm. The wolf then used this opportunity to attack because “wolf!” had been invalidated.
“Multiple Forms of Surprise”
The US Army has an acronym for everything. One that is used for reconnaissance reporting is very effective in conveying the “Multiple Forms of Surprise” maxim: SALUTE. These letters stand for the following elements:
Size   How many people?
Activity   What are they doing? How are they doing it?
Location   Where are they?
Unit/Uniform   What are the markings and distinctive unit insignia? What are they wearing?
Time   When did you observe and for how long?
Equipment   What do they have: rifles, tanks, construction equipment…?
By remembering this acronym, a complete picture of the unit can be formed. What if the adversaries saw a different presentation from the same unit every time their scouts reported the activity? Would they be able to picture the true composition and intent of the unit? The more variance in these items, the better the chance of achieving a comprehensive deception.
“Jones’ Dilemma”
Deception is more difficult when uncontrolled information avenues and sources are available to the adversary’s COG, because it allows the adversary to access factual information and get a picture of the actual situation. When you control these avenues, you can also control the content, amount, and frequency of that data as needed, and the picture of the situation that you paint will become your adversary’s picture as well.
“Choice of Types of Deception”
Ambiguity-decreasing deceptions against adversaries are employed to reinforce the story and make the adversaries very certain, doubtless, and absolutely wrong in their conclusions. Ambiguity-increasing deceptions accomplish the opposite, making adversaries increasingly more doubtful and confused or uncertain by clouding their situational awareness.
“Husbanding of Deception Assets”
First appearance deceives many.
—Ovid
Sometimes it is necessary to withhold the use of deception. There are situations where the delayed employment would have a greater effect and a broader range of success. An example in a cyber situation is where there has been scanning of systems (adversarial surveillance and reconnaissance), but no deception was employed. Now the adversary has a picture of what he is looking for and will return to exploit his success. This time, however, a deception technique is employed, and the adversary is caught off guard and loses the initiative. The defenders have withheld the deception to a greater advantage because they have fought at the time and place of their choosing.
“Sequencing Rule”
Deception activities should be put in a logical sequence and played out over a long period of time to maximize their effects and protect the true mission for as long as possible. A successful strategy will employ less risky elements of the deception early in the sequence, while holding and executing the more volatile ones later. As more risky elements are executed through displays, feints, and other methods, the deception planner will assess if the deception has been discovered, and in that case, it can be terminated. This leads to the next and all-important maxim.
“Importance of Feedback”
An Intelligence, Surveillance, and Reconnaissance (ISR) plan must be developed to obtain feedback. This is of the utmost importance and cannot be minimized under any circumstances. The chance of success of a deception is directly dependent on accurate and timely feedback to determine whether the deception is effective, and if countermeasures have been employed by the adversary to include the employment of counterdeception. Feedback allows for the freedom of movement to outmaneuver adversaries by staying one step ahead because you will be aware of their movements and intentions well in advance of any actual activities.
“Beware of Possible Unwanted Reactions”
Sometimes, a deception operation may spur an undesirable action from the COG, which could lead to undesirable actions by friendly forces. At times, all synchronized parts of a deception are played out in a perfect manner. However, the adversary may see the deception story and take actions that were not expected, catching friendly forces off guard, since the average soldier or leader on the ground had no idea there was ever a deception operation. The sensitivity of deceptions is such that they are highly compartmented, and access to them is guarded by strict “need-to-know” limits.
A second troubling, and underanalyzed, situation concerns the consequences of success. These unwanted reactions are the result of a deception that causes the COG to take exactly the actions we expect. The problem is that the actions we desired do not yield the results for which we initially assessed and planned. This could be a strategic and colossal blunder that actually inhibits the operation and has a negative impact on mission success. There is no recovery from this type of result, because the deception has already played out, and nothing remains to save the deception mission.
Careful planning and prudent analysis are paramount during the Course of Action development and war gaming to vet each possible response to the deception operation in order to minimize the consequences of success.
“Care in the Design of Planned Placement of Deceptive Material”
Folks, let’s not make it too obvious that we are conducting a deception operation. When there is a windfall of information, there is intense scrutiny of its validity. Security violations happen in some of the most unlikely environments, including those with ongoing and active security countermeasures.
When the adversaries work for something, they tend to have a higher confidence level in the information and believe it more. Overt activities are frowned upon by deception planners, and effort is given to ensure the adversary does not become suspicious.
Life is the art of being well deceived; and in order that the deception may succeed it must be habitual and uninterrupted.
—William Hazlitt
Understanding the Information Picture
Situational awareness and perspective are key to success.
Well, first I was gonna pop this guy hanging from the street light, and I realized, y’know, he’s just working out. I mean, how would I feel if somebody come runnin’ in the gym and bust me in my ass while I’m on the treadmill? Then I saw this snarling beast guy, and I noticed he had a tissue in his hand, and I’m realizing, y’know, he’s not snarling, he’s sneezing. Y’know, ain’t no real threat there. Then I saw little Tiffany. I’m thinking, y’know, eight-year-old white girl, middle of the ghetto, bunch of monsters, this time of night with quantum physics books? She’s about to start some shit, Zed. She’s about eight years old, those books are way too advanced for her. If you ask me, I’d say she’s up to something. And to be honest, I’d appreciate it if you eased up off my back about it. Or do I owe her an apology?
—Will Smith as Agent J in Men in Black
So, here’s the million-dollar question: Is the glass half empty or half full?
This is a valid question and requires serious contemplation. The following sections provide several versions of the answer; of course, your results may vary.
Half-Empty Version
Some people will say the glass is half empty. This is usually considered a pessimistic perspective. Is it not obvious that it had been a full glass and now half is gone, therefore leaving a shell of a full glass? It is a glass with only half of its original liquid representation. Is the amount of liquid sufficient for that container?
Half-Full Version
The optimists of society will say the glass is half full. There is always a bright side to having something instead of nothing. Besides, we can always add a bit more, and voila, there is a full glass once again!
Well, that looks good on paper, but is that really the whole situation? Is that the complete physical state of the situation? Has everything been captured in the half-empty or half-full statements, which are not as dissimilar as they might appear on the surface? Did you ever think you missed something? After all, everyone says it is half empty or half full.
A Question of Bias
We’ve been going about this all wrong. This Mr. Stay Puft’s okay! He’s a sailor, he’s in New York; we get this guy laid, we won’t have any trouble!
—Bill Murray as Dr. Peter Venkman in Ghostbusters
Now is the time to cast off all traditional thinking on how to solve a problem. Everything is not black and white in real life, but in the world of cyber, which is ones and zeros, there are fewer shades of gray—either your system or enterprise is compromised or it is not.
Deceit is a powerful tool, but consideration must be given to employment. For example, if you want to get all the people to buy cars from your dealership, you need to understand the target first. As with advertising, you need to study potential clients and their habits. Do not fall victim to the thought that because you like it, they will like it, too. Do not offer only subcompact economy cars to families in the heartland of America in the middle of hundreds of acres of farmland. That won’t work. Try something conducive to farm life. Likewise, don’t open a mountain skiing store in Miami, or you will find that business is not promising.
The most important part of any deception planning is discovering what the focus is after. Understanding this makes it much easier to engage the focus successfully while using deception.
Consider this question: Is it possible that the half-empty glass is full only to a point because it has a small hole, which means it cannot retain any more fluid? Subsequent attempts at filling the glass any further would be fruitless, not to mention wasteful, and would only lead to leakage.
We see biases exploited every day in life. You need only turn on a professional sporting event to see those exploitations in action. Coaches play off the biases and assumptions of other coaches to gain a strategic advantage. Pitchers and batters engage in a battle of wits to see who can win. With every pitch, the pitcher attempts to deceive the batter. And here is another bias at work: we usually say the pitcher tries to outpitch the batter, not deceive him.
Organizationally speaking, because of the size and scope of government cyber defense organizations throughout the world, they are susceptible to deceit based on individual (and organizational level) biases.
Military planners, because of their responsibilities and training, also tend to concentrate primarily on purely military factors that influence combat. Officers thus often see the world through lenses that filter out important political considerations that can (and should) influence strategic decisions and military outcomes in war.
—Scott D. Sagan, in “The Origins of Military Doctrine and Command and Control Systems,” Planning the Unthinkable
Totally Full Version
Have you considered the argument that the glass is totally full and can never be anything except totally full?
A small challenge to our personal biases and thought processes will lead us to consider the fact that when we say the glass is half empty or half full, we really are analyzing only the portion of the glass that contains the liquid. We are taught not to consider the portion of the glass that contains the air from the atmosphere.
When thinking about your enterprise and the systems, data, and critical resources within, do you ever consider your enterprise as being the proverbial half full or half empty? By nature, leadership desires the glass to be full (or your enterprise is perfectly secure and running smoothly), and the professionals in the trenches know of all the gaps and weak points in the enterprise, which make their security posture “half full.”
Step-Beyond Version
Is it possible that a glass could be more full or less full with the same volume limitation? Consider Charles’s law of gases, which explains how gases expand when heated (the same is true for liquids):
V100 − V0 = kV0
where:
V100 equals the volume occupied by a gas at 100°C.
V0 is the volume of the same gas at 0°C.
k is the constant for all gases at a constant pressure.
As the temperature in a given specified volume increases, the molecules become more active, which requires more space.
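As a quick worked example (taking k ≈ 0.366, roughly 100/273, as the expansion constant, which is an assumption for illustration), the relationship can be evaluated directly:

```
def volume_at_100(v0, k=0.366):
    """Apply V100 - V0 = k * V0 to find the volume occupied at 100 degrees C."""
    return v0 * (1 + k)

v0 = 1.00   # liters of gas at 0 degrees C
print(f"V100 = {volume_at_100(v0):.3f} L")   # roughly 1.366 L
```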
It’s déjà vu all over again!
—Yogi Berra
Conversely, as the temperature decreases, we see less activity (movement of molecules). Here, we observe that it takes less volume to hold the mass of gas. Theoretically, you could add more gas or put in a small amount. So, by adjusting the variables, your glass could contain more or less gas in the same volume—thereby adding to the question of whether the glass is half empty or half full. How you look at your enterprise can be similar.
With all of the systems transmitting and moving packets about your enterprise, it may be difficult to detect a specific type of network activity. This is why the more educated and skilled threats will operate during your business hours to hide within all of the noise. Those who are less educated or skilled will perform actions during nonbusiness hours, which increases the probability of detection by the defenders.
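As a complementary sketch (the business-hours window and log format are assumptions), defenders can flag activity that falls outside normal working hours, which is where the less disciplined threats described above tend to stand out.

```
from datetime import datetime

BUSINESS_START, BUSINESS_END = 8, 18   # assumed local business hours (08:00-18:00)

def off_hours_events(events):
    """Return events whose timestamp falls outside business hours or on a weekend.

    `events` is an iterable of (iso_timestamp, source_host, destination) tuples.
    """
    flagged = []
    for iso_timestamp, source, destination in events:
        when = datetime.fromisoformat(iso_timestamp)
        outside_hours = not (BUSINESS_START <= when.hour < BUSINESS_END)
        weekend = when.weekday() >= 5
        if outside_hours or weekend:
            flagged.append((iso_timestamp, source, destination))
    return flagged

sample = [
    ("2024-06-03T14:12:00", "ws-114", "update.vendor.example"),
    ("2024-06-03T02:47:00", "ws-114", "203.0.113.50"),
]
print(off_hours_events(sample))   # only the 02:47 connection is flagged
```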
Two-Steps-Beyond Version
This is neither pig nor pork. It’s beef!
—Oliver Hardy as Ollie Dee in Babes in Toyland
We profess that all the preceding versions are right and wrong, simultaneously. How can that be? What is the truth? The truth is what is factually sound to you. Perhaps we should consider the glass is twice as big as it needs to be to hold the liquid. Ouch! Did that even come to mind?
The picture is as clear as you dare see it. Specificity can lead to a clearer picture, as Sun Tzu understood; however, with more information, there is a possibility that the picture could become more clouded, as Clausewitz believed. Or, even worse, you may have wasted resources clarifying something to the nth degree when a fraction of that investment would have been sufficient to accomplish the mission.
Of course, there are numerous possibilities, but this exercise was undertaken to show how we must break those biases if we are to be successful in fully deceiving the adversary to accomplish the mission.
Conclusion
He’s not the messiah; he’s a very naughty boy!
—Monty Python’s Life of Brian
People and things are not always what they seem. No one has a perfect method for safeguarding information and controlling the information environment. This book is a good first step in that direction. Information is relative and should be consumed as such. You can have too much information and too little, and those states are not mutually exclusive. If this seems confusing, just wait until we get to the next chapter. The important point is that the information environment is fluid. Professionals with good situational awareness and their wits about them will not be easily fooled.
Lesson 15, Part I: Use the formula P = 40 to 70, in which P stands for the probability of success and the numbers indicate the percentage of information acquired. Lesson 15, Part II: Once the information is in the 40 to 70 range, go with your gut. This is about balancing data and information gathering with your instincts. Learn to trust your gut (which is about trusting your experience). Sure you’ll make mistakes, but you’ll also learn. In the real world, you don’t have infinite time to explore every problem until you have all the possible information. Instead, it’s about satisficing to get things done. You have to find potential solutions that fit and test them against reality to see what sticks.
—Colin Powell, Lessons on Leadership
Colin Powell understood that the information picture can get clouded as more information is folded into the mix. There is a point where enough information is enough, and a decision is required! Information is vital to the decision-making process, but weighing the quality of that data relative to the situation is where the true professional excels. Experience and training go a long way in improving an individual’s situational awareness, but there is quite a bit to be said for individual intellect. Common sense and a level head go far when dealing with critical and conditional decisions.
When the rabbit of chaos is pursued by the ferret of disorder through the fields of anarchy, it is time to hang your pants on the hook of darkness. Whether they’re clean or not.
—Spice World