The Information Battlespace Preparation Experiment
Dorene L. Kewley
dkewley@bbn.com
Denise Snyder
dsnyder@bbn.com
BBN Technologies, A Verizon Company

Roderick A. Moore
rmoore@zeltech.com
J. Kendree Williams
kwilliams@zeltech.com
Zel Technologies, LLC

Laura S. Tinnel
ltinnel@teknowledge.com
Teknowledge Corporation

Raymond C. Parks
rcparks@sandia.gov
Sandia National Laboratory
Abstract
As a part of the Cyber Command and Control
(CC2) initiative within the Information Assurance and
Survivability program, DARPA is sponsoring
development of a prototype Intelligence Preparation
of the Information Battlespace (IPIB) application.
This effort attempts to map the kinetic warfare
Intelligence Preparation of the Battlespace (IPB)
process to cyber defense. This project addresses
three issues: whether it is feasible to identify
cyberspace analogs for kinetic concepts such as
terrain, weather, and adversary doctrine; whether an
IPIB prototype would assist analysts in performing
cyberspace IPB; and whether IPIB improves cyber
defenses. The CC2 team conducted a whiteboard
experiment to address the third, efficacy issue. This
paper describes the experimental process, hypotheses
to be tested, conduct of the experiment, results, and
conclusions. We provide some lessons learned that
should be of interest to the Information Assurance
(IA) community at large.
1. Introduction
The Intelligence Preparation of the Information
Battlespace (IPIB) project, under the Cyber Panel
initiative of the DARPA Third Generation Security
program (formerly the Cyber Command and Control
initiative of the Information Assurance and
Survivability program), has three main objectives:
analyze and confirm the feasibility of a domain
transition of IPB from kinetic space to cyberspace,
develop and evolve a prototype automated IPIB
application, and demonstrate the efficacy of IPIB through
controlled experiments.
Zel Technologies, LLC (ZelTech) is transitioning an
automated application of the Army's classic IPB
methodology [1,2] from Kinetic Warfare (KW) to
Information Warfare (IW)[3]. In conducting military
operations in the physical world, the US government and
its military forces have developed, refined and applied a
number of time-tested intelligence analytical processes and
tools that have provided critical support to operations and
helped ensure victory. If a domain transfer from physical
space to cyberspace were feasible, the resulting IPIB
products would offer commanders an opportunity to
switch cyberspace defense from a reactive to a proactive
mode. Put simply, IPB is intended to help a commander
know where to look in the battlespace, when to look, and
what to expect to see. When we consider the nanosecond
timelines for cyber attacks, this would be particularly
beneficial because it would permit the commander to
anticipate an attack and establish countermeasures before
the attack takes place.
Thus, the underlying precepts of the IPIB effort are that
traditional Kinetic Warfare IPB processes can be adapted
to cyberspace and will provide benefits to Information
Assurance (IA) beyond those achieved in current security
best practices.
ZelTech has reported the feasibility of a domain transfer
in earlier IPIB efforts [4,6,8], describing key concepts of
IPB in kinetic warfare, their cyberspace analogs, and
providing an IPIB system architecture for implementing
the continuous, four-step IPB process for IA. With regard
to the IPIB prototype, several demonstrations have been
conducted, and an evaluation release is scheduled for early
2001.
1.1 Objective
This paper reports the efforts of several players in
the CC2 initiative to address the benefit issue: Does
IPIB offer any advantages in cyber defense? The
CC2 Experiment Working Group (EWG) assisted
ZelTech in formulating and executing a quantitative,
hypothesis-based whiteboard experiment to ascertain
the impact of performing IPIB. The overall goal of
the CC20008 Information Battlespace Preparation
experiment was to determine if the IPIB process
improves network defense beyond security best
practices.
In the operational environment today, network
defense is reliant on security experts who make
defense recommendations based on engineering
judgment and what are believed to be industry best
practices. Because there is no uniform tested and
proven process for network defense, recommended
protection plans can vary greatly among security
experts. The lack of a tried and proven process yields
an unrepeatable and potentially haphazard
deployment of defensive measures and thus reduced
assurance of the defense of mission critical assets.
IPIB provides a systematic, continuous process for
analyzing the network environment and threat. It is
designed to help the network defender selectively
apply and maximize his offensive and defensive
information warfare capabilities at critical points in
time in cyberspace, first by determining a threat's
likely course(s) of action (COAs), and then by translating
this threat information into a reasonable defensive posture.
This experiment illuminated potential high value
uses for the IPIB process in network defense. Two
areas where it was judged that IPIB may add value
were in aiding the network security novice and in
aiding the security expert in network defense.
1.2 Hypotheses
The experiment attempted to confirm or deny an
overall hypothesis and four sub-hypotheses:
• The IPIB process improves network defense
beyond security best practices.
– Detection mechanisms configured using the
IPIB process improve detection capability
beyond security best practices.
– Prevention mechanisms configured using the
IPIB process improve prevention capability
beyond security best practices.
– The IPIB process predicts attack targets better
than security best practices.
– The IPIB process predicts enemy courses of action
better than security best practices.
A quantitative assessment strategy was developed to
measure performance relative to these hypotheses
objectively.
2. Experimental Approach
We followed an experimental approach developed by
the CC2 Experiment Working Group (EWG) for conduct
of CC2 experiments. This involved submission of a formal
experiment proposal and plan which was reviewed by the
EWG and approved by the DARPA CC2 Program
Manager. Full details about the experiment are available
at the DARPA 3GS archival web site [6]. This paper is an
abbreviated summary.
2.1 Experimental Methodology and Metrics
This whiteboard experiment used three blue teams to
compare the use of the IPIB process in implementing and
executing a network defense strategy with current security
best practices:
• Expert Blue Team (EBT): Network security experts
who employed security best practices;
• Novice + IPIB Blue Team (NBT): Network security
novices aided by the IPIB process; and
• Combined Expert + IPIB Blue Team (CBT):
Members from the NBT and EBT to allow the experts
to apply the IPIB process.
The Expert Blue Team provided the baseline datum
against which the Novice and Combined Blue Teams were
compared. We expected that the IPIB process would
empower a network security novice to construct an
adequate network defense plan, as well as adequately
predict expected adversary courses of action and targets.
We also expected that the IPIB process would empower
network security experts to construct an improved network
defense strategy and predict adversary courses of action
and targets.
A Red Team was assembled to devise multiple attack
strategies which would successfully disrupt the mission
stated in the scenario using cyber means. The Red Team
constructed attack trees with multiple nodes, paths, and
targets. Twelve of these attacks were executed on paper,
and each of the three blue teams was scored by the
Assessment Team in the following four categories:
• Effectiveness at detecting attacks;
• Effectiveness at preventing attacks;
• Prediction of adversary targets; and
• Prediction of adversary course of action.
The ranking scheme was on a scale of 1 to 5, with 1
representing ineffective and 5 representing highly
effective.
The Expert and Novice Blue Teams were provided
the experiment plan, scenario, and toolbox, and asked
to devise independent network defense strategies, as
well as identify what the expected adversary COA and
targets would be. Both teams were given the
opportunity to ask clarification questions of the
Assessment Team.
Once both Blue Team network defense strategies
were submitted, a subset of the Assessment Team met
to review both strategies and deemed that they were
different enough for the experiment to continue. The
Combined Blue Team then met to devise a combined
network defense strategy using both current security
best practices and the IPIB process. During this
strategy session, the Combined Blue Team decided
that the IPIB process was superior to the Expert Blue
Team’s strategy for predicting the adversary COA and
targets. For this reason, the Novice Blue Team's
COA and target predictions were adopted in full by
the Combined Blue Team.
Meanwhile, the Red Team was also provided the
experiment plan and scenario, and was asked to
develop cyber attacks on the network. The same
attack set was used against all three Blue Team
defense strategies during the experiment assessment
phase.
2.2 Scenario
The CC20008 experiment scenario was titled
“Operation Dragon Slayer: JTF Concept Plan.” It
was based on the notional scenario employed during
the CC20006: Mission Impact Assessment
Experiment executed in December 1999. An overview
of the scenario is provided herein.
The Commander in Chief, Pacific (CINCPAC) has
directed formation of Joint Task Force (JTF)
Dragonslayer. The JTF concept plan is entitled:
OPLAN 4030 “Defense of the Philippines - Chenho
attack within 10 days.”
Summary:
US intelligence indicates that the Chenho
adversaries are threatening the Spratly Islands, a
subset of the Philippine islands. The US expects that:
• the Chenho will invade the Philippines,
• the Philippine government will defend territorial rights, and
• a devastating conflict will develop.
During deployment, the US forces will:
• travel toward the Philippines,
• use mid-air refueling and stop in Guam for crew rest,
• form an Air Operations Center in Guam,
• continue unit level force movement to the Philippines and establish a Wing Operations Center, and
• conduct conventional air operations to defend the area.
US intelligence reports the following:
• Chenho is a powerful, technologically advanced nation state,
• Chenho has a full complement of armed forces, including a powerful air force supplied by France and Russia, ground- and air-to-air missiles, and a nuclear submarine capability,
• Chenho has been conducting frequent overflights of the Spratly Islands and Philippine national airspace,
• the Chenho military has been on full alert for over three months and its forces are in a high state of readiness,
• Chenho armed forces have been conducting exercises north and west of the Philippines for three weeks, and landing craft have been observed fueling in the area,
• there has been frequent testing between Chenho and Philippine air forces, including two recent air-to-air missile launches, and
• Chenho has recently stepped up its public pronouncements against Philippine intransigence on the Spratly issue.
The United States has deployed forces to assist in the
conflict. During this deployment, they will rely heavily on
the TransPacific communication infrastructure between
the US, Guam and the Philippines. They need to protect
both local and theater domains, and to provide reachback
for support, logistics and intelligence. They also must
exchange task orders, reports, and logistical information
on deployment and protection of the troops and network
assets.
It is the job of the Blue Team to adequately protect the
continental US (CONUS) and deployed network assets
against any adversary attempt to disrupt the
communications and thus the mission. They must defend
these network assets using readily available network
security mechanisms to both prevent and detect adversary
actions. They also must attempt to determine the
adversary’s target and likely courses of action to develop
effective network defense strategies.
2.3 Rules of Engagement
The Blue Teams were restricted to using only
commercially available or current DARPA prevent
and detect mechanisms used in the Information
Assurance program. They were not allowed to invent
or presume IA capabilities that were not already in
existence or easily created using standard tools such
as shell scripts. A list of allowable tools in the
following categories was specified:
• Network, host, and application-based "Prevent"
countermeasures and intrusion detection systems
(IDS)
• Miscellaneous network security tools
• Security hardware
To facilitate assessment and comparison, blue
teams did not consider acquisition or support costs in
developing their static defense strategies.
2.3.1 Blue Teams. The composition and mode of
operation of the three Blue Teams was as described
here.
Expert Blue Team. The Expert Blue Team was
made up of a network security expert currently
working on the DARPA Information Assurance
program, and a network security expert employed at a
local commercial network security consulting firm.
This commercial representative spent several years in
the Army and had valuable insight into the military
operations defined in the scenario. This Expert Blue
Team used current security best practices to devise
their network defense plan.
In general, the Expert Blue Team spent most of
their effort in carefully selecting, placing, and
configuring prevent mechanisms in the various LANs.
They used some detection tools as well. Their
process for determining adversary COA and likely
targets was based primarily on engineering judgment.
Novice Blue Team. The Novice Blue Team was
composed of three people whose expertise lies in the
IPIB process and networking, but not information
security. All three team members are part of the team
which is developing the IPIB process.
In general, the Novice Blue Team took the
approach of placing a large number of detect
mechanisms throughout the network, thus heavily
instrumenting the network. They used some prevent
mechanisms as well. It was assumed that this Blue
Team would be able to adequately configure the
prevent and detect mechanisms. This team developed
detailed course of action and target prediction documents
by following the IPIB process.
Combined Blue Team. The Combined Blue Team
consisted of the commercial network security expert from
the Expert Blue Team, and one of the security novices
from the Novice Blue Team. These team members
devised a combined network security plan using the best
strategies both teams had to offer. The security expert
offered primarily sensor placement and configuration
strategies, while the novice offered the results of the IPIB
process primarily in COA and target prediction.
It is important to note that the security expert did not
individually walk through the IPIB process. Rather, as a
joint effort, the results of the Novice Blue Team network
defense strategy and IPIB process output were evaluated
from a security expert’s perspective. The joint strategy
was then devised.
2.3.2 Red Team. The Red Team was composed of a
group of Red Team experts who not only work on the CC2
Program on a regular basis, but also have extensive
experience with Red Team experimentation on the
Information Assurance Program, as well as with various
commercial customers. Their procedures for developing
attack trees are well established and tested. For this
experiment, they evaluated the experiment scenario and
determined what the most likely targets would be, then
devised multiple attack paths to get there.
The Experiment plans, scenario, all Blue Team defense
plans and Red Team attack trees, with detailed
descriptions, are posted on the Cyber Panel web site [6].
3. Execution
The Expert Blue Team and Novice Blue Team used
their respective processes to prepare defense strategies
independently. Then, a representative of each met to form
the Combined Blue Team, in an attempt to allow an expert
to apply the IPIB process. They prepared a Combined
Blue Team defense plan. In parallel, the Red Team
prepared a suite of attacks. When all four teams had
completed their execution steps, the Assessment Team
evaluated the results. The processes and products of each
team during execution are described in detail below.
3.1. Novice Blue Team Process and Products
Because it was the central reason for conducting this
experiment, we describe the IPIB process first, and in
some detail. We discuss several of the IPIB products a
Commander or decision-maker (in the Chenho scenario,
the JTF Commander and his staff) might use.
IPIB is based on the rigorous Intelligence
Preparation of the Battlefield process described in US
Army Field Manual FM 34-130. It is a continuous
process composed of four standard steps as shown in
Figure 1. We have added a step in the center of the
figure to emphasize that actionable Intelligence must
be applied if the preparation effort is to have a payoff.
The process can be performed manually, or supported
by automated decision support tools like ZelTech's
IPIB prototype. The current iteration of the IPIB
prototype is concentrated on mission and command-level decision making, mainly prior to hostilities.
Later iterations will focus on the details of specific
attacks and identifying low-level Enemy COAs as
they are occurring.
[Figure 1. The IPIB process is continuous: a four-step cycle of (1) Define Battlespace Environment (Focus), (2) Describe Battlespace Effects (Influences), (3) Evaluate the Threat (Model), and (4) Determine Threat COAs (Integrate), with Apply IPIB at the center of the cycle.]
The Chenho Experiment. CC20008, based on the Chenho scenario, provides an excellent means of illustrating key points about IPIB.
IPIB Step 1. We commence with IPIB Step 1, whose activities are listed in Figure 2. The scenario, provided in section 2.2, sets the stage as we commence collecting and organizing information about the environment. With respect to organization, mission and geopolitical context, we know from the scenario that CINCPAC has directed that a JTF, with a Joint Force Maritime Component Commander (JFMCC) and Joint Force Air Component Commander (JFACC), deploy to the Philippines and conduct air operations to prevent the Chenho "invasion". The Land Component Commander is unavailable due to commitments elsewhere. To manage the scope of the experiment, JFMCC operations are to be considered out of bounds. The JFACC is to execute OPLAN 4030, which draws air assets from three Air Force bases in CONUS, and will use Travis AFB for airlift. His forces will transit the Pacific, establish an Air Operations Center (AOC) in Guam, and a Wing Operations Center (WOC) in the Philippines.

1. Identify the limits of the joint force's operational area
2. Analyze the joint force's mission and joint force commander's intent
3. Determine the significant characteristics of the joint force's operational area
4. Establish the limits of the joint force's areas of interest for each geographic battlespace dimension
5. Determine the full, multi-dimensional, geographic and non-geographic spectrum of the joint force's battlespace
6. Identify the amount of battlespace detail required and feasible within the time available
7. Evaluate existing data bases and identify intelligence gaps and priorities
8. Collect the material and intelligence required to support further JIPB analysis
Figure 2. IPIB Activities performed in Step 1

The central AOC functions are to plan the next day's air missions, disseminate the plan in an Air Tasking Order (ATO), and perform Force-Level Battle Management of the execution of today's ATO. To perform these, AOC personnel must also maintain situation awareness, execute intelligence activities, provide logistics support, and maintain several kinds of communications networks.
At the WOC, the central mission is to direct the Air Wing and Squadrons as they fly the missions assigned in the current ATO, and keep the boss informed about mission, aircraft, airfield, and weapons status.
Both Operations Centers have automated C2/Battle Management (BM) support in the form of the Theater Battle Management Core System (TBMCS), an architecture that complies with Global Command and Control System/Defense Information Infrastructure Common Operating Environment requirements. Neither TBMCS enclave has a physical existence until the deployed forces arrive and set it up. Both Centers also have Logistics missions and non-TBMCS logistics support systems.
High level information about the threat also comes from the scenario.
The environment data collected in Step 1 includes information about related, associated networks that can have an impact on the JTF's mission. In this scenario, those would include four continental US airbases, CINCPAC headquarters, the Defense Logistics Agency, service Logistics Centers, and any in-country commercial or military networks that might be connected to the AOC or WOC. One would also collect and organize information about allies, neutrals, and nations or organizations likely to cooperate with the enemy as part of the geopolitical context. Environmental data
might include available communications links, bandwidth, cyclic loads and projected periods of heavy use (cyber climate/weather).
Items 7 and 8 of Figure 2 indicate intelligence collection. The IPIB tool has a "Request for Information (RFI)" feature that enables the user to generate RFIs in a format accepted by the intelligence and C2 communities and transmit them to the relevant organization over available C2 and intelligence communications channels.
The majority of the information collected in Step 1 is contextual or high-level information about networks and the operational environment.
IPIB Step 2. Step 2 activities are listed in Figure 3. In kinetic warfare, this is where the intelligence analyst examines the terrain, obstacles, possible avenues of approach, cover and concealment, and other features of the battlespace that could influence friend and foe alike. For defensive IA, the terrain analog would be the defended network and any closely related networks. Figure 4 is an example of the AOC network. Other than crypto, Figure 4 represents the AOC prior to emplacement of countermeasures like firewalls or Intrusion Detection Systems (IDSs). Similar network diagrams were prepared for the WOC and all related networks.

1. Analyze the battlespace environment
• Analyze the military aspects of each dimension
• Evaluate the effects of each battlespace dimension on military operations
2. Describe the battlespace's effects on adversary and friendly capabilities and broad courses of action
Figure 3. IPIB activities for Step 2

[Figure 4. High-level view of the Air Operations Center (AOC): the communications layout at Andersen AFB, Guam. The AOC compound (Ops, Intel, Plans, Logistics, and Tech Control cells) connects through multiplexers, crypto, and SATCOM links and the AOC Primary Router to a Tactical Communications Core Cell and the Base Communications Center (voice, data, DMS, TADIL, etc.), with base support enclaves on the Base LAN, and leased communications circuits to CINCPAC, CONUS, the Guam public telephone system, and the Philippines.]

To analyze influences on his mission, the IPIB analyst maps a model of critical functions to the information system's (IS) hardware, software, and communications architecture. When combined with the results of the vulnerability assessment, this mapping allows assessment of the impact of uncorrected vulnerabilities on the critical IS components of the supported mission.
Step 2 incorporates two types of vulnerability assessment: an automated scan in the case of a defended network that has already been constituted, and an analysis for a network that has no physical existence (during
development of an Annex K of an OPLAN, for
instance, or the pre-deployment phase of the Chenho
scenario).
The results provide a first level of defense - a punch
list of vulnerabilities that should be corrected, or
require countermeasures to be in place.
One can analyze the critical functions, network
topology, and vulnerabilities to define potential
enemy avenues of approach. The results of Step 2
analysis and data collection are stored as battlespace
effects for subsequent use.
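To make the function-to-component mapping concrete, a minimal sketch follows. The component and vulnerability names are hypothetical and the data shapes are our own assumptions, not the IPIB prototype's internal representation; crossing the mapping with the vulnerability assessment yields the kind of punch list described above.

```python
# Hypothetical Step 2 data: critical mission functions mapped to IS components,
# plus uncorrected vulnerabilities found per component (names are illustrative).
function_to_components = {
    "Produce and disseminate ATO": ["TBMCS server", "AOC primary router", "SATCOM link"],
    "Force-level battle management": ["TBMCS server", "Ops cell workstations"],
    "Logistics support": ["Base LAN", "logistics database"],
}
vulnerabilities = {
    "AOC primary router": ["default SNMP community string"],
    "logistics database": ["unpatched RDBMS", "weak DBA password"],
}

# Punch list: every uncorrected vulnerability that touches a mission-critical component.
punch_list = [(func, comp, vuln)
              for func, comps in function_to_components.items()
              for comp in comps
              for vuln in vulnerabilities.get(comp, [])]

for func, comp, vuln in punch_list:
    print(f"{func}: {comp} -- {vuln}")
```

A listing like this also hints at potential enemy avenues of approach, since each entry names a component whose compromise degrades a specific mission function.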
IPIB Step 3. Adversary modeling activities are listed in Figure 5. In IPIB, these translate to collecting and organizing information about adversary goals; organization and training; historical doctrine, tactics, techniques, and procedures; and characteristics. These are analyzed to provide an adversary model that can help to reduce the number of possible avenues of approach and attacks.

1. Update or create adversary models
2. Determine the current adversary situation
3. Identify adversary capabilities
Figure 5. IPIB Step 3 Activities
For the purposes of this experiment, the scenario builders had to create simulated historical data on Chenho tactics, techniques and procedures, organizational structure, doctrine and strategy. In the real world, unlike the kinetic environment with extensive information sources on a variety of adversaries, there is little data or intelligence information on potential cyber adversaries that one can use. However, one can draw reasonable conclusions in spite of the lack of intelligence by considering enemy characteristics, and in the future, we anticipate intelligence collection will place more emphasis on cyberspace issues. The Novice Blue Team issued RFIs to demonstrate the kind of intelligence information required.
We used the IPIB tool to collect information about the Chenho's objectives, the degree of risk of exposure they were willing to accept, their capabilities (equipment, expertise, software, funding, time, etc.), and their access to the defended system. Using this, we developed conclusions about how they would act in the Pre-deployment, Deployment, and Operations phases of the scenario.
Key conclusions were that:
• Chenho would attempt to delay or disrupt the JTF through low-risk, easy attacks on related networks rather than on the classified side of the AOC and WOC. We expected continuation of the historical Chenho trend of looking for low-hanging fruit – the majority of effort was expected against unclassified networks, logistics and re-supply, personnel administration, civil infrastructure and relations with the host nation.
• Chenho would expend a major effort to prevent the deployment at the points of embarkation in CONUS, through attacks on those related networks. They would also attack related en-route networks to create chaos during deployment.
• Chenho would employ insiders at all phases, with increasing acceptance of risk of exposure as air operations commenced.
• Chenho would be averse to detection until active hostilities broke out. At that point, overt Information Operations, including risk to insider moles and obvious DoS attacks, are more likely. This increases the probability of detectable social engineering, overt insider damage, DoS attacks, and User-to-Root transitions for data destruction, to disrupt development, dissemination and execution of the ATO.
• Avenues of approach for attacks on classified networks, tactical communications, and the Operations, Planning, and Intelligence cells (see Figure 4) would be from enemy insider moles, or from introducing problems (e.g., malicious code) across the air gap from the unclassified side by dupes or unsuspecting Blue forces.
IPIB Step 4. All the other steps collect and organize
information so that this step can be performed. Step 4
activities are shown in Figure 6. In this step, using IPIB,
the analyst develops enemy COAs (EnCOAs) in the form of
Attack Trees, as described by Schneier [9]. Each attack
tree node has properties, such as feasibility, cost, risk of
exposure, and amount of damage to the mission (lethality,
in kinetic warfare targeting). EnCOAs can be evaluated
by these properties to select the most likely and/or the
most damaging for further development.
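As a rough sketch of how such nodes and their properties might be represented and ranked, consider the following Python fragment. The field names, weighting scheme, and example values are our own illustrative assumptions rather than the IPIB prototype's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class AttackNode:
    """One node in an enemy course-of-action (EnCOA) attack tree."""
    name: str
    feasibility: float     # 0..1, likelihood the step can be executed
    cost: float            # relative resource cost to the adversary
    exposure_risk: float   # 0..1, chance the step is detected
    mission_damage: float  # 0..1, damage to the supported mission (lethality)
    children: list = field(default_factory=list)

def leaves(node):
    """Yield the leaf steps of an attack tree."""
    if not node.children:
        yield node
    else:
        for child in node.children:
            yield from leaves(child)

def likelihood(root):
    """Crude EnCOA likelihood: feasible, low-exposure paths score high."""
    steps = list(leaves(root))
    return sum(s.feasibility * (1 - s.exposure_risk) for s in steps) / len(steps)

def rank_encoas(roots):
    """Order candidate EnCOAs by likelihood, then by potential mission damage."""
    return sorted(roots,
                  key=lambda r: (likelihood(r),
                                 max(s.mission_damage for s in leaves(r))),
                  reverse=True)

# Hypothetical example: two candidate EnCOAs against the AOC.
dos = AttackNode("Bogus OSPF routing (DoS)", 0.6, 0.4, 0.9, 0.7)
insider = AttackNode("Insider modifies database", 0.8, 0.3, 0.2, 0.9,
                     children=[AttackNode("Plant human insider", 0.7, 0.5, 0.2, 0.0),
                               AttackNode("Modify database", 0.9, 0.1, 0.3, 0.9)])
for coa in rank_encoas([dos, insider]):
    print(coa.name, round(likelihood(coa), 2))
```

Note how a low exposure risk boosts a path's score in this toy ranking, mirroring the adversary model's preference for stealthy insider approaches over easily detected DoS attacks.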
By comparing the EnCOAs to the defended network
architecture, the analyst can identify Named Areas of
Interest (NAI), where enemy activity is likely to be
observed. For each NAI, one can identify potential
observables - IDS reports, effects of attacks, etc. Since
NAIs are associated with one or more EnCOAs,
observations provide indicators of which ones are in play
during attacks. Thus, EnCOAs are hypotheses, and NAIs
provide intelligence/surveillance collection points for
preloading multiple hypothesis reasoners or fusion systems
for situation awareness. Attack tree information can be
extracted and sorted in tabular displays other than NAI lists and EnCOA lists. The IPIB design incorporates two other products from kinetic warfare IPB: Event Matrices and Countermeasure Synchronization Matrices. Event matrices are time-sequenced NAIs, based on predecessor-successor relationships in the EnCOA/attack tree, and are sorted by expected time of occurrence. Synchronization matrices display essentially the same information, but are keyed by countermeasure and displayed as a Gantt chart with effective times of activating and deactivating countermeasures.

1. Identify the adversary's likely objectives and desired end state
2. Identify the full set of courses of action available to the adversary
3. Evaluate and prioritize each course of action
4. Develop each course of action in the amount of detail time allows
5. Identify initial collection requirements
Figure 6. IPIB Activities for Step 4

For the Chenho experiment, the NAI list and the event and countermeasure synchronization matrices were generated manually, in the form of Excel spreadsheets, by parsing the EnCOA attack trees. Space does not permit reproducing these matrices or the attack trees here. The attack trees alone consisted of over forty pages.
The authors should point out that, through pure oversight, there were no life cycle attacks postulated in the IPIB process. Any credit achieved in the assessment phase for defenses against them must go to standard practices and procedures, not specific consideration of adversary intent.
IPIB Step 5. The above IPIB products provide actionable, predictive intelligence as a basis for developing a full-blown Cyber Defense Plan (CDP). First, they provide for another pass through Step 2 to implant additional Prevent countermeasures. Second, they indicate standard operating practices and procedures for technical, administrative, physical, and operational security. Then, they suggest high payoff points for IDS sensor placement - to detect attacks against the Mission, not mere network attacks. The countermeasure synchronization matrix provides guidelines for dynamic IA policies - if these events are seen, then do this: change INFOCON, close a port, switch to an alternate service, etc. For the Chenho experiment, the Novice Blue Team generated a CDP manually as a Microsoft Word document with Excel and PowerPoint attachments. Modeled on kinetic warfare Area Air Defense Plans, the full CDP is posted in the CC20008 Experiments folder at the DARPA web site [6]. As far as we can determine, this is the first instance of an organized CDP or cyber playbook, and in particular, one developed with the aid of IPIB.
The Novice Blue Team strategy was to:
• Define critical mission functions and associated network entities, and instrument and monitor key components.
• Develop a comprehensive cyber defense strategy integrating operational, information, and physical security solutions to provide a broad range of defensive coverage based on the adversary's potential courses of action and objectives.
• Where the indicator or observable is effects based and not covered by existing prevention or detection technologies, implement operational practices and procedures to detect and mitigate effects (respond and recover).
• Attempt to set up an indications and warning process outside the defended networks through cooperative agreements with related networks and providers.
• Restrict everything unless specifically authorized.
The Novice Blue Team expended on the order of 100 labor hours in developing the CDP. This included preparation and editing of the CDP as a finished operations/intelligence product. The prototype IPIB tool, when completed, is intended to substantially reduce this labor.
3.2. Expert Blue Team Process and Outputs
As previously mentioned, the Expert Blue Team was made up of a network security expert currently working on the DARPA Information Assurance program and a network security expert employed at a local commercial network security consulting firm. This Expert Blue Team used current security best practices to devise their network defense plan.
In general, the Expert Blue Team spent most of their effort in carefully selecting, placing, and configuring prevent mechanisms in the various LANs. They used some detection tools as well. Their process for determining adversary COA and likely targets was based primarily on engineering judgment. Their defense strategy was to:
• Employ proven standard practices, procedures, and solutions in a tightly defined network environment.
• Strictly control access with firewall rule sets.
• Use network, host and applications based detectors
and prevent tools to monitor the environment,
concentrating on prevention.
• Monitor and audit boundary controls with
extensive logging and analysis, and
• Restrict everything unless specifically authorized.
The Expert Blue Team expended less than forty hours
in the execution phase prior to assessment.
3.3. Combined Blue Team Process and Outputs
As mentioned in section 2.3.1, the Combined Blue Team
consisted of a commercial network security expert from
the Expert Blue Team and one of the security novices
from the Novice Blue Team. The Combined Blue Team’s
process consisted of discussing key components of the
IPIB process with a view to maximizing the expert's
technical knowledge of best security practices and
procedures through IPIB. We reiterate that the security
expert did not explicitly use the IPIB process. Due to
resource constraints, the results of the Novice Blue Team
network defense strategy and IPIB process output were
evaluated from a security expert’s perspective. The joint
strategy was then devised, with the intent to create the best
defensive strategy possible by integrating knowledge
about the mission critical functions derived from the IPIB
process with the technical IA skills for detection and prevention from the Expert Blue Team. The Combined Blue Team used the Novice Blue Team's COA and target predictions directly, and the IPIB process itself assisted the Combined Blue Team in placement of countermeasures. The two sets of Standards and Procedures were merged to create a best of both worlds approach.
The Combined Blue Team expended less than twenty labor hours to develop their defenses.
Table 1 summarizes and compares the defensive strategies of all three teams.

Table 1. Blue Team Defense Strategies

Expert Blue Team:
• Network based detect and prevent tools, including separate Firewalls and VPNs between network segments (i.e., Plans, Operations, Intel, Logistics, and Core Cells each had a separate VPN).
• Host and applications based detect and prevent tools.
• Network security tools (e.g., anti-virus software).
• Hardware (e.g., switches and encryption devices; separate Log Server in Core Cells).
• Highly specific security engineering best practices/solutions, to include disabling application services; hardening the O/S; and logging and monitoring of boundary controller activities.
• Primary concern was defending against an externally based adversary.

Novice IPIB Team:
• Use the IPIB process to define critical mission functions and associated network entities, and identify critical nodes and applications warranting defensive instrumentation and monitoring of both external and internal infrastructure.
• Network based IDS and prevent tools, including a Firewall at the Primary Router and a single VPN at the AOC/WOC.
• Host and applications based detectors and prevent tools, especially Tripwire or equivalent.
• Network security tools (e.g., anti-virus software).
• Hardware (e.g., switches and encryption devices; logging assumed, but no separate Log Server).
• Augment technical best practices and solutions with operational procedures and practices to maximize defensive posture (e.g., two-man rule for critical database transactions, manual checks for effects-based observables). Technical practices more general (less detailed and weaker) than EBT.
• Implement restraint and constraint metrics, and log and monitoring functions for critical database entities, to mitigate insider risk.
• Institute an enterprise wide (external, civilian and related military networks) manual and message-based indications and warning process for attack and countermeasure coordination and execution.
• Primarily concerned with internal adversaries (authorized/co-opted users), externally based adversaries masquerading as such, and maintaining mission-critical operations, even when under attack.

Combined Expert Blue Team and IPIB Team:
• Integrate IPIB derived products to assist in the selection and placement of detect and prevent IA sensors – a best of both worlds approach. See the EBT network based and application based detector and prevent details, network security tools, and hardware above, but with added focus on mission critical applications/capabilities. Augment EBT engineering best practices and solutions with operational procedures and practices from the IPIB Team.
• Add a duplicate, hot standby for the Primary Routers at the entry of the AOC/WOC (identified as single points of failure).
• Implement restraint and constraint metrics, and monitoring and auditing of critical database entities, to mitigate potential insider based attacks.
• Discussed use of a MANTRAP honeynet, but did not present it formally in the final defense plan.
• Provide defense against insider (authorized/co-opted users) and externally based adversaries, including masqueraders.
3.4 Red Team Process and Outputs
The Red Team was composed of experienced penetration experts who applied their standard process as described in [7], adapted to emulate an enemy nation-state rather than a cyber-terrorist. They prepared an object model of the AOC and WOC in an effort to determine choke points and high-value targets. Then they determined goals and sub-goals and prepared attack trees that would achieve those objectives. They used criteria like feasibility, potential for delaying or disrupting the JTF mission, and risk of exposure to prune their suite of potential attack trees.
The Red Team proposed attacks outside the JTF defended networks, such as on the CONUS air bases and en route facilities, as did the Novice Blue Team. However, since the Expert Blue Team's defense plan only defended the AOC and WOC, the set of Red Team attacks used in the assessment was pared to the twelve attacks where all three blue teams could be compared. Those attacks are summarized in Table 2.

Table 2. Red Team Attacks

A – Compromise DMZ: Physical access to POP required; Sniff, network discovery; Scan DMZ, map; Attack vulnerabilities; Modify Database.
B – Release Virus into the Wild: Release Virus; Set up Man in the Middle; Trojan anti-virus software; Modify application (PowerPoint) DoS attack.
C – Lifecycle Attack: Sniff traffic/Identify ISP; Plant human insider for intel., or takeover ISP; Modify application.
D – Insider Giving Re-routing Attack: Sniff/identify ISP; Plant human insider/takeover ISP; Plant Trojan SW/HW; Map network; Attack vulnerabilities; Access system; Modify Database.
E – Compromise Home System: Identify home system; Gather intelligence from non-cyber means; Exploit home systems, launch from there; Access critical system; Modify Database.
F – Create own ISP, bogus OSPF: Identify ISP; Create new ISP; Root on router; Bogus OSPF routing (DoS).
G – FW, Scan Discovery and Exploit: Identify ISP; Trojan on ISP's POP/router; Sniff; Firewalk firewall; Scan, map network; Exploit vulnerabilities; Access critical system; Modify Database.
H – Back Door Connection: Identify DNS and IP; Scan for back doors; Map network; Access critical system; Modify Database.
I – Insider Router Mod: Human insider places Trojan on router; Map network; Access critical system; Modify Database.
J – Phone Switch Attack: Identify phone, HVAC services; Plant human insider; War-dial for modems or get intel. from insider; Access phone switch; Modify Database.
K – Trojan E-mail: Identify home locations and users; Crack into home systems; Send Trojan e-mail from home to targets; Access target via Trojan; Modify Database.
L – Database Update: Identify targets; Plant human insider; Plant Trojan; Map network; Exploit vulnerabilities; Access critical system; Modify Database.
The Red Team expended about 120 labor hours in
developing their attacks, independent of the time
spent during assessment.
3.5 Assessment Team Process and Outputs
The Assessment Team consisted of the commercial
network security expert from the Expert Blue Team, a
member of the Novice Blue Team, a Red Team
member, and two Experiment Working Group
members who were neutral and operating in an
experimentation oversight role. This team met for
three days and walked through twelve individual Red
Team attacks for each of the three Blue Team
strategies. The twelve attacks were selected based on
a fair representation of the variety of attacks
presented, and the variety of targets presented, with
consideration going to minimizing any repetition of
the same attack path or target. Scores were assigned
by each Assessment Team member for each attack,
each Blue Team strategy, and each of the four
categories (prevent, detect, predict target, predict
COA).
Representation from each of the three Blue Teams
and the Red Team was critical to the success of the
assessment phase.
Team members provided
necessary clarity of intent, strategy, and decisions
throughout the assessment. Each of the five team
members maintained their own separate score sheet
and decided upon their own rankings. The results are
presented in Section 4. Again, it should be noted that
the predicted target and COA values for the
Combined Blue Team were carried over directly from
the scores of the Novice Blue Team as the predictions
were also carried over directly.
4. Data and Results
Following the Experiment Execution phase, the
members of the Assessment Team subjectively evaluated and scored each Blue Team's success in detecting attacks, preventing attacks, predicting adversary targets and predicting adversary courses of action. The possible scores were as follows:

Ineffective: 1.0
Slightly Ineffective: 2.0
Moderately Effective: 3.0
Effective: 4.0
Highly Effective: 5.0
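For concreteness, the tabulation reduces to simple averaging over assessors and attacks. The sketch below is our own framing, not the Assessment Team's actual tooling; it assumes each score is recorded as an (assessor, attack, team, category) entry on the 1.0 to 5.0 scale.

```python
from statistics import mean

# Hypothetical score sheet: (assessor, attack_id, team, category) -> 1.0..5.0
scores = {
    ("A1", "A", "EBT", "detect"): 2.0,
    ("A2", "A", "EBT", "detect"): 3.0,
    # ... one entry per assessor, attack, team, and category
}

def category_average(scores, team, category):
    """Average one team's scores in one category over all assessors
    and attacks, as tabulated in Table 3."""
    values = [v for (_, _, t, c), v in scores.items()
              if t == team and c == category]
    return round(mean(values), 1)

print(category_average(scores, "EBT", "detect"))  # 2.5 for the toy data above
```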
The IPIB process postulated a number of kinetic/cyber
events that the Red Team could have launched but did not
due to experiment limitations. Predicted kinetic/cyber
events included deception and intrusion of radio and
satellite communications data and voice links critical to
blue force deployment. IPIB also postulated Red Team
Psychological Operations and disinformation based on
kinetic and cyber environment/tools and techniques, in
particular, web-based attacks on networks related to the
defended network but not under the control of the JTF
Commander. The Novice Blue Team issued simulated
warnings and recommendations to the other organizations
describing expected enemy actions based on their set of
attack trees.
The Red Team also proposed external attacks on other units. However, the Expert Blue Team did not prepare any defenses of the CONUS Air Bases, en route facilities, or related networks – only the AOC and WOC. To allow a head-to-head comparison of the three Blue Teams, the Assessment Team agreed to consider only those attacks where both the Novice Blue Team and Expert Blue Team had proposed defenses, so external attacks were not considered.
The average scores across all twelve attacks for each of
the three blue teams in each of the four categories are
provided in Table 3, and graphically in Figure 7.
Table 3: Average assessment of Blue Team capabilities across all twelve Red Team attacks

Category | Expert BT | Novice BT | Combined BT
Detect adversary | 2.0 | 2.9 | 3.0
Prevent adversary | 3.2 | 2.6 | 3.4
Predict RT COA | 3.0 | 3.7 | 3.7
Predict Target | 3.9 | 4.4 | 4.4
Overall Average | 3.0 | 3.4 | 3.6
[Figure 7. Average scores across all attacks: bar chart of the Expert, IPIB, and Combined team scores (0 to 5 scale) in the Detect, Prevent, Predict RT CoA, and Predict Target categories.]
Generally speaking, in all categories except
prevent, the IPIB process showed improvements over
the standard best practices employed by the Expert
Blue Team.
One should be aware that the detection results are
inconclusive. Credit for detection was given solely on
placement of a detector, without consideration of
probabilities of detection or false alarm rates,
although these are known to be less than satisfactory
in the real world. The Expert Blue Team focused on
prevention at the expense of detection due to their
knowledge of the current state of the art in intrusion
detectors. Although this does not affect the results
with respect to the main hypothesis, to obtain realistic
results on detection, the experiment should be run on
a real network with real IDSs.
The twelve attack paths were grouped into five generalized attack categories. The intent was to look for patterns in Blue Team data and results based on general categories of attacks, that is, to analyze data from the attack class perspective. Originally a lifecycle attack category was included; however, the Novice and Combined IPIB Blue Teams simply forgot to include lifecycle attacks in their analysis, so the scores were not representative of IPIB performance. Only one lifecycle attack was executed by the Red Team, and the results were determined to be statistically inconclusive and misleading. Those results were therefore not analyzed from the attack type perspective. The summary of attack types is presented in Table 4. Attack path types correspond to the left hand "ID" column in Table 2.
Table 4: Summary of Attack Types

Attack Category | Attack Paths
Plant Trojans | B, D, G, I, K, L
Denial of Service (DoS) | F
Cyber Only | B, G, H, K
Cyber + Human | A, C, D, E, I, J, L
Attack External to LAN | B, F
[Figure 8. Performance By Attack Type: normalized Detect, Prevent, Predict RT CoA, and Predict Target scores for the Expert, IPIB, and Combined teams, in separate panels for the Plant Trojans, Denial of Service, Cyber Only, Cyber/Human, and Attack External to LAN categories.]
The following data analysis presents normalized data,
using the Expert Blue Team scores as the nominal value
(1.0). This is based on the underlying assumption that the
Expert Blue Team represents current security best
practices and served as the baseline datum against which
the Novice and Combined Blue Teams were
compared.
Figure 8 displays the normalized scores for
detecting attacks, preventing attacks, predicting
adversary COAs, and predicting adversary target for
each of the three teams, by attack type. These scores
are the average of the scores for the multiple attack
paths comprising each attack type.
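For concreteness, the normalization can be sketched as follows. The data layout and names here are our own assumptions, not the experiment's actual worksheets: each Novice or Combined score is divided by the Expert score for the same attack path and category, so the Expert baseline is 1.0 by construction, and an attack type's per-category values are averages over its constituent attack paths.

```python
from statistics import mean

CATEGORIES = ("detect", "prevent", "predict_coa", "predict_target")

def normalize_by_type(team_scores, expert_scores, attack_paths):
    """Per-category scores relative to the Expert baseline (1.0), averaged
    over the attack paths that make up one attack type (as in Figure 8).
    team_scores and expert_scores: {path: {category: raw score}}."""
    return {c: mean(team_scores[p][c] / expert_scores[p][c] for p in attack_paths)
            for c in CATEGORIES}

def type_total(team_scores, expert_scores, attack_paths):
    """Sum of the four normalized category values: one row of Table 5."""
    return sum(normalize_by_type(team_scores, expert_scores, attack_paths).values())
```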
Table 5 summarizes the scores from Figure 8 by attack type relative to the Expert Blue Team. This table displays the overall team performance based on type of attack by combining the scores associated with detecting, preventing, predicting adversary targets and predicting adversary COAs.

Table 5: Overall Performance of Each Team, based on Attack Type

Attack Type | Expert | IPIB | Comb.
Plant Trojans | 4 | 4.81 | 5.09
Denial of Service | 4 | 4.91 | 4.84
Cyber Only | 4 | 4.16 | 4.97
Cyber/Human | 4 | 5.08 | 5.22
Attack Ext. to LAN | 4 | 5.12 | 4.99
Total Score | 20 | 24.09 | 25.11
5. Conclusions
Overall, the main hypothesis was supported. The
IPIB process enabled a team of relative IA novices to
develop a Cyber Defense Plan that took a more holistic
view of the cyber threat and identified substantially
more Enemy COAs (Red Team attacks) than a team of
traditional IA experts.
The IPIB process provides actionable, predictive
information about probable adversary Courses of
Action prior to actual hostilities (Cyber Attacks). IPIB
provides the basis for a comprehensive Cyber Defense
Plan to include recommendations on sensor
placements, preferred defensive countermeasures and
tactics, policies and procedures to provide the broadest
range of coverage to address a spectrum of adversary
characteristics from novices to foreign nation-state
supported professionals.
Other general conclusions include:
• The Novice Blue Team explicitly identified insider attacks, and proposed "effects-based observables" for detection and procedural countermeasures for prevention. The Expert Blue Team made no allowance for the insider threat other than traditional placement of firewalls, intrusion detection systems and virtual private networks.
• IA experts, without the use of IPIB, tended to develop "Maginot Line" defenses inside a defended network and wait for attacks - there was little attempt to move the Forward Edge of the Battle Area (FEBA) outside the perimeter of the defended network. They gave virtually no consideration to mission-critical components or the lethality of potential attacks.
• The Expert Blue Team did a superior job of identifying specific technology-based and network-based Standard Practices and Procedures (SPP). The IPIB Blue Team addressed SPPs more in generalities, but did include SPPs for operational and administrative security.
• When the two Blue Teams were combined, essentially initiating the Expert Blue Team into the IPIB process, the result was an improved CDP. The members of the Combined Blue Team also benefited from one another's perspective.
• The Novice Blue Team expended significantly more effort than the Expert Blue Team. However, their effort was comparable to the Red Team effort.
• Overall, as expected, the Combined team outperformed the Expert and Novice teams.
Speaking specifically to the four sub-hypotheses:
Detection: In each case, except life cycle attacks, the
IPIB process improved the detection ability of the network
defense strategy, subject to the experimental caveats that
detection results are subjective. The Novice Blue Team
admitted that they simply forgot life cycle attacks. The
IPIB process does help indicate what IDSs one would need
to acquire, and helps identify where IDS research should be
focused by naming observables or conditions for which
there are no IDSs.
Prevention: The IPIB process did assist the experts in
further improving their prevention capabilities. It can be
concluded that the IPIB process provides the expert with
additional information to consider when designing the
prevention portion of the network defense strategy.
The Novice Blue Team did not perform as well in
prevention, but this should be ascribed to lack of expertise
in Prevent mechanisms, rather than a weakness of the IPIB
process.
Target Prediction: The IPIB process clearly predicted
attack targets better than security best practices in all cases
but the denial of service attack types. The IPIB adversary
model judged DoS attacks to be lower probability until the
later stages of air operations because of the ease of
detection. For attacks external to the LAN, IPIB was
superior to the security expert’s ability to predict the
attack types.
COA Prediction: In all cases but the strictly cyber
attacks, the IPIB process did a better job of predicting
the Red Team’s course of action than the security
experts did. We judge that this structured process
encourages the human to think through all aspects of
hostilities in a methodical way which allows him or her
to identify potential COAs more accurately. IPIB takes
a much more comprehensive view of adversary goals,
vulnerabilities and attack axes to determine EnCOAs.
According to the Red Team leader, IPIB predicted the
same types and categories of attacks against the
military and civilian cyber and kinetic infrastructure
that they would have used if permitted.
These results indicate that IPIB provides substantial
improvements to current network security best
practices. It is not, however, a silver bullet, but must
be used in concert with conventional IA measures. The
Red Team has requested to be a transition customer for
the IPIB prototype - solid evidence that the IPIB
process has utility, and could be adapted for offensive
information operations.
The process forces examination of topics and elements that traditional cyber defense experts tend to rule "out of bounds" - but that even a relatively friendly Red Team, much less a truly hostile adversary, does not.
6. Lessons Learned
There are two categories of lessons learned: (1)
about the conduct of this type of experiment and (2)
lessons for the IA community at large. The first set was
captured during the CC20008 Experiment Hot Wash
conducted immediately following the conclusion of the
execution phase and throughout the Assessment phase.
• Process based experiments are complicated to develop, score and assess. The scenario was complex, but if it is restricted too much, the attack set becomes trivial.
• Process-based experiments rely more on human interaction and decision making.
• A common understanding of the information infrastructure is necessary. Each team needs to state assumptions early on in the experiment.
• Although the results were encouraging, the attack set was very limited. It would be beneficial to run the experiment with many more attacks of various types.
• The evaluations were subjective. The composition of the Assessment Team should have compensated for this, but running real attacks against real systems would likely provide more accurate results. It was critical to have Red and Blue Team members present during the assessment phase.
• Assessment team composition is a key variable to understanding varied approaches.
• An attempt was made to get military defenders for the Expert Blue Team, but due to the press of world events they were not available. Instead, we used commercial experts and commercial best practices. It would be worthwhile to run this, or a similar experiment, with military defenders on a real network with real attacks and countermeasures.
The second set of lessons learned is applicable to all of
us involved in developing network defenses, and the IA
community at large:
• IPIB helps by forcing defense strategy development into a process that takes a more complete view of the battlespace. Considering the overall environment and modeling the adversary (IPIB Steps 1 and 3) markedly benefited the Novice Blue Team.
• IA-related intelligence databases do not exist in usable form, that is, in a common, searchable, processable structure that contains data elements similar to kinetic warfare intelligence databases. To use the intelligence vernacular, there is no collection and delegated production infrastructure to populate and maintain these databases.
• Defenders must move the FEBA outside their defense perimeter. This means we as a community need to develop forms of layered defenses outside the firewall.
• No matter how much you protect yourself, the cyber world depends on humans to assist. In other words, a complete cyber defense requires operational, physical, and administrative security as well, and not just technical IA measures.
• IPIB is a data and labor-intensive process. Automation must be used to reduce the data collection and organization effort so that human resources can be focused on higher-level analytical functions. Fortunately, much of the information collection required for IPIB is already being performed in order to design and constitute a defended network; Annex K of a joint military operations order is an example. Other venues must perform similar planning, whether there is IPIB or not, to field their information systems. What we, the IA community, need to do is to ensure that this kind of data preparation is expressed in a usable, shareable form, such as XML, so that IPIB applications can import it for analytical use.
• IPB for cyberspace is a relatively new concept. This experiment has demonstrated its potential efficacy. The forthcoming DARPA prototype will allow users to evaluate it for operational use. However, we have found that it spans an entire organization, not just the Intelligence, Computers and Communications personnel. In military terms, it spans the entire J-Staff. Staffs need to start thinking about the roles and responsibilities of the Operations, Planning, Logistics, and Administration organizations, as well as preparing draft Concepts of Operations for a complete cyber defense.
• Advanced adversaries (cyber-terrorists, criminals, enemy nation-states) attempt to avoid premature detection. This means they will tend to co-opt insiders or masquerade as someone they are not, such as a cleared user. Therefore, we must devote more resources to countering the hostile insider or equivalent insider. If we cannot prevent misuse, we clearly must find ways to detect it, or its effects, and establish processes and procedures that allow us to respond and recover.
• Attack trees can be reused, and become an extensible checklist for subsequent IPIB analysis. Not only are they useful within IPIB against the same threat, they can be reused against a different threat in the same or a different theater. Thus, a repository of attack trees becomes a productivity aid, and also helps overcome the novice factor, such as forgetting life-cycle attacks and relegating DoS attacks to lower probability.
• One of the differences observed in this experiment was in viewpoints. The Expert Blue Team had the view of a senior, experienced systems administrator, while the IPIB process forced the more complete view of the Commander. We have observed this disparity in viewpoints in other experiments as well. We judge that it is imperative for network defenders to adopt a higher-level view, or aggressors will always have the advantage.
• Network defenders cannot ignore the mission. Adversaries with operational goals will tend to attack targets that have an adverse effect on the strategic, operational, or tactical mission the defended network is supporting. IPIB forces analysts to consider the mission and the potential lethality of attacks in prioritizing enemy COAs and attack targets.
• Red Teams and advanced adversaries are using some form of IPIB against you.
BIBLIOGRAPHY
1) US Army Field Manual 34-130, "Intelligence Preparation of the Battlefield".
2) JCS Pub 3-13, "Joint Doctrine for Information Operations".
3) JCS Pub 2-01.3, "Joint Intelligence Preparation of the Battlespace (JIPB)".
4) Williams, J.K.; Moore, R.A.; McCain, C.M.; and Grant, J.F., "Intelligence Preparation of the Information Battlespace: An Information Survivability Opportunity", Proceedings, MilCom 2000, Los Angeles, October 2000.
5) Moore, R.A.; Williams, J.K.; and McCain, C.M., "Intelligence Preparation of the Information Battlespace: A Cyber Playbook for Information Survivability", Proceedings, Information Survivability Workshop 2000, Boston, October 2000.
6) The DARPA Third Generation Security DocuShare web site (https://archive.ia.isotic.org/), including the DARPA Cyber Panel program Cyber Command and Control (CC2) Experimentation repository for the CC20008 experiment, https://archive.ia.isotic.org/dscgi/ds.py/View/Collection-366.
7) Wood, Bradley and Schudel, Gregg, "Modeling Behavior of the Cyber Terrorist", Countering Cyber-Terrorism Workshop, June 22-23, 1999, University of Southern California, Information Sciences Institute, Marina del Rey, CA. Proceedings available at http://www.isi.edu/cctws or via email from Clifford Neuman (bcn@isi.edu).
8) The Zel Technologies IA web site, http://projects.zeltech.com/ia/.
9) Schneier, Bruce, "Attack Trees", Dr. Dobb's Journal, v. 24, n. 12, December 1999, pp. 21-29. Also available at http://www.counterpane.com/attacktrees-ddj-ft.html.