For Crowdsourcing: With Virtual Environments to Unlock Innovation
A host of technologies and concepts holds the key to reducing development time linked to real warfighter evaluation and need. Innovations in MBSE, networking, and virtual environment technology can enable collaboration among the designers, developers, and end users, and can increasingly be utilized for warfighter crowdsourcing (Smith & Vogt, 2014). The innovative process can link ideas generated by warfighters, using game-based virtual environments, with the ideas, ranking, and filtering of the greater engineering staff. The DoD, following industry's lead in crowdsourcing, can utilize the critical success factors and methods developed in this research to reduce the time needed to develop and field critical defense systems. Innovative use of virtual environments and crowdsourcing can increase the usefulness of weapon systems to meet the real needs of the true stakeholders: the warfighters.
The DoD, as a whole, has begun looking for efficiency by employing innovation, crowdsourcing, MBSE, and virtual environments (Zimmerman, 2015). Industry has led the way with innovative use of crowdsourcing for design and idea generation. Many of these methods utilize the public at large. However, this study focuses on crowdsourcing that uses warfighters and the larger DoD engineering staff, along with MBSE methodologies. This study focuses on finding the critical success factors, or key elements, and developing a process (framework) to allow virtual environments and MBSE to continually produce feedback from key stakeholders throughout the design cycle, not just at the beginning and end of the process. The proposed process was developed based on feedback from a panel of experts using the Delphi method. The Delphi method, created by RAND in the 1950s, allows for exploration of solutions based on expert opinion (Dalkey, 1967). This study utilized a panel of 20 experts in modeling and simulation (M&S). The panel was a cross section of Senior Executive Service members; senior Army, Navy, and DoD engineering staff; and academics with experience across the range of virtual environments, M&S, MBSE, and human systems integration (HSI). The panel developed critical success factors in each of the five areas explored: MBSE, HSI, virtual environments, crowdsourcing, and the overall process. HSI is an important part of the study because virtual environments can enable earlier detailed evaluation of warfighter integration in the system design.
Many researchers have conducted studies seeking methods to make military systems design and acquisition more fruitful. A multitude of studies conducted by the U.S. Government Accountability Office (GAO) has also investigated the failures of the DoD to move defense systems from the early stages of conceptualization to finished designs useful to warfighters. The DoD is now looking to follow the innovation process emerging in industry to kick-start the innovation cycle and utilize emerging technologies to minimize the time from initial concept to fielded system (Hagel, 2014). This is a challenging goal that may require significant review and restructuring of many aspects of the current process. In his article "Digital Pentagon," Modigliani (2013) recommended a variety of changes, including changes to enhance collaboration and innovation. Process changes and initiatives have been a constant in DoD acquisition for the last 25 years. As weapons have become more complex, software-intensive, and interconnected, the DoD has struggled to find the correct mix of process and innovation. DoD acquisition policy encourages and mandates the utilization of systems engineering methods to design and develop complex defense systems. It is hoped that the emergence of MBSE concepts may provide a solid foundation and useful techniques that can be applied to harness and focus the fruits of the rapidly expanding innovation pipeline.
The goal and desire to include more M&S in defense system design and development has continually increased as computer power and software tools have become more powerful. Over the past 25 years, many new efforts have been launched to focus the utilization of advanced M&S. The advances in M&S have led to success in small pockets and in selected design efforts, but have not diffused fully across the entire enterprise. Several different process initiatives have been attempted over the last 30 years. The acquisition enterprise is responsible for the process that takes ideas for defense systems; initiates programs to design, develop, and test a system; and then manages the program until the defense system is in the warfighters' hands. A few examples of noteworthy process initiatives are Simulation Based Acquisition (SBA); Simulation and Modeling for Acquisition, Requirements, and Training (SMART); Integrated Product and Process Development (IPPD); and now, Model Based Systems Engineering (MBSE) and Digital Engineering Design (DED) (Bianca, 2000; Murray, 2014; Sanders, 1997; Zimmerman, 2015). These process initiatives (SBA, SMART, and IPPD) helped create some great successes in DoD weapon systems; however, the record of defense acquisition and the amount of time required to develop more advanced and increasingly complex interoperable weapon systems has been mixed at best. The emerging MBSE and DED efforts are too new to fully evaluate their contribution.
… to the warfighter (David, 1995; Lyons, Long, & Chait, 2006). This is a rare example of warfighter input and unique M&S efforts leading to a successful program. In contrast to Javelin's successful use of innovative modeling and simulation is the Army's development of Military Operations on Urbanized Terrain (MOUT) weapons. A new urban shoulder-launched munition for MOUT application, now called the Individual Assault Munition (IAM), has been in design for 20 years and is still under development. The MOUT weapon acquisition failure was in part due to challenging requirements. However, the complex competing technical system requirements might benefit from the use of detailed virtual prototypes and innovative game-based war- …
BACKGROUND
Literature Review
This article builds upon detailed research by Murray (2014); Smith and Vogt (2014); London (2012); Korfiatis, Cloutier, and Zigh (2015); Corns and Kande (2011); and Madni (2015) that covered elements of crowdsourcing, virtual environments, gaming, early systems engineering, and MBSE. The research study described in this article was intended to expand the work discussed in this section and determine the critical success factors for using MBSE and virtual environments to harvest crowdsourcing data from warfighters and stakeholders, and then provide that data to the overall Digital System Model (DSM). The works reviewed in this section address virtual environments and prototyping, MBSE, and crowdsourcing. The majority of these are focused on the conceptualization phase of product design. However, these tools can be used for early product design and integrated into the detailed development phase up to Milestone C, the production and deployment decision.
Many commercial firms and some government agencies have studied the use of virtual environments and gaming to create "serious games" that have a purpose beyond entertainment (National Research Council [NRC], 2010). Commercial firms and DARPA have produced studies and programs to utilize an open innovation paradigm. General Electric, for one, is committed to "crowdsourcing innovation—both internally and externally … [b]y sourcing and supporting innovative ideas, wherever they might come from…" (General Electric, 2017, p. 1).
Researchers from many academic institutions are also working with open innovation concepts and leveraging input from large groups for concept creation and research into specific topics. Dr. Stephen Mitroff of The George Washington University created a popular game while at Duke University that was artfully crafted not only to be entertaining, but also to provide researchers access to a large pool of research subjects. Figure 1 shows a sample game screen. The game allows players to detect dangerous items from images created to look like a modern airport X-ray scan. The research utilized the game results to test hypotheses related to how the human brain detects multiple items after finding similar items. In addition, the game allowed testing on how humans detect very rare and dangerous items. The game platform allowed for a large cross section of the population to interact and assist in the research, all while having fun. One of the keys to the usefulness of this game as a research platform is the ability to "phone home" or telemeter the details of the player-game interactions (Drucker, 2014; Sheridan, 2015). This research showed the promise of generating design and evaluation data from a diverse crowd of participants using game-based methods.
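As a rough illustration of the "phone home" capability that makes such a game useful as a research platform, the sketch below (Python; the endpoint URL, event fields, and function names are assumptions for illustration, not details of the actual game) shows one way a game client might batch player-game interactions and report them to a research server:

    import json
    import time
    import urllib.request
    from dataclasses import dataclass, asdict, field

    # Hypothetical collection endpoint -- not from the cited research.
    TELEMETRY_URL = "https://example.org/telemetry"

    @dataclass
    class GameEvent:
        """One player-game interaction, e.g., a target detection attempt."""
        player_id: str
        scenario: str
        event_type: str          # e.g., "item_flagged", "scan_completed"
        payload: dict = field(default_factory=dict)
        timestamp: float = field(default_factory=time.time)

    def telemeter(events: list[GameEvent]) -> None:
        """Batch events and 'phone home' to the research collection server."""
        body = json.dumps([asdict(e) for e in events]).encode("utf-8")
        req = urllib.request.Request(
            TELEMETRY_URL,
            data=body,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            resp.read()  # discard acknowledgment

    # Example: record a rare-item detection for later hypothesis testing.
    events = [GameEvent("p-042", "airport-xray-07", "item_flagged",
                        {"item": "water_bottle", "correct": True, "rt_ms": 812})]
    # telemeter(events)  # disabled here: the endpoint is illustrative only

Batching events keeps network traffic out of the play loop; the same pattern scales from a research game to the warfighter crowdsourcing environments discussed later in this article.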
Process
Several examples of process-related research that illustrate beginning inquiry into the use of virtual environments and MBSE to enhance systems development are reviewed in this section. Marine Corps Major Kate Murray (2014) explored the data that can be gained by the use of a conceptual Early Synthetic Prototype (ESP) environment. The envisioned environment used game-based tools to explore requirements early in the design process. The focus of her study was "What feedback can be gleaned, and is it useful to decision makers?" (Murray, 2014, p. 4). This innovative thesis ties together the major concepts needed to create an exploration of design within a game-based framework. The study concludes that ESP should be utilized for Pre-Milestone A efforts, which are dominated by concept development and materiel solutions analysis. Murray also discussed many of the barriers to fully enabling the conceptual vision that she described. Such an ambitious project would require the warfighters to be able to craft their own scenarios and add novel capabilities. An interesting viewpoint discussed in this research is that the environment must interest the warfighters enough to have them volunteer their game-playing time to assist in the design efforts. The practical translation of this is that the environment created must look and feel like similar games played by the warfighters, both in graphic detail and in terms of game challenges, to "keep … players engaged" (Murray, 2014, p. 25).
Corns and Kande (2011) describe VE-Suite, a virtual engineering tool from Iowa State University. This tool utilizes a novel architecture, including a virtual environment, in which three main engines interact: an Xplorer, a Conductor, and a Computational Engine. In this effort, Systems Modeling Language (SysML) and Unified Modeling Language (UML) diagrams are integrated into the overall process. A sample environment is depicted simulating a fermentor and displaying a virtual prototype of the fermentation process controlled by a user interface (Corns & Kande, 2011). The extent and timing of the creation of detailed MBSE artifacts, and the amount of integration achievable or even desirable among specific types of modeling languages (e.g., SysML and UML), are important areas of study.
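To make the three-engine division of labor concrete, here is a minimal schematic in Python. VE-Suite itself is implemented quite differently, so the class and method names below are illustrative assumptions rather than its actual API:

    # Schematic only: VE-Suite's real interfaces differ.
    class ComputationalEngine:
        """Runs the underlying process model, e.g., a fermentation simulation."""
        def step(self, setpoints: dict) -> dict:
            temp = setpoints.get("temperature_C", 30.0)
            # Toy kinetics standing in for the real fermentor model.
            growth = max(0.0, 1.0 - abs(temp - 32.0) / 10.0)
            return {"biomass_rate": growth}

    class Conductor:
        """User interface: collects setpoints and forwards them."""
        def __init__(self):
            self.setpoints = {"temperature_C": 32.0}

    class Xplorer:
        """Virtual environment: renders the current state of the prototype."""
        def render(self, state: dict) -> None:
            print(f"[Xplorer] biomass_rate={state['biomass_rate']:.2f}")

    # One interaction cycle among the three engines.
    conductor, engine, xplorer = Conductor(), ComputationalEngine(), Xplorer()
    xplorer.render(engine.step(conductor.setpoints))

The separation mirrors the architecture Corns and Kande describe: the Conductor gathers user intent, the Computational Engine advances the underlying model, and the Xplorer renders the resulting virtual prototype.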
Crowdsourcing
Wired magazine editors Jeff Howe and Mark Robinson coined the
term “crowdsourcing” in 2005. In his Wired article titled “The Rise of
Crowdsourcing,” Howe (2006) described several types of crowdsourcing.
The working definition for this effort is "… the practice of obtaining needed
services, ideas, design, or content by soliciting contributions from a large
group of people and especially from the system stakeholders and users
rather than only from traditional employees, designers, or management"
(Crowdsourcing, n.d.).
The best fit for crowdsourcing, conceptually, for this current research project is the description of research and development (R&D) firms utilizing the InnoCentive Website to gain insights from beyond their in-house R&D team. A vital feature in all of the approaches is the use of the Internet and modern computational environments to find needed solutions or content, using the …
In 2015, the U.S. Navy launched "Hatch." The Navy calls this portal a "crowdsourced, ideation platform" (Department of the Navy, 2015). Hatch is part of a broader concept called the Navy Innovation Network (Forrester, 2015; Roberts, 2015). With this effort, the Navy hopes to build a continuous process of innovation and minimize the barriers to information flow to help overcome future challenges. Novel wargaming and innovation pathways are to become the norm, not the exception. The final tools that will fall under this portal are still being developed. However, it appears that the Navy has taken a significant step forward to establish structural changes that will simplify the ideation and innovation pipeline, and ensure that the Navy uses all of the strengths of the total workforce. "Crowdsourcing, in all of its forms, is emerging as a powerful tool…. Organizational leaders should take every opportunity to examine and use the various methods for crowdsourcing at every phase of their thinking" (Secretary of the Navy, 2015, p. 7).
The U.S. Air Force has also been exploring various crowdsourcing concepts. It has introduced the Air Force Collaboratory Website and held a number of challenges and projects centered on three different technology areas. Recently, the U.S. Air Force opened a challenge prize on its new Website, http://www.airforceprize.com, with the goal of crowdsourcing a design concept for novel turbine engines that meet established design requirements and can pass the validation tests designed by the Air Force (U.S. Air Force, n.d.; U.S. Air Force, 2015).
As the DoD looks to use MBSE concepts, new versions of DoD Instruction 5000.02 and new definitions have emerged. These concepts and definitions can assist in developing and providing the policy language to fully utilize an MBSE-based process. The Office of the Deputy Assistant Secretary of Defense for Systems Engineering is working to advance several new approaches related to MBSE. New definitions have been proposed for Digital Threads and DED, using a DSM. The challenges of training the workforce and finding the correct proof-of-principle programs are being addressed (Zimmerman, 2015). These emerging concepts can help enable evolutionary change in the way DoD systems are developed and designed.
SELECTED VIRTUAL ENVIRONMENT ACTIVITIES
Army
Within the Army, several efforts are underway to work on various aspects of virtual environments/synthetic environments that are important to the Army and to this research. Currently, efforts are being funded by the DoD at the Army Capabilities Integration Center (ARCIC), the Institute for Creative Technologies (ICT) at the University of Southern California, the Naval Postgraduate School (NPS), and the U.S. Army Aviation and Missile Research, Development, and Engineering Center (AMRDEC). The ESP efforts managed by Army Lieutenant Colonel Vogt continue to look at building a persistent, game-based virtual environment that can involve warfighters voluntarily in design and ideation (Tadjdeh, 2014). Several prototype efforts are underway at ICT and NPS to help evolve a system that can provide feedback from the warfighters, playing game-based virtual environments that answer real design and strategy questions. Key questions include what metrics to utilize, how to distribute the games, and whether the needed data can be saved and transmitted to the design team. Initial prototype environments have been built and tested. The ongoing work also looks at technologies that could enable more insight into the HSI issues by attempting to gather warfighter intent from sensors or camera data relayed to the ICT team (Spicer et al., 2015).
The "Always ON-ON Demand" effort, managed by Dr. Nancy Bucher (AMRDEC) and Dr. Christina Bouwens, is a larger effort looking to tie together multiple simulations and produce an "ON-Demand" enterprise repository. The persistent nature of the testbed and the utilization of virtual environment tools, including the Navy-developed Simulation Display System (SIMDIS™) tool, which utilizes the OpenSceneGraph capability, offer exploration of many elements required to utilize virtual environments in the acquisition process (Bucher & Bouwens, 2013; U.S. Naval Research Laboratory, n.d.).
Navy
Massive Multiplayer Online War Game Leveraging the Internet (MMOWGLI) is an online strategy and innovation game employed by the U.S. Navy to tap the power of the "crowd." It was jointly developed by the NPS and the Institute for the Future. Navy researchers developed the message-based game in 2011 to explore issues critical to the U.S. Navy of the future. The game is played based on specific topics and scenarios. Some of the games are open to the public and some are more restrictive. The way to score points and "win" the game is to offer ideas that other players comment upon, build new ideas upon, or modify. Part of the premise of the approach is based on this statement: "The combined intelligence of our people is an unharnessed pool of potential, waiting to be tapped" (Moore, 2014, p. 3). Utilizing nontraditional sources of information and leveraging the rapidly expanding network and visualization environment are key elements that can transform the current traditional pace of design and acquisition. In the future, it might be possible to tie this tool to more highly detailed virtual environments and models that could expand the impact of the overall scenarios explored and the ideas generated.
RESEARCH QUESTIONS
The literature review demonstrates that active research is ongoing into crowdsourcing, MBSE, and virtual environments. However, there is not yet a fully developed process model, nor an understanding of the key elements, that would provide the DoD a method to fully apply these innovations to successful system design and development. The primary research questions that this study examined to meet this need are:
• What are the critical success factors that enable game-based virtual environments to crowdsource design and requirements information from warfighters (stakeholders)?
… seek additional input from members of the panel. A large number of possible critical success factors emerged for each focus area. Figure 2 shows the demographics of the expert panel (n=20). More than half (55 percent) of the panel have Doctoral degrees, and an additional 35 percent hold Master's degrees. Figure 2 also shows the self-ranked expertise of the panel. All have interacted with the defense acquisition community. The panel has the most experience in M&S, followed by expertise in virtual environments, MBSE, HSI, and crowdsourcing. Figure 3 depicts a word cloud created from the content provided by the experts in the interview survey; the largest text items show the factors mentioned most often. The initial list of 181 possible critical success factors was collected from the survey, with redundant content grouped or restated for each major topic area when developing the Delphi Round 1 survey. The expert panel was asked to rate the factors using a 5-point Likert scale from Strongly Oppose to Strongly Agree. The experts were also asked to rank their or their groups' status in each research area, ranging from "innovators" to "laggards," for later statistical analysis.
FIGURE 2. EXPERT PANEL DEMOGRAPHICS: DEGREES (PhD 55%, Master's 35%, Bachelor's 10%) AND SELF-RANKED EXPERTISE (M&S: High 95%, Medium 5%; VE: High 75%, Medium 25%)
Fifteen experts participated in the Round 1 Delphi study. The data generated were coded, and summary statistics were computed. Figure 4 shows the top 10 factors in each of the four areas developed in Round 1 (virtual environments, crowdsourcing, MBSE, and HSI), with the mean, interquartile range (IQR), and percent agreement shown for each factor.
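For readers who want to reproduce this kind of summary, here is a minimal sketch of how the per-factor mean, IQR, and percent agreement could be computed from raw Likert ratings (Python; the function name and the agreement threshold of 4 or higher, i.e., "Agree" and above, are our assumptions about the analysis, not a documented procedure from the study):

    import statistics

    def likert_summary(ratings: list[int], agree_threshold: int = 4) -> dict:
        """Summarize one factor's 5-point Likert ratings (1 = Strongly
        Oppose ... 5 = Strongly Agree) as mean, IQR, and percent agreement."""
        q = statistics.quantiles(ratings, n=4, method="inclusive")  # quartiles
        return {
            "mean": round(statistics.mean(ratings), 2),
            "iqr": q[2] - q[0],                  # Q3 - Q1
            "pct_agree": round(100 * sum(r >= agree_threshold for r in ratings)
                               / len(ratings)),  # % rating Agree or higher
        }

    # Example: 15 hypothetical expert ratings for a single candidate factor.
    print(likert_summary([5, 5, 4, 4, 5, 4, 3, 5, 4, 4, 5, 4, 5, 4, 5]))
    # -> {'mean': 4.4, 'iqr': 1.0, 'pct_agree': 93}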
The Round 2 survey included bar graphs with the statistics summarizing Round 1. It contained the top 10 critical success factors in each of the five areas, with the exception of the overall process model area, which contained a few additional possible critical success factors due to a survey software error. The Round 2 survey used an expanded Likert scale with seven levels, ranging from Strongly Disagree to Strongly Agree. The additional choices were intended to minimize ties and to help show where the experts strongly ranked the factors.
Fifteen experts responded to the Round 2 survey, rating the critical success factors determined in Round 1. The Round 2 critical success factors continued to receive a large percentage of expert responses ranging from "Somewhat Agree" to "Strongly Agree," which confirmed the Round 1 top selections. However, the Round 2 data also showed an increase in "Neither Agree nor Disagree" responses for success factors past the middle of the survey.
FIGURE 4. TOP 10 CRITICAL SUCCESS FACTORS BY AREA (ROUND 1)

VIRTUAL ENVIRONMENTS
CRITICAL SUCCESS FACTOR MEAN IQR % AGREE
Real Time Operation 4.67 1 93%
Utility to Stakeholders 4.47 1 93%
Fidelity of Modeling/Accuracy of Representation 4.40 1 87%
Usability/Ease of Use 4.40 1 93%
Data Recording 4.27 1 87%
Verification, Validation and Accreditation 4.20 1 87%
Realistic Physics 4.20 1 80%
Virtual Environment Link to Problem Space 4.20 1 80%
Flexibility/Customization/Modularity 4.07 1 80%
Return On Investment/Cost Savings 4.07 1 87%
CROWDSOURCING
CRITICAL SUCCESS FACTOR MEAN IQR % AGREE
Accessibility/Availability 4.53 1 93%
Leadership Support/Commitment 4.53 1 80%
Ability to Measure Design Improvement 4.47 1 93%
Results Analysis by Class of Stakeholder 4.33 1 93%
Data Pedigree 4.20 1 87%
Timely Feedback 4.20 1 93%
Configuration Control 4.13 1 87%
Engaging 4.13 1 80%
Mission Space Characterization 4.13 1 87%
Portal/Web site/Collaboration Area 4.07 1 87%
MBSE
CRITICAL SUCCESS FACTOR MEAN IQR % AGREE
Conceptual Model of the Systems 4.60 1 87%
Tied to Mission Tasks 4.43 1 93%
Leadership Commitment 4.40 1 80%
Reliability/Repeatability 4.33 1 93%
Senior Engineer Commitment 4.33 1 80%
Fidelity/Representation of True Systems 4.27 1 93%
Tied To Measures of Performance 4.27 1 87%
Validation 4.27 1 93%
Well Defined Metrics 4.27 1 80%
Adequate Funding of Tools 4.20 2 73%
HSI
CRITICAL SUCCESS FACTOR MEAN IQR % AGREE
Ability to Capture Human Performance Behavior 4.64 1 100%
Adequate Funding 4.57 1 100%
Ability to Measure Design Improvement 4.43 1 93%
Ability to Analyze Mental Tasks 4.36 1 100%
Integration with Systems Engineering Process 4.33 1 87%
Leadership Support/Commitment 4.29 1.25 79%
Intuitive Interfaces 4.29 1.25 79%
Consistency with Operational Requirements 4.27 1 93%
Data Capture into Metrics 4.21 1 86%
Fidelity 4.14 1 86%
The Round 3 survey included the summary statistics from Round 2 and charts showing the experts' agreement from Round 2. The Round 3 questions presented the top 10 critical success factors in each area and asked the experts to rank these factors. The objective of the Round 3 survey was to determine if the experts had achieved a level of consensus regarding the ranking of the top 10 factors from the previous round.
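The article does not state which consensus statistic was used; Kendall's coefficient of concordance (W) is a common choice for Delphi ranking rounds. A minimal sketch, assuming complete rankings with no ties:

    def kendalls_w(rankings: list[list[int]]) -> float:
        """Kendall's coefficient of concordance for m judges ranking n items.

        W = 12*S / (m^2 * (n^3 - n)), where S is the sum of squared deviations
        of each item's rank total from the mean rank total. W = 1 is perfect
        agreement among judges; W = 0 is no agreement.
        """
        m, n = len(rankings), len(rankings[0])
        totals = [sum(judge[i] for judge in rankings) for i in range(n)]
        mean_total = sum(totals) / n
        s = sum((t - mean_total) ** 2 for t in totals)
        return 12 * s / (m ** 2 * (n ** 3 - n))

    # Example: three experts ranking five factors (1 = most important).
    print(round(kendalls_w([[1, 2, 3, 4, 5],
                            [2, 1, 3, 5, 4],
                            [1, 3, 2, 4, 5]]), 2))  # -> 0.84

Values of W near 1 indicate the experts order the factors nearly the same way; values near 0 indicate little shared ordering.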
Process (Framework)
"For any crowdsourcing endeavor to be successful, there has to be a good feedback loop," said Maura Sullivan, chief of Strategy and Innovation, U.S. Navy (Versprille, 2015, p. 12). Figure 5 illustrates a top-level view of the framework generated by this research. Comments and discussion from the interview phase have been combined with the literature review data and information to create this process. Key elements from the Delphi study and the critical success factors have been utilized to shape this process. The fidelity of the models utilized would need to be controlled by the visualization/modeling/prototyping centers. These centers would provide key services to the warfighters and engineers to artfully create new game elements representing future systems and concepts, and to pull information from the enterprise repositories to add customizable game elements.
FIGURE 5. TOP-LEVEL VIEW OF THE PROPOSED FRAMEWORK: S&T PROJECTS AND IDEAS, WARFIGHTER IDEATION, MBSE ARTIFACTS, PHYSICS MODELS, GAME ENGINES, AND VIRTUAL ENVIRONMENTS LINKED THROUGH AN ENTERPRISE REPOSITORY/DIGITAL SYSTEM MODELS
Note. MBSE = Model Based Systems Engineering; S&T = Science and Technology; SysML/UML = Systems Modeling Language/Unified Modeling Language.
The expert panel was asked: "Is Model Based Systems Engineering necessary in this approach?" The breakdown of responses revealed that 63 percent responded "Strongly Agree," another 18.5 percent selected "Somewhat Agree," and the remaining 18.5 percent answered "Neutral." These results show strong agreement with using MBSE methodologies and concepts as an essential backbone, using MBSE as the "glue" to manage the use cases, and subsequently providing the feedback loop to the DSM.
In the virtual environment results from Round 1, real time operation and realistic physics were agreed upon by the panel as critical success factors. The appropriate selection of simulation tools would be required to support these factors. Scenegraphs and open-source game engines have been evolving and maturing over the past 10 years; many of these tools were commercial products that had proprietary architectures or were expensive.
In the MBSE results from Round 1, the panel indicated that ties both to mission tasks and to measures of performance were critical. The selection of metrics and the mechanisms to tie these factors into the process are very important. Game-based metrics are appropriate, but these should be tied to elemental capabilities. Army researchers have explored an area called Degraded States for use in armor lethality (Comstock, 1991). The early work in this area has not found wide application in the Army. However, the elemental capability methodology, which is used for personnel analysis, should be explored for this application. Data that aid gameplay can be presented to the warfighter using basic physics. In later life-cycle stages, by capturing and recording detailed data points, engineering-level simulations can be run after the fact, rather than in real time, with more detailed high-fidelity simulations by the engineering staff. This allows a detailed design based on feedback telemetered from the warfighter. The combination of telemetry from the gameplay and follow-up ranking by warfighters and engineering staff can allow in-depth, high-fidelity information flow into the emerging systems model. Figure 6 shows the authors' views of the interactions and fidelity changes over the system life cycle.
FIGURE 6. INTERACTIONS AND FIDELITY OVER THE SYSTEM LIFE CYCLE: IDEATION THROUGH EARLY CONCEPT, WARFIGHTER PROTOTYPE EVALUATION, AND EMD, MOVING FROM OPEN INNOVATION AND BROAD, LOW-FIDELITY, COMPARATIVE REPRESENTATIONS (TRADE STUDIES, ANALYSIS OF ALTERNATIVES) TOWARD FOCUSED, COMPETITIVE, HIGHER FIDELITY EVALUATION WITH WARFIGHTERS AND ENGINEERS/SCIENTISTS
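As a hedged illustration of the capture-then-replay idea in the preceding discussion (Python; the snapshot fields and CSV format are assumptions for the sketch, not part of the proposed framework), lightweight state snapshots are logged during real-time play and later handed to slower engineering-level simulations:

    import csv
    from dataclasses import dataclass, astuple

    @dataclass
    class Snapshot:
        """State captured each game tick with cheap real-time physics."""
        t: float               # simulation time, seconds
        x: float               # position, meters
        y: float
        z: float
        player_action: str     # e.g., "fire", "designate_target"

    FIELDS = ["t", "x", "y", "z", "player_action"]

    def record(snapshots, path):
        """Log the played trajectory so engineers can replay it offline."""
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(FIELDS)
            writer.writerows(astuple(s) for s in snapshots)

    def replay_offline(path):
        """Feed each recorded state to a high-fidelity solver after the fact.
        The solver call is a placeholder: real engineering codes are too slow
        to run inside the game loop, which is the point of recording."""
        with open(path) as f:
            for row in csv.DictReader(f):
                state = Snapshot(float(row["t"]), float(row["x"]),
                                 float(row["y"]), float(row["z"]),
                                 row["player_action"])
                # high_fidelity_solver.step(state)  # hypothetical engineering code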
Fidelity was ranked high in virtual environments, MBSE, and HSI.
Fidelity and accuracy of the modeling and representations to the true
system are critical success factors. For the virtual environment, early
work would be done with low facet count models featuring texture maps
for realism. However, as the system moves through the life cycle, higher
fidelity models and models that feed into detailed design simulations will
be required. There must also be verification, validation, and accreditation
of these models as they enter the modeling repository or the DSM.
Leadership Commitment
Leadership commitment was ranked near the top in the MBSE, crowdsourcing, and HSI areas. Clearly, in these emerging areas the enterprise needs strong leadership and training to enable MBSE and crowdsourcing initiatives. The newness of MBSE and crowdsourcing may be related to the experts' high ranking of the need for leadership and senior engineer commitment. Leadership support is also a critical success factor in Table 2, with 75 percent agreement from the panel. Leadership commitment and support, although somewhat obvious as a success factor, may have been lacking in previous initiatives. Leadership commitment needs to be reflected in both policy and funding commitments from both DoD and Service leadership to encourage and spur these innovative approaches.
FIGURE 7. CRITICAL SUCCESS FACTORS IN FIVE KEY AREAS, INCLUDING: Return on Investment/Cost Savings; Mission Space Characterization; Portal/Website/Collaboration Area; Conceptual Model of the Systems; Tied to Mission Tasks; Leadership Commitment; Reliability/Repeatability; Ability to Capture Human Performance Behavior; Data Security; Adequate Funding; Collaboration Between Engineers/Scientists & Warfighters; Ability to Measure Design Improvement; Engagement (Warfighters); Engagement (Scientists & Engineers); Ability to Analyze Mental Tasks; Integration with Systems Engineering Process; Leadership Support/Commitment; Intuitive Interfaces; Consistency with Operational Requirements; Data Capture into Metrics; Fidelity
The ideas presented here and the critical success factors have been developed by a team of experts who hold advanced degrees and have, on average, 20 to 30 years of experience in the primary area of inquiry. However, the panel was more heavily weighted toward Army experts than individuals from the rest of the DoD. Neither time nor resources allowed for study of other important groups of experts, including warfighters, industry experts, and program managers. The Delphi method was selected for this study to generate the critical success factors, based on the perceived ease of use of the method and the controlled feedback gathered. The critical success factors developed are ranked judgments, but they are grounded in years of expertise. This study considered five important areas and identified critical success factors in those areas. This research study is based on the viewpoint of experts in M&S; other types of expert viewpoints might generate additional factors. Several factor areas could not be covered by M&S experts, including security and information technology.
The surveys were constructed with 5- and 7-point Likert scales that allowed the experts to choose "Neutral" or "Neither Agree nor Disagree." Not utilizing a forced-choice scale, or a nonordinal data type, in later Delphi rounds can limit the data aggregation and statistical analysis approaches available.
RECOMMENDATIONS AND CONCLUSIONS
… MBSE artifacts that result from this process. An overall process can be enacted that takes ideas, design alternatives, and harvested data, and then provides a path to feed back these data at many stages in the acquisition cycle. The extent to which MBSE tools such as SysML, UML, and emerging new standards are adopted or utilized in the process may depend upon the emerging training of acquisition professionals in MBSE and the leadership commitment to this approach.
This article has answered the three research questions posed in earlier discussion. Utilizing the expert panel, critical success factors have been developed using the Delphi method. An emerging process model has been described. Finally, the experts in this Delphi study have affirmed an essential role of MBSE in this process.
FUTURE RESEARCH
The DoD is actively conducting research into the remaining challenges
to bring many of the concepts discussed in this article into the acquisition
process. The critical success factors developed here can be utilized to focus
some of the efforts.
The DoD should support studies that select systems in the early stages of
development in each Service to apply the proposed framework and process.
The studies should use real gaps and requirements, and real warfighters. In
support of ARCIC, several studies are proposed at the ICT and the NPS that
explore various aspects of the challenges involved in testing tools needed
to advance key concepts discussed in this article. The Navy, Air Force, and
Army have active programs under various names to determine how M&S
can support future systems development as systems and designs become
more complex, distributed, and interconnected (Spicer et al., 2015).
When fully developed, MBSE and DSM methods can leverage the emerging connected DoD enterprise and bring about a continuous-feedback design environment. Applying the concepts developed in this article to concept development, Analysis of Alternatives, and trade studies conducted during early development through Milestone C can lead to more robust, resilient systems continuously reviewed and evaluated by the stakeholders who truly matter: the warfighters.
References
Bianca, D. P. (2000). Simulation and modeling for acquisition, requirements, and training (SMART) (Report No. ADA376362). Retrieved from http://oai.dtic.mil/oai/oai?verb=getRecord&metadataPrefix=html&identifier=ADA376362
Boudreau, K. J., & Lakhani, K. R. (2013). Using the crowd as an innovation partner. Harvard Business Review, 91(4), 60–69.
Bucher, N., & Bouwens, C. (2013). Always on–on demand: Supporting the development, test, and training of operational networks & net-centric systems. Presentation to the National Defense Industrial Association 16th Annual Systems Engineering Conference, October 28–31, Crystal City, VA. Retrieved from http://www.dtic.mil/ndia/2013system/W16126_Bucher.pdf
Carlini, J. (2010). Rapid capability fielding toolbox study (Report No. ADA528118). Retrieved from http://www.dtic.mil/dtic/tr/fulltext/u2/a528118.pdf
Comstock, G. R. (1991). The degraded states weapons research simulation: An investigation of the degraded states vulnerability methodology in a combat simulation (Report No. AMSAA-TR-495). Aberdeen Proving Ground, MD: U.S. Army Materiel Systems Analysis Activity.
Corns, S., & Kande, A. (2011). Applying virtual engineering to model-based systems engineering. Systems Research Forum, 5(2), 163–180.
Crowdsourcing. (n.d.). In Merriam-Webster's online dictionary. Retrieved from http://www.merriam-webster.com/dictionary/crowdsourcing
Dalkey, N. C. (1967). Delphi (Report No. P-3704). Santa Monica, CA: The RAND Corporation.
David, J. W. (1995). A comparative analysis of the acquisition strategies of Army Tactical Missile System (ATACMS) and Javelin Medium Anti-armor Weapon System (Master's thesis). Naval Postgraduate School, Monterey, CA.
Department of the Navy. (2015, May 20). The Department of the Navy launches the "Hatch." Navy News Service. Retrieved from http://www.navy.mil/submit/display.asp?story_id=87209
Drucker, C. (2014). Why airport scanners catch the water bottle but miss the dynamite [Duke Research Blog]. Retrieved from https://sites.duke.edu/dukeresearch/2014/11/24/why-airport-scanners-catch-the-water-bottle-but-miss-the-dynamite/
Ferrara, J. (1996). DoD's 5000 documents: Evolution and change in defense acquisition policy (Report No. ADA487769). Retrieved from http://oai.dtic.mil/oai/oai?verb=getRecord&metadataPrefix=html&identifier=ADA487769
Forrester, A. (2015). Ray Mabus: Navy's 'Hatch' platform opens collaboration on innovation. Retrieved from http://www.executivegov.com/2015/05/ray-mabus-navys-hatch-platform-opens-collaboration-on-innovation/
Freeman, G. R. (2011). Rapid/expedited systems engineering (Report No. ADA589017). Wright-Patterson AFB, OH: Air Force Institute of Technology, Center for Systems Engineering.
Gallop, D. (2015). Delphi, dice, and dominos. Defense AT&L, 44(6), 32–35. Retrieved from http://dau.dodlive.mil/files/2015/10/Gallop.pdf
GAO. (2015). Defense acquisitions: Joint action needed by DOD and Congress to improve outcomes (Report No. GAO-16-187T). Retrieved from http://www.gao.gov/assets/680/673358.pdf
Author Biographies