Conceptual Modeling for Discrete-Event Simulation

Edited by
Stewart Robinson
University of Warwick, Coventry, UK
Roger Brooks
University of Lancaster, UK
Kathy Kotiadis
University of Warwick, Coventry, UK
Durk-Jouke van der Zee
University of Groningen, The Netherlands
CRC Press
Taylor & Francis Group
6000 Broken Sound Parkway NW, Suite 300
Boca Raton, FL 33487-2742
This book contains information obtained from authentic and highly regarded sources. Reasonable efforts
have been made to publish reliable data and information, but the author and publisher cannot assume
responsibility for the validity of all materials or the consequences of their use. The authors and publishers
have attempted to trace the copyright holders of all material reproduced in this publication and apologize to
copyright holders if permission to publish in this form has not been obtained. If any copyright material has
not been acknowledged please write and let us know so we may rectify in any future reprint.
Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmit-
ted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented,
including photocopying, microfilming, and recording, or in any information storage or retrieval system,
without written permission from the publishers.
For permission to photocopy or use material electronically from this work, please access www.copyright.
com (http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood
Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that provides licenses and
registration for a variety of users. For organizations that have been granted a photocopy license by the CCC,
a separate system of payment has been arranged.
Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used
only for identification and explanation without intent to infringe.
Visit the Taylor & Francis Web site at
http://www.taylorandfrancis.com
Contents

Preface
Editors
Contributors
Part VI Conclusion
Index
Preface
The purpose of this book is to build upon these efforts and to provide a
comprehensive view of the current state-of-the-art in conceptual modeling
for simulation. It achieves this by bringing together the work of an inter-
national group of researchers from different areas of simulation: military,
business, and health modeling. In doing this, we look at a range of issues in
conceptual modeling:
We cannot claim to fully answer any of these questions, but we are able to
present the latest thinking on these topics. The book is aimed at students,
researchers, and practitioners with an interest in conceptual modeling for
simulation. Indeed, we would argue that all simulation modelers have
an interest in conceptual modeling, because all simulation modelers are
involved, either consciously or subconsciously, in conceptual modeling.
The focus of the book is on discrete-event simulation (Pidd 2005; Law
2007), which for reasons of simplicity is described as just “simulation.” In
this approach, the dynamics of a system are modeled as a series of discrete
events at which the state of the system changes. It is primarily used for mod-
eling queuing systems that are prevalent in a vast array of applications in the
military, business, and health sectors. Despite this focus on discrete-event
simulation, many of the ideas will have wider applicability to other forms
of simulation (e.g., continuous simulation, system dynamics) and modeling
more generally.
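To make the event-based view concrete, the sketch below simulates a single-server queue by repeatedly taking the next event from a time-ordered event list; the state (queue length, server status) changes only at discrete arrival and departure events. This is an illustrative sketch, not an example from the book: the function name, rates, and the exponential-timing assumption are ours.

```python
import heapq
import random

def simulate_mm1(arrival_rate, service_rate, horizon, seed=0):
    """Single-server queue driven by a time-ordered event list.

    State (queue length, server busy/idle) changes only at discrete
    events: customer arrivals and service completions.
    """
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrival")]  # (time, kind)
    queue_length = 0
    server_busy = False
    completed = 0

    while events:
        time, kind = heapq.heappop(events)
        if time > horizon:
            break
        if kind == "arrival":
            # Schedule the next arrival.
            heapq.heappush(events, (time + rng.expovariate(arrival_rate), "arrival"))
            if server_busy:
                queue_length += 1
            else:
                server_busy = True
                heapq.heappush(events, (time + rng.expovariate(service_rate), "departure"))
        else:  # departure: the customer in service leaves
            completed += 1
            if queue_length > 0:
                queue_length -= 1
                heapq.heappush(events, (time + rng.expovariate(service_rate), "departure"))
            else:
                server_busy = False
    return completed
```

For instance, `simulate_mm1(1.0, 1.5, horizon=1000)` returns the number of customers served over 1000 time units; with an arrival rate of 1.0 and a stable server, this should be roughly 1000.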
In reading this book, it will become clear that there is no single agreed defi-
nition of a conceptual model or conceptual modeling. The chapters also pres-
ent some quite different perspectives on what conceptual modeling entails.
These differences are perhaps most stark between those working with mil-
itary models and those working in business and health. This is largely a
function of the scale and complexity of the models that the two groups work
with, military models generally being much larger in scale (Robinson 2002).
As editors of the book, we have made no attempt to reconcile these differ-
ences. The state-of-the-art is such that we are not yet in a position to propose
a unified definition of a conceptual model or a unified approach to concep-
tual modeling. Indeed, it seems unlikely that such a goal is achievable either
in the short term, given our limited understanding of conceptual modeling,
or even in the long term, given the range of domains over which simulation
is used and the complexity of the conceptual modeling task. What this book
does provide is a single source in which different perspectives on conceptual
modeling are presented and a basis upon which they can be compared.
The book is split into six parts, each focusing on a different aspect of concep-
tual modeling for simulation. Part I explores the foundations of conceptual
modeling. In Chapter 1, Robinson discusses the definition of a conceptual
model and conceptual modeling, the purpose and requirements of a con-
ceptual model, and the guidance that is given in the literature on conceptual
modeling. The ideas that are presented provide a backdrop for the rest of the
book. Brooks (Chapter 2) explores the relationship between the level of detail
and complexity of a model, and the performance of that model. He identifies
eleven “elements” that can be used for measuring model performance. In an
experiment, he investigates the relationship between complexity and model
performance. In Chapter 3, Wang and Brooks follow an expert modeler and
a number of novice modelers through the conceptual modeling process. As
a result, they are able to identify the process followed and differences in con-
centration on the various elements of conceptual modeling.
Part II includes five chapters on frameworks for conceptual modeling.
A framework provides a set of steps and tools that aim to help a modeler
through the process of deciding what model to build. Robinson (Chapter 4)
presents a framework for modeling operations systems, such as manufactur-
ing and service systems. Meanwhile, van der Zee (Chapter 5) concentrates
on conceptual modeling for manufacturing systems using an object-ori-
ented approach. The ABCmod conceptual modeling framework, devised by
Arbez and Birta (Chapter 6), provides a detailed procedure that is useful
for modeling discrete-event dynamic systems. In Chapter 7, Karagöz and
Demirörs describe and compare a series of conceptual modeling frame-
works (FEDEP, CMMS (FDMS), DCMF, Robinson’s framework, and KAMA)
most of which derive from simulation modeling in the military domain. In
the final chapter of part II, Haydon reflects upon his many years of experi-
ence in simulation modeling (Chapter 8). He describes how he approaches
conceptual modeling from a practical perspective by outlining a series of
steps that can be followed. This provides a valuable practice-based reflec-
tion on the topic.
Some authors have identified a correspondence between soft systems
methodology (SSM) (Checkland 1981) and conceptual modeling. Pidd and
Kotiadis discuss this connection in Part III. It is important to correctly iden-
tify the problem to be tackled, otherwise a simulation study is set for failure
from the outset; Balci (1994) identifies this as a type 0 error. As a result, Pidd
(Chapter 9) discusses problem structuring and how SSM can help a simula-
tion modeler ensure that the right problem is tackled. Kotiadis specifically
focuses on using SSM to help identify the objectives of a simulation study
and describes the approach through a case study on modeling community
health care (Chapter 10).
Part IV investigates the links between software engineering and con-
ceptual modeling; this might be described as “conceptual engineering.” In
Chapter 11, Liston et al. describe and illustrate the use of SysML as an aid
to conceptual modeling. Following a review of process modeling methods,
Roger J. Brooks
Kathy Kotiadis
Stewart Robinson
Durk-Jouke van der Zee
References
Balci, O. 1994. Validation, verification, and testing techniques throughout the life
cycle of a simulation study. Annals of Operations Research 53: 121–173.
Contributors
Part I
Foundations of Conceptual Modeling

1
Conceptual Modeling for Simulation: Definition and Requirements
Stewart Robinson
Contents
1.1 Introduction
1.2 Example: Modeling the Ford Motor Company’s South Wales Engine Assembly Plant
1.3 What is Conceptual Modeling?
1.3.1 A Definition of a Conceptual Model
1.3.2 Conceptual Modeling Defined
1.4 The Purpose of a Conceptual Model
1.5 Requirements of a Conceptual Model
1.5.1 The Overarching Requirement: Keep the Model Simple
1.6 Guidance on Conceptual Modeling
1.6.1 Principles of Modeling
1.6.2 Methods of Simplification
1.6.3 Modeling Frameworks
1.7 Conclusion
Acknowledgments
References
1.1 Introduction
Conceptual modeling is the process of abstracting a model from a real or
proposed system. It is almost certainly the most important aspect of a simu-
lation project. The design of the model impacts all aspects of the study, in
particular the data requirements, the speed with which the model can be
developed, the validity of the model, the speed of experimentation and the
confidence that is placed in the model results. A well designed model signifi-
cantly enhances the possibility that a simulation study will be a success.
Although effective conceptual modeling is a vital aspect of a simulation
study, it is probably the most difficult and least understood (Law 1991). There
is surprisingly little written on the subject. It is difficult to find a book that
devotes more than a handful of pages to the design of the conceptual model.
Neither are there a plethora of research papers, with only a handful of well
regarded papers over the last four decades. A search through the academic
tracks at major simulation conferences on discrete-event simulation reveals a
host of papers on other aspects of simulation modeling. There are, however,
only a few papers that give any space to the subject of conceptual modeling.
The main reason for this lack of attention is probably that conceptual
modeling is more of an ‘art’ than a ‘science’, and it is therefore
difficult to define methods and procedures. Whatever the reason, the result
is that the art of conceptual modeling is largely learnt by experience. This
somewhat ad hoc approach does not seem satisfactory for such an important
part of the simulation modeling process.
The purpose of this chapter is to bring more clarity to the area of conceptual
modeling for simulation. The issue is addressed first by defining the meaning
of conceptual modeling and then by establishing the requirements of a con-
ceptual model. The meaning of the term conceptual model is discussed in rela-
tion to existing definitions in the literature. A refined definition of a conceptual
model is then given and the scope of conceptual modeling is defined. There
is a pause for thought concerning the purpose of a conceptual model before a
discussion on the requirements of a conceptual model. The chapter finishes
with a brief review of the guidance that is available for conceptual modeling.
The domain of interest for this discussion is primarily in the use of dis-
crete-event simulation for modeling operations systems or operating systems. “An
operating system is a configuration of resources combined for the provision of
goods or services” (Wild 2002). Wild identifies four specific functions of opera-
tions systems: manufacture, transport, supply, and service. This is one of the
prime domains for simulation in operational research. We might refer to it as
“business-oriented” simulation while interpreting business in its widest sense
to include, for instance, the public sector and health. Models in this domain
tend to be of a relatively small scale, with a project life cycle of normally less
than 6 months (Cochran et al. 1995). The models are generally developed by
a lone modeler acting as an external or internal consultant. Sometimes the
models are developed on a “do-it-yourself” basis with a subject matter expert
carrying out the development. This is somewhat different to the nature of
simulation modeling in the military domain, another major application of
simulation in operational research, where models tend to be of a much larger
scale and where they are developed by teams of people (Robinson 2002).
Although the focus is on discrete-event simulation for modeling operations
systems, this is not to say that the concepts do not have wider applicability.
Throughout the chapter, three roles in a simulation study are assumed:
Figure 1.1
Schematic showing the layout of the South Wales Engine Assembly Plant. (Labeled areas include the Head line and Line A.)
head before the complete subassembly is joined with the engine block on
Line A. On leaving Line A, the engine is loaded to a Line B platen to continue
the assembly process. The empty Line A platen is washed and returned so a
new engine block can be loaded. At the end of Line B, completed engines are
off-loaded and move to the Hot Test facility. In Hot Test, engines are rigged to
test machines, run for a few minutes and monitored. Engines that pass Hot
Test move to the Final Dress area for completion. Engines that fail Hot Test
are rectified and then completed.
The majority of the operations on the three main assembly lines consist of
a single automatic machine. Some operations require two parallel machines
due to the length of the machine cycle, while a few other operations are
performed manually. At various points along the line there are automatic
test stations. When an engine fails the test, it is sent to an adjoining rework
station, before returning to be tested again. All the operations are connected
by a powered roller conveyor system.
The key components are the engine block, head, crankshaft, cam shaft,
and connecting rods. These are produced at nearby production facilities,
delivered to the main assembly plant and stored line-side ready for assembly.
Because various engine derivatives are made on the assembly line, a range
of component derivatives need to be produced and stored for assembly. The
result was the concern over scheduling the production and the storage of
these key components.
As with all such projects, time for developing and using the model was
limited. It was important, therefore, to devise a model that could answer
the questions about scheduling key components as quickly as possible while
maintaining a satisfactory level of accuracy.
In considering the nature of the problem, it was clear that the key issue
was not so much the rate at which engines progressed through the assembly
line, but their sequence. The initial sequence of engines was determined by
the production schedule, but this sequence was then disturbed by engines
being taken out for rework and by the presence of parallel machines for some
operations. Under normal operation the parallel machines would not cause
a change in the sequence of engines on the line, but if one of the machines
breaks down for a period, then the engines queuing for that machine would
be delayed and their sequence altered.
It was recommended that the simulation model should represent in
detail those elements that determined the sequence of engines on the main
assembly line, that is, the schedule, the test and rework areas, and the parallel
machines. All other operations could be simplified by grouping sections of
the line that consisted of individual machines and representing them as a
queue with a delay. The queue capacity needed to equate to the capacity of
that section of the line. The delay needed to be equal to the time it took for
an engine to pass through the section of the line, allowing for breakdowns.
This would give a reasonable approximation to the rate at which engines
would progress through the facility. Of course, the operations where the key
fully known since full knowledge of the real system cannot be attained. For
instance, almost all systems involve some level of human interaction that
will affect its performance. This interaction cannot be fully understood since
it will vary from person to person and time to time.
In the lumped model the components of a model are lumped together and
simplified. The aim is to generate a model that is valid within the experi-
mental frame, that is, reproduces the input–output behaviors with sufficient
fidelity. The structure of the lumped model is fully known. Returning to the
example of human interaction with a system, in a lumped model specific
rules for interaction are devised, e.g., a customer will not join a waiting line
of more than 10 people.
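A rule of this kind is easy to state precisely once it is made explicit. The sketch below is our own illustration (the function name and the batch-of-arrivals framing are assumptions; only the 10-person threshold comes from the example above) and encodes the lumped balking rule:

```python
def admit_arrivals(queue_length, arrivals, max_acceptable=10):
    """Apply a lumped-model balking rule to a batch of arriving customers.

    Each arrival joins only if the line currently holds no more than
    `max_acceptable` people; otherwise the customer balks. Returns
    (new_queue_length, number_balked).
    """
    balked = 0
    for _ in range(arrivals):
        if queue_length <= max_acceptable:  # devised rule, not observed behavior
            queue_length += 1
        else:
            balked += 1
    return queue_length, balked
```

Starting from a line of 9 people, `admit_arrivals(9, 5)` admits two customers and turns three away, returning `(11, 3)`.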
Nance (1994) separates the ideas of conceptual model and communicative
model. The conceptual model exists in the mind of a modeler, the commu-
nicative model is an explicit representation of the conceptual model. He also
specifies that the conceptual model is separate from model execution. In
other words, the conceptual model is not concerned with how the computer-
based model is coded. Fishwick (1995) takes a similar view, stating that a
conceptual model is vague and ambiguous. It is then refined into a more
concrete executable model. The process of model design is about developing
and refining this vague and ambiguous model and creating the model code.
In these terms, conceptual modeling is a subset of model design, which also
includes the design of the model code.
The main debate about conceptual modeling and its definition has been
held among military simulation modelers. Pace has lead the way in this
debate and defines a conceptual model as “a simulation developer’s way of
translating modeling requirements … into a detailed design framework …,
from which the software that will make up the simulation can be built” (Pace
1999). In short, the conceptual model defines what is to be represented and
how it is to be represented in the simulation. Pace sees conceptual model-
ing as being quite narrow in scope viewing objectives and requirements
definition as precursors to the process of conceptual modeling. The concep-
tual model is largely independent of software design and implementation
decisions. Pace (2000a) identifies the information provided by a conceptual
model as consisting of assumptions, algorithms, characteristics, relation-
ships, and data.
Lacy et al. (2001) further this discussion, reporting on a meeting of the
Defense Modeling and Simulation Office (DMSO) to try to reach a consensus
on the definition of a conceptual model. The paper describes a plethora
of views, but concludes by identifying two types of conceptual model: a
domain-oriented model that provides a detailed representation of the problem
domain, and a design-oriented model that describes in detail the requirements
of the model. The latter is used to design the model code. Meanwhile, Haddix
(2001) points out that there is some confusion over whether the conceptual
model is an artifact of the user or the designer. This may, to some extent, be
clarified by adopting the two definitions above.
It is clear, however, that complete agreement does not exist over these facets.
[Figure 1.2 depicts a cycle linking the real world (problem situation), the conceptual model, and the computer model through four processes: conceptual modeling, model coding, experimentation, and implementation, with solutions/understanding fed back to the problem situation. The conceptual model accepts experimental factors as inputs, provides responses as outputs, and defines the model content in terms of scope and level of detail, informed by the modeling and general project objectives.]

Figure 1.2
The conceptual model in the simulation project life cycle. (Adapted from Robinson, S., Simulation: The Practice of Model Development and Use, Wiley, Chichester, UK, 2004. With permission.)
of the four processes outlined in Figure 1.2. For a more detailed description of
this life cycle and model verification and validation see Robinson (2004).
Based upon an understanding of the problem situation the conceptual
model is derived. This model is only a partial description of the real world,
but it is sufficient to address the problem situation. The double arrow between
the problem situation and objectives signifies the interplay between prob-
lem understanding and modeling. While the conceptual model reflects the
understanding of the problem situation, the process of developing the con-
ceptual model also changes the understanding of the problem situation. In
particular, the nature of the questions that the modeler asks during concep-
tual modeling can lead to new insights on behalf of the clients and domain
experts. At a greater extreme, ideas derived purely from conceptual model-
ing may be implemented in the real system, changing the actual nature of
the problem situation.
The conceptual model itself consists of four main components: objectives,
inputs (experimental factors), outputs (responses), and model content. Two
types of objective inform a modeling project. First, there are the modeling
objectives, which describe the purpose of the model and modeling project.
Second, there are general project objectives, which include the timescales
for the project and the nature of the model and its use (e.g., requirements
for the flexibility of the model, run-speed, visual display, ease-of-use, and
model/component reuse). The definition of objectives is seen as intrinsic to
decisions about the conceptual model. The Ford example above highlighted
how different modeling objectives led to different models. Similarly, the gen-
eral project objectives can affect the nature of the model. A shorter timescale,
for instance, may require a simpler conceptual model than would have been
devised had more time been available. For this reason, the objectives are
included in the definition of the conceptual model.
Including the modeling objectives as part of the definition of a conceptual
model is at odds with Pace (1999). He sees the objectives and requirements
definition as separate from the conceptual model. The author’s view is that
while understanding the problem situation and the aims of the organization
lies within the domain of the real world (problem situation), the modeling
objectives are specific to a particular model and modeling exercise. Different
modeling objectives lead to different models within the same problem
situation, as in the Ford example. As a result, the modeling objectives are
intrinsic to the description of a conceptual model. Without the modeling
objectives, the description of a conceptual model is incomplete.
The inputs (or experimental factors) are those elements of the model that
can be altered to effect an improvement in, or better understanding of, the
problem situation. They are determined by the objectives. Meanwhile, the
outputs (or responses) report the results from a run of the simulation model.
These have two purposes: first, to determine whether the modeling objec-
tives have been achieved; second, to point to reasons why the objectives are
not being achieved, if they are not.
Finally, the model content consists of the components that are represented
in the model and their interconnections. The content can be split into two
dimensions (Robinson 1994):
• The scope of the model: the model boundary or the breadth of the real
system that is to be included in the model.
• The level of detail: the detail to be included for each component in the
model’s scope.
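As an illustration of how these four components, and the two dimensions of model content, fit together, they could be recorded in a simple structure. The class design and field names below are our own sketch, not a notation proposed in this chapter:

```python
from dataclasses import dataclass, field

@dataclass
class ConceptualModel:
    """The four main components of a conceptual model, plus the
    assumptions and simplifications made in deriving it."""
    modeling_objectives: list[str]          # purpose of the model and modeling project
    general_project_objectives: list[str]   # timescales, flexibility, run-speed, ease-of-use, ...
    inputs: list[str]                       # experimental factors
    outputs: list[str]                      # responses
    scope: list[str]                        # components inside the model boundary
    level_of_detail: dict[str, str] = field(default_factory=dict)
    assumptions: list[str] = field(default_factory=list)
    simplifications: list[str] = field(default_factory=list)

    def content(self) -> dict[str, str]:
        """Model content: the scope, with the level of detail recorded
        for each component (defaulting to 'unspecified')."""
        return {c: self.level_of_detail.get(c, "unspecified") for c in self.scope}
```

An instance for the engine-plant example might list the production schedule as an input, the engine sequence and throughput as outputs, and record a detailed representation against the test, rework, and parallel-machine areas while leaving the grouped sections of line coarse.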
The model content is determined, in part, by the inputs and outputs, in that
the model must be able to accept and interpret the inputs and to provide the
required outputs. The model content is also determined by the level of accuracy
required. More accuracy generally requires a greater scope and level of detail.
While making decisions about the content of the model, various assump-
tions and simplifications are normally introduced. These are defined as
follows:
This definition adds the point that the conceptual model is non-software-
specific in line with the views of the other authors described above.
Considerations as to how the model code will be developed (whether it be
a spreadsheet, specialist software, or a programming language) should not
dominate debate around the nature of the model that is required to address
the problem situation. Conceptual modeling is about determining the right
model, not how the software will be implemented.
In saying this, it must be recognized that many simulation modelers only
have access to one or possibly two simulation tools. As a result, considerations
of software implementation will naturally enter the debate about the nature
of the conceptual model. This is recognized by the double arrow, signifying
iteration, for the model coding process in Figure 1.2. What this definition
for a conceptual model aims to highlight is the importance of separating as
far as possible detailed model code considerations from decisions about the
conceptual design.
The definition does not place the conceptual model at a specific point in time
during a simulation study. This reflects the level of iteration that may exist in
simulation work. A conceptual model may reflect a model that is to be devel-
oped, is being developed or has been developed in some software. The model
is continually changing as the simulation study progresses. Whatever stage has
been reached in a simulation study, the conceptual model is a non-software-
specific description of the model as it is understood at that point in time.
It should also be noted that this definition does not imply that a formal and
explicit conceptual model is developed. Indeed, the conceptual model may
not be formally expressed (not fully documented) and/or it may be implicit
(in the mind of the modeler). Whether the conceptual model is explicit or not,
it still exists, in that the modeler has made decisions about what to model
and what not to model. For the rest of this chapter we shall assume that the
conceptual model is made explicit, in a more or less formal fashion, with a
view to reaching a joint agreement between the modeler, clients, and domain
experts concerning its content.
These activities are explored in more detail in Chapter 4. This list suggests a
general order in which the elements of a conceptual model might be deter-
mined. There is likely to be a lot of iteration forwards and backwards between
these activities. Further to this, there is iteration between conceptual mod-
eling and the rest of the process of model development and use (Robinson
2004). Although the conceptual model should be independent of the model-
ing software, it must be recognized that there is an interplay between the
two. Since many modelers use the software that they are familiar with, it is
possible (although not necessarily desirable) that methods of representation
and limitations in the software will cause a revision to the conceptual model.
Continued learning during model coding and experimentation may cause
adjustments to the conceptual model as the understanding of the problem
situation and modeling objectives change. Model validation activities may
result in alterations to the conceptual model in order to improve the accuracy
of the model. Availability, or otherwise, of data may require adjustments to
the conceptual model. All this implies a great deal of iteration in the pro-
cess of modeling and the requirement to continually revise the conceptual
model. This iteration is illustrated by the double arrows between the stages
in Figure 1.2.
Although this argument appears to have some credence, it ignores the
fact that whatever practice a modeler might employ for developing the model
code, decisions still have to be taken concerning the content and assump-
tions of the model. Modern simulation software does not reduce this level of
decision-making. What the software can provide is an environment for the
more rapid development of the model code, enhancing the opportunities for
iteration between conceptual modeling and model coding, and facilitating
rapid prototyping. This does not negate the need for conceptual modeling,
but simply aids the process of model design. It also highlights the point that
conceptual modeling is not a one-off step, but part of a highly iterative pro-
cess, particularly in relation to model coding.
Indeed, the power of modern software (and hardware) and the wider use
of distributed processing may actually have increased the need for effec-
tive conceptual modeling. Salt (1993) and Chwif et al. (2000) both identify
the problem of the increasing complexity of simulation models; a result
of the “possibility” factor. People build more complex models because the
hardware and software enable them to. While this may have extended the
utility of simulation to problems that previously could not have been tackled,
it also breeds a tendency to develop overly complex models. There are
various problems associated with such models including extended develop-
ment times and onerous data requirements. This trend to develop ever more
complex models has been particularly prevalent in the military domain
(Lucas and McGunnigle 2003). Indeed, it could be argued that there are
some advantages in only having limited computing capacity; it forces the
modeler to carefully design the model! As a result of the possibility factor it
would seem that careful design of the conceptual model is more important
than ever.
Beyond the general sense that careful model design is important, there
are a number of reasons why a conceptual model is important to the devel-
opment and use of simulation models. Pace (2003) puts this succinctly by
stating that the conceptual model provides a roadmap from the problem
situation and objectives to model design and software implementation. He
also recognizes that the conceptual model forms an important part of the
documentation for a model. More specifically, a well-documented concep-
tual model does the following:
Overall the conceptual model, if made explicit and clearly expressed, pro-
vides a means of communication between all parties in a simulation study:
the modeler, clients, and domain experts (Pace 2002). In so doing it helps to
build a consensus, or at least an accommodation, about the nature of the model
and its use.
The clients must believe that the model is sufficiently accurate. Included in
this concept is the need for the clients to be convinced that all the important
components and relationships are in the model. Credibility also requires that
the model and its results are understood by the clients. Would a model that
could not be understood have credibility? An important factor in this respect
is the transparency of the model, which is discussed below.
Validity and credibility are seen as separate requirements because the
modeler and clients may have very different perceptions of the same model.
Although a modeler may be satisfied with a conceptual model, the clients
may not be. It is not unusual for additional scope and detail to be added
to a model, not because it improves its validity, but because it improves its
credibility. Not that adding scope and detail to gain credibility is necessarily
a bad thing, but the modeler must ensure that this does not progress so far
that the model becomes over complex. Simulation is particularly prone to
such a drift through, for instance, the addition of nonvital graphics and the
logic required to drive them.
Table 1.1
Requirements of a Conceptual Model Related to Those Documented in the Literature

Validity
  Pritsker (1986): Valid
  Henriksen (1988): Fidelity; Aptness for client's problem
  Nance (1994): Model correctness; Testability
  Willemain (1994): Validity
  Brooks and Tobias (1996a): Model describes behavior of interest; Accuracy of the model's results; Probability of containing errors; Validity; Strength of theoretical basis of model
  van der Zee and van der Vorst (2005): Completeness

Credibility
  Pritsker (1986): Understandable
  Brooks and Tobias (1996a): Ease of understanding
  van der Zee and van der Vorst (2005): Transparency

Utility
  Pritsker (1986): Extendible; Ease of modification
  Henriksen (1988): Execution speed
  Nance (1994): Adaptability; Reusability; Maintainability
  Willemain (1994): Value to client; Usability
  Brooks and Tobias (1996a): Portability and ease with which model can be combined with others

Feasibility
  Pritsker (1986): Timely
  Henriksen (1988): Elegance
  Willemain (1994): Feasibility
  Brooks and Tobias (1996a): Time and cost to build model; Time and cost to run model; Time and cost to analyze results; Hardware requirements
Conceptual Modeling for Simulation: Definition and Requirements 19
A perception, on behalf of the modeler and the clients, that the concep-
tual model can be developed into a computer model that is useful as an
aid to decision-making within the specified context.
Utility is seen as a joint agreement between the modeler and the cli-
ents about the usefulness of the model. This notion moves beyond the
question of whether the model is sufficiently accurate, to the question of
whether the model is useful for the context of the simulation study. Utility
includes issues such as ease-of-use, flexibility (i.e., ease with which model
changes can be made), run-speed and visual display. Where the model, or
a component of the model, might be used again on the same or another
study, reusability would also be subsumed within the concept of utility.
The requirements for utility are expressed through the general project
objectives.
Within any context a range of conceptual models could be derived. The
accuracy of these models would vary, but some or all might be seen as suffi-
ciently accurate and, hence, under the definitions given above, they would be
described as valid and credible. This does not necessarily mean that the mod-
els are useful. For instance, if a proposed model is large and cumbersome, it
may have limited utility due to reduced ease-of-use and flexibility. Indeed,
a less accurate (but still sufficiently accurate), more flexible model that runs
faster may have greater utility by enabling a wider range of experimentation
within the available timeframe.
Hodges (1991) provides an interesting discussion around model utility
and suggests that a “bad” model (one that is not sufficiently accurate) can
still be useful. He goes on to identify specific uses for such models. Bankes
(1993) continues with this theme, discussing the idea of inaccurate models for
exploratory use, while Robinson (2001) sees a role for such models in facilitat-
ing learning about a problem situation.
The final requirement, feasibility, is defined as follows:
A perception, on behalf of the modeler and the clients, that the con-
ceptual model can be developed into a computer model with the time,
resource and data available.
With more complex models these advantages are generally lost. Indeed, at
the center of good modeling practice is the idea of resorting to the simplest
explanation possible. Occam’s razor puts this succinctly, “plurality should not
be posited without necessity” (William of Occam; quoted from Pidd 2003),
as does Antoine de Saint-Exupéry, who reputedly said that “perfection is
achieved, not when there is nothing more to add, but when there is nothing
left to take away.”
The requirement for simple models does not negate the need to build
complex models on some occasions. Indeed, complex models are sometimes
required to achieve the modeling objectives. The requirement is to build the
simplest model possible, not simple models per se. What should be avoided,
however, is the tendency to try to model every aspect of a system when a far simpler, more focused model would suffice.
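To make the idea of a simple, focused model concrete, the sketch below simulates a single-server (M/M/1) queue in a few lines of Python. It is an illustrative example, not taken from the chapter: the parameter values are invented, and a real study would add only the detail its objectives demand.

```python
import random

def mm1_mean_wait(arrival_rate, service_rate, n_customers, seed=1):
    """Estimate the mean queueing delay in a FIFO M/M/1 queue by
    stepping through successive customers (Lindley-style recursion)."""
    rng = random.Random(seed)
    arrival = 0.0          # arrival time of the current customer
    server_free_at = 0.0   # time at which the server next becomes idle
    total_wait = 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(arrival_rate)   # next Poisson arrival
        start = max(arrival, server_free_at)       # queue if server is busy
        total_wait += start - arrival
        server_free_at = start + rng.expovariate(service_rate)
    return total_wait / n_customers

# For lambda = 0.8 and mu = 1.0 the analytic mean queue wait is
# Wq = rho / (mu - lambda) = 4.0; the estimate should lie close to this.
print(mm1_mean_wait(0.8, 1.0, 200_000))
```

A model this small can already answer a question such as “how does waiting time grow with utilization?”; scope and detail (breakdowns, shifts, priorities) would be added only if the objectives require them.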
The graph in Figure 1.3 illustrates the notional relationship between
model accuracy and complexity (Robinson 1994). Increasing levels of
complexity (scope and level of detail) improve the accuracy of the model,
but with diminishing returns. Beyond point x there is little to be gained by
adding to the complexity of the model. A 100% accurate model will never
be achieved because it is impossible to know everything about the real system. The graph illustrates a further point.

Figure 1.3
Simulation model complexity and accuracy: accuracy rises with scope and level of detail (complexity), with diminishing returns beyond point x, approaching but never reaching 100%. (Adapted from Robinson, S., Industrial Engineering, 26 (9), 34–36, 1994. With permission.)

Increasing the complexity of the model too far may lead to a less accurate model. This is because the data
and information are not available to support such a detailed model. For
instance, it is unlikely that we could accurately model the exact behavior
of individuals in a queue, and attempts to do so, beyond very simple rules,
may lead to a less accurate result.
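The notional accuracy/complexity curve of Figure 1.3 can be sketched numerically. The function below is purely illustrative: its shape (a saturating gain minus a data-availability penalty) and every constant in it are invented to mimic the qualitative behavior described above, not derived from any measured relationship.

```python
import math

def notional_accuracy(complexity, scale=10.0, data_penalty=0.015):
    """Illustrative only: accuracy rises with diminishing returns, then
    falls once complexity outstrips the data available to support it."""
    gain = 100.0 * (1.0 - math.exp(-complexity / scale))
    penalty = data_penalty * max(0.0, complexity - 3 * scale) ** 2
    return gain - penalty

# Diminishing returns: each extra unit of complexity adds less accuracy
print(notional_accuracy(10) - notional_accuracy(5))   # large gain
print(notional_accuracy(30) - notional_accuracy(25))  # small gain
# Beyond some point (x in Figure 1.3) further complexity reduces accuracy
print(notional_accuracy(60) < notional_accuracy(40))  # True
```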
Ward (1989) provides a lucid account on the simplicity of models. In doing
so, he makes a useful distinction between constructive simplicity and trans-
parency. Transparency is an attribute of the client (how well he/she under-
stands the model), while constructive simplicity is an attribute of the model
itself (the simplicity of the model). Because transparency is an attribute of
the client, it depends on his/her level of knowledge and skill. A model that
is transparent to one client may not be transparent to another. In develop-
ing a conceptual model, the modeler must consider transparency as well as
simplicity, designing the model with the particular needs of the client in
mind. The need for transparency is, of course, confounded by the presence
of multiple clients (as is the case in many simulation studies), all of whom
must be satisfied with the model. These ideas closely link to the requirement
for credibility, as discussed above, since a model that is not transparent is
unlikely to have credibility.
Having emphasized the importance of simplicity, there are those who
warn against taking this to an extreme. Pritsker (1986) reflects on his experi-
ence of developing models of differing complexity of the same system. He
concludes that the simplest model is not always best because models need
to be able to evolve as the requirements change. The simplest model is not
always the easiest to embellish. Schruben and Yücesan (1993) make a similar
point, stating that simpler models are not always easier to understand, code,
and debug. Davies et al. (2003) point out that simpler models require more
extensive assumptions about how a system works and that there is a danger
in setting the system boundary (scope) too narrowly, such that an important facet is missed.
The central theme is one of aiming for simple models through evolutionary
development. Others have produced similar sets of principles (or guidelines),
for instance, Morris (1967), Musselman (1992), Powell (1995), Pritsker (1998),
and Law (2007). The specific idea of evolutionary model development is fur-
ther explored by Nydick et al. (2002).
model. Robinson (1994) also lists some methods for simplifying simulation
models. Finally, Webster et al. (1984) describe how they selected an appropri-
ate level of detail for generating samples in a timber harvesting simulation
model.
Such ideas are useful for simplifying an existing (conceptual) model,
but they do not guide the modeler over how to bring a model into exis-
tence. Model simplification acts primarily as a redesign tool and not a
design tool.
security systems. Meanwhile, van der Zee and van der Vorst propose a
framework for supply chain simulation. Both are aimed at an object-oriented
implementation of the computer-based simulation model. Kotiadis (2007) looks to the ideas of Soft Operational Research, and specifically soft systems methodology (SSM) (Checkland 1981), for aiding the
conceptual modeling process. She uses SSM to help understand a complex
health care system and then derives the simulation conceptual model from
the SSM “purposeful activity model.”
In this book, Robinson proposes a conceptual modeling framework that
guides a modeler from identification of the modeling objectives through
to determining the scope and level of detail of a model (chapter 4). Arbez
and Birta describe the ABCmod conceptual modeling framework that
provides a procedure for identifying the components and relationships
for a discrete-event simulation model (chapter 6). Meanwhile, van der
Zee describes a domain-specific framework for developing conceptual
models of manufacturing systems (chapter 5). Karagöz and Demirörs
describe and compare a number of frameworks that have largely been
developed for the military domain (chapter 7), and Haydon explains how
he approaches conceptual modeling from a practice-based perspective
(chapter 8).
Such frameworks appear to have potential for aiding the development of
conceptual models, but they are not yet fully developed and tested, nor are
they in common use. An interesting issue is whether frameworks should be
aimed at a specific domain (e.g., supply chain), or whether it is feasible to
devise more generic frameworks.
1.7 Conclusion
There is, in large measure, a vacuum of research in the area of conceptual modeling for discrete-event simulation. Although many simulation
researchers consider effective conceptual modeling to be vital to the success
of a simulation study, there have been few attempts to develop definitions
and approaches that are helpful to the development of conceptual models.
The discussion above attempts to redress this balance by offering a defini-
tion of a conceptual model and outlining the requirements for a conceptual
model. The conceptual model definition is useful for providing a sense of
direction to simulation modelers during a simulation study. If they do not
know what they are heading for, how can they head for it? The require-
ments provide a means for determining the appropriateness of a conceptual
model both during and after development. For researchers, the definition
and requirements provide a common foundation for further research in
conceptual modeling.
Acknowledgments
This chapter is reproduced, with minor editing, from: Robinson, S. 2008.
Conceptual modeling for simulation part I: Definition and requirements.
Journal of the Operational Research Society 59 (3): 278–290. © 2008 Operational
Research Society Ltd. Reproduced with permission of Palgrave Macmillan.
Some sections of this chapter are based on the following:
The Ford engine plant example is used with the permission of John Ladbrook,
Ford Motor Company.
References
Arthur, J.D., and R.E. Nance. 2007. Investigating the use of software requirements
engineering techniques in simulation modelling. Journal of simulation 1 (3):
159–174.
Balci, O. 1994. Validation, verification, and testing techniques throughout the life
cycle of a simulation study. Annals of operations research 53: 121–173.
Balci, O. 2001. A methodology for certification of modeling and simulation applica-
tions. ACM transactions on modeling and computer simulation 11 (4): 352–377.
Balci, O., and R.E. Nance. 1985. Formulated problem verification as an explicit require-
ment of model credibility. Simulation 45 (2): 76–86.
Bankes, S. 1993. Exploratory modeling for policy analysis. Operations research 41 (3):
435–449.
Borah, J.J. 2002. Conceptual modeling: The missing link of simulation development. In
Proceedings of the 2002 Spring Simulation Interoperability Workshop. www.sisostds.
org (accessed February 10, 2009).
Brooks, R.J., and A.M. Tobias. 1996a. Choosing the best model: Level of detail, com-
plexity and model performance. Mathematical and computer modeling 24 (4): 1–14.
Brooks, R.J., and A.M. Tobias. 1996b. A framework for choosing the best model
structure in mathematical and computer modeling. In Proceedings of the 6th
Annual Conference AI, Simulation, and Planning in High Autonomy Systems, 53–60.
University of Arizona.
Carson, J.S. 1986. Convincing users of model’s validity is challenging aspect of mod-
eler’s job. Industrial engineering 18 (6): 74–85.
Checkland, P.B. 1981. Systems Thinking, Systems Practice. Chichester, UK: Wiley.
Chwif, L., M.R.P. Barretto, and R.J. Paul. 2000. On simulation model complexity. In
Proceedings of the 2000 Winter Simulation Conference, ed. J.A. Joines, R.R. Barton,
K. Kang, and P.A. Fishwick, 449–455. Piscataway, NJ: IEEE.
Cochran, J.K., G.T. Mackulak, and P.A. Savory. 1995. Simulation project characteristics
in industrial settings. Interfaces 25 (4): 104–113.
Courtois, P.J. 1985. On time and space decomposition of complex structures.
Communications of the ACM 28 (6): 590–603.
Davies, R., P. Roderick, and J. Raftery. 2003. The evaluation of disease prevention
and treatment using simulation models. European journal of operational research
150: 53–66.
DMSO. 2005. Defense Modeling and Simulation Office, HLA. www.sisostds.org
(accessed February 10, 2009).
Evans, J.R. 1992. Creativity in MS/OR: improving problem solving through creative
thinking. Interfaces 22 (2): 87–91.
Ferguson, P., W.S. Humphrey, S. Khajenoori, et al. 1997. Results of applying the personal software process. Computer 30 (5): 24–31.
Fishwick, P.A. 1995. Simulation Model Design and Execution: Building Digital Worlds.
Upper Saddle River, NJ: Prentice-Hall.
Gass, S.I., and L.S. Joel. 1981. Concepts of model confidence. Computers and operations
research 8 (4): 341–346.
Guru, A., and P. Savory. 2004. A template-based conceptual modeling infrastructure
for simulation of physical security systems. In Proceedings of the 2004 Winter
Simulation Conference, ed. R.G. Ingalls, M.D. Rossetti, J.S. Smith, and B.A. Peters,
866–873. Piscataway, NJ: IEEE.
Haddix, F. 2001. Conceptual modeling revisited: a developmental model approach for
modeling and simulation. In Proceedings of the 2001 Fall Simulation Interoperability
Workshop. www.sisostds.org (accessed February 10, 2009).
Henriksen, J.O. 1988. One system, several perspectives, many models. In Proceedings
of the 1988 Winter Simulation Conference, ed. M. Abrams, P. Haigh, and J. Comfort:
352–356. Piscataway, NJ: IEEE.
Henriksen, J.O. 1989. Alternative modeling perspectives: Finding the creative spark.
In Proceedings of the 1989 Winter Simulation Conference, ed. E.A. MacNair, K.J.
Musselman, and P. Heidelberger, 648–652. Piscataway, NJ: IEEE.
Hodges, J.S. 1991. Six (or so) things you can do with a bad model. Operations research
39 (3): 355–365.
Innis, G., and E. Rexstad. 1983. Simulation model simplification techniques. Simulation
41 (1): 7–15.
Kotiadis, K. 2007. Using soft systems methodology to determine the simulation study
objectives. Journal of simulation 1 (3): 215–222.
Lacy, L.W., W. Randolph, B. Harris, et al. 2001. Developing a consensus perspective
on conceptual models for simulation systems. In Proceedings of the 2001 Spring
Simulation Interoperability Workshop. www.sisostds.org (accessed February 10,
2009).
Law, A.M. 1991. Simulation model’s level of detail determines effectiveness. Industrial
engineering 23 (10): 16–18.
Law, A.M. 2007. Simulation Modeling and Analysis, 4th ed. New York: McGraw-Hill.
Lucas, T.W., and J.E. McGunnigle. 2003. When is model complexity too much?
Illustrating the benefits of simple models with Hughes’ salvo equations. Naval
research logistics 50: 197–217.
Morris, W.T. 1967. On the art of modeling. Management science 13 (12): B707–717.
Musselman, K.J. 1992. Conducting a successful simulation project. In Proceedings of the
1992 Winter Simulation Conference, ed. J.J. Swain, D. Goldsman, R.C. Crain, and
J.R. Wilson, 115–121. Piscataway, NJ: IEEE.
Nance, R.E. 1994. The conical methodology and the evolution of simulation model
development. Annals of operations research 53: 1–45.
Nydick, R.L., M.J. Liberatore, and Q.B. Chung. 2002. Modeling by elaboration: An
application to visual process simulation. INFOR 40 (4): 347–361.
Ören, T.I. 1981. Concepts and criteria to assess acceptability of simulation studies: A
frame of reference. Communications of the ACM 28 (2): 190–201.
Ören, T.I. 1984. Quality assurance in modeling and simulation: a taxonomy. In
Simulation and model-based methodologies: An integrative approach, ed. T.I. Ören,
B.P. Zeigler, and M.S. Elzas, 477–517. Heidelberg, Germany: Springer-Verlag.
Pace, D.K. 1999. Development and documentation of a simulation conceptual model.
In Proceedings of the 1999 Fall Simulation Interoperability Workshop. www.sisostds.
org (accessed February 10, 2009).
Pace, D.K. 2000a. Simulation conceptual model development. In Proceedings of the
2000 Spring Simulation Interoperability Workshop. www.sisostds.org (accessed
February 10, 2009).
Pace, D.K. 2000b. Ideas about simulation conceptual model development. Johns
Hopkins APL technical digest 21 (3): 327–336.
Pace, D.K. 2002. The value of a quality simulation conceptual model. Modeling and
simulation magazine 1 (1): 9–10.
Pace, D.K. 2003. Thoughts about the simulation conceptual model. In Proceedings of
the 2003 spring simulation interoperability workshop. www.sisostds.org (accessed
February 10, 2009).
Pidd, M. 1999. Just modeling through: a rough guide to modeling. Interfaces 29 (2):
118–132.
Pidd, M. 2003. Tools for Thinking: Modeling in Management Science, 2nd ed. Chichester,
UK: Wiley.
Pidd, M. 2004. Computer Simulation in Management Science, 5th ed. Chichester, UK:
Wiley.
Powell. S.G. 1995. Six key modeling heuristics. Interfaces 25 (4): 114–125.
Pritsker, A.A.B. 1986. Model evolution: a rotary table case history. In Proceedings of the
1986 Winter Simulation Conference, ed. J. Wilson, J. Henriksen, and S. Roberts,
703–707. Piscataway, NJ: IEEE.
Pritsker, A.A.B. 1987. Model evolution II: An FMS design problem. In Proceedings of
the 1987 Winter Simulation Conference, ed. A. Thesen, H. Grant, and W.D. Kelton,
567–574. Piscataway, NJ: IEEE.
Pritsker, A.A.B. 1998. Principles of simulation modeling. In Handbook of simulation, ed.
J. Banks, 31–51. New York: Wiley.
Robinson, S. 1994. Simulation projects: Building the right conceptual model. Industrial
Engineering 26 (9): 34–36.
Robinson, S. 2001. Soft with a hard centre: Discrete-event simulation in facilitation.
Journal of the operational research society 52 (8): 905–915.
Robinson, S. 2002. Modes of simulation practice: Approaches to business and military
simulation. Simulation practice and theory 10: 513–523.
Robinson, S. 2004. Simulation: The Practice of Model Development and Use. Chichester,
UK: Wiley.
Robinson, S. 2005. Distributed simulation and simulation practice. Simulation:
Transactions of the society for modeling and computer simulation 81 (1): 5–13.
Robinson, S., and M. Pidd. 1998. Provider and customer expectations of successful
simulation projects. Journal of the operational research society 49 (3): 200–209.
Salt, J. 1993. Simulation should be easy and fun. In Proceedings of the 1993 Winter
Simulation Conference, ed. G.W. Evans, M. Mollaghasemi, E.C. Russell, and W.E.
Biles, 1–5. Piscataway, NJ: IEEE.
Schmeiser, B.W. 2001. Some myths and common errors in simulation experiments. In
Proceedings of the 2001 Winter Simulation Conference, ed. B.A. Peters, J.S. Smith,
D.J. Medeiros, and M.W. Rohrer, 39–46. Piscataway, NJ: IEEE.
Schruben, L., and Yücesan, E. 1993. Complexity of simulation models: A graph theo-
retic approach. In Proceedings of the 1993 Winter Simulation Conference, ed. G.W.
Evans, M. Mollaghasemi, E.C. Russell, and W.E. Biles, 641–649. Piscataway, NJ:
IEEE.
Sevinc, S. 1990. Automation of simplification in discrete event modeling and simula-
tion. International journal of general systems 18: 125–142.
Shannon, R.E. 1975. Systems Simulation: The Art and Science. Englewood Cliffs, NJ:
Prentice-Hall.
Teeuw, W.B., and H. van den Berg. 1997. On the quality of conceptual models. In
Proceedings of the ER ’97 Workshop on Behavioral Models and Design Transformations:
Issues and Opportunities in Conceptual Modeling, ed. S.W. Liddle. osm7.cs.byu.
edu/ER97/workshop4/ (accessed February 10, 2009).
Thomas, A., and P. Charpentier. 2005. Reducing simulation models for scheduling
manufacturing facilities. European journal of operational research 161 (1): 111–125.
van der Zee, D.J., and J.G.A.J. van der Vorst. 2005. A modeling framework for supply
chain simulation: Opportunities for improved decision making. Decision sciences
36 (1): 65–95.
Ward, S.C. 1989. Arguments for constructively simple models. Journal of the operational
research society 40 (2): 141–153.
Webster, D.B., M.L. Padgett, G.S. Hines, et al. 1984. Determining the level of detail in
a simulation model: A case study. Computers and industrial engineering 8 (3/4):
215–225.
Wild, R. 2002. Operations Management, 6th ed. London: Continuum.
Willemain, T.R. 1994. Insights on modeling from a dozen experts. Operations research
42 (2): 213–222.
Willemain, T.R. 1995. Model formulation: What experts think about and when.
Operations research 43 (6): 916–932.
Yin, H.Y., and Z.N. Zhou. 1989. Simplification techniques of simulation models. In
Proceedings of Beijing International Conference on System Simulation and Scientific
Computing, 782–786. Piscataway, NJ: IEEE.
Zeigler, B.P. 1976. Theory of Modeling and Simulation. New York: Wiley.
2
Complexity, Level of Detail, and Model Performance
Roger J. Brooks
Contents
2.1 Introduction
2.2 Choice of the Best Model
2.3 Model Performance
2.4 Level of Detail and Complexity
2.5 Measuring Model Complexity
2.6 Relationship between Model Performance and the Level of Detail or Complexity of a Model
2.7 Simplification and Other Related Areas
2.8 Experiment on Model Characteristics and Performance
2.9 Conclusions
Acknowledgments
References
2.1 Introduction
Mathematical and simulation models are used extensively in many areas of
science and industry from population genetics to climate modeling and from
simulating a factory production line to theories of cosmology. Modeling may
be undertaken for a number of reasons but the most common aim is to pre-
dict the behavior of a system under future circumstances. A model may be
purely predictive or it may be part of a decision making process by predict-
ing the system behavior under alternative decision scenarios. There are other
occasions when a model is just descriptive, simply summarizing the modeler's understanding of the system (Jeffers 1991). The understanding of the system gained by the modeler and the user can also be an important benefit of
the project (Fripp 1985), particularly in scientific research when it can be the
principal objective. Equally, a modeling project may have other objectives
such as helping to design experiments or identifying research requirements.
Despite the great variation in the types of model and their usage, the modeling process itself will take a similar form for most projects and can typically
on many different spatial and time scales (Courtois 1985). Finding the best
model is often viewed to a large extent as the problem of choosing the appro-
priate level of detail and this is considered one of the most difficult aspects
of the modeling process (Law 1991) and one that has a major effect on the success of the project (Tilanus 1985, Ward 1989, Salt 1993).
By viewing the selection of the conceptual model in this way, the alter-
native models are effectively being ordered by the characteristic of level of
detail, which is the most common characteristic used to compare models.
This is done in the hope that there will be similarities with previous stud-
ies in the effect of the level of detail on model performance, so that experi-
ence from these studies can be applied in the selection of the current model.
For example, a model that is too simple will be unrealistic and so its results
will be, at best, of little use and, at worst, misleading. On the other hand,
considerable resources are usually required to build a complex model and
so, if the model is too complex, constraints on resources may prevent the
completion of the project (here, it is assumed that a more detailed model will
be more complex, although the meanings of level of detail and complexity
are discussed further in Section 2.4). It is generally harder to understand the
relationships contained in a complex model and this makes the interpreta-
tion of the results more difficult, possibly leading to incorrect conclusions
being drawn. A complex model is probably more likely to contain errors as it
is harder to verify that the model is working as intended.
The advice given on selecting the level of detail seems to consist almost
entirely of vague principles and general guidelines. A commonly quoted
maxim is Ockham’s (or Occam’s) razor, attributed to the fourteenth-century
philosopher William of Ockham, and translated (from the Latin) as “enti-
ties should not be multiplied without necessity,” or “it is vain to do by more
what can be done by fewer.” In other words, choose the simplest model
that meets the modeling objectives. Often, the advice given is to start from
a simple model and progressively add detail until sufficient accuracy is
obtained. It is important to match the level of detail of the model with the
modeling objectives and with the available data (Law et al. 1993, Jakeman
and Hornberger 1993, Hunt 1994). However, knowledge of these principles is
of only limited use to the modeler and the choice of the best model seems to
be regarded as more of an art than a science.
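The commonly quoted advice to start from a simple model and progressively add detail can be caricatured as a loop. Everything in this sketch is a hypothetical placeholder: the candidate details, the accuracy function, and the stopping threshold stand in for whatever measures a real study would use.

```python
def elaborate_model(candidate_details, build, accuracy, target):
    """Add detail to a model one candidate at a time, stopping as soon
    as the model meets the accuracy target (illustrative sketch only)."""
    included = []
    model = build(included)
    while accuracy(model) < target and candidate_details:
        included.append(candidate_details.pop(0))  # next-most-important detail
        model = build(included)
    return model, included

# Toy usage with invented numbers: each detail closes part of the gap
details = ["breakdowns", "shift patterns", "operator walking times"]
build = lambda included: included
accuracy = lambda m: 70 + 10 * len(m)          # invented accuracy measure
model, used = elaborate_model(details, build, accuracy, target=90)
print(used)   # ['breakdowns', 'shift patterns']
```

The point the loop makes is simply that elaboration stops when the objectives are met, so the least important details are never modeled at all.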
it is proposed that the full evaluation should include the following 11 per-
formance elements:
Results
1. The extent to which the model describes the behavior of interest (i.e.,
whether it has adequate scope and detail)
2. The accuracy of the model’s results
3. The ease with which the model and its results can be understood
Resources required
8. The time and cost to build the model (including data collection, veri-
fication, and validation)
9. The time and cost to run the model
10. The time and cost to analyze the results of the model
11. The hardware requirements (e.g., computer memory) of running the
model
which they address the modeling objectives (element 1). Clearly, if the results
are inaccurate then the decisions taken and conclusions drawn are likely
to be incorrect, although the degree of accuracy required depends on the
objectives. In certain circumstances, for example, the relative ordering of two
values may be all that is important rather than their absolute values. In the
terms of Zeigler (1976), element 1 is the extent to which the model results
cover the experimental frame. It is also important that the model includes
the system elements whose effects are to be investigated, i.e., the decision
variables (unless the model specifically excludes them to allow comparison
with a previous model that does include them). It is important that the model
and the results can be understood (element 3) to facilitate the analysis, and
increased understanding of the system may be a significant benefit in itself.
The use of all or part of the model in future projects (element 4) can also be
an important benefit.
In most cases the model is predictive and so elements 1 and 2 cannot be
assessed until some time after the modeling project has been completed
and, indeed, the perceptions of the overall success of the project may change
over time (Robinson and Pidd 1998). Acceptance of the conclusions from the
modeling and implementation of the recommendations requires that the
user has confidence in the model and the user’s confidence should be based
on elements 5–7. It is therefore important not only that (with the benefit of
hindsight) the model produced realistic results (element 2), but also that the
model was seen to be sufficiently realistic at the time the project was car-
ried out (elements 5–7). It is possible for a very unrealistic model to produce
accurate results (for example, due to compensating errors, particularly if the
results are a single value). Even if the results of such a model were accepted
and these led to the correct decisions being taken, the understanding gained
of the system is likely to be incorrect and this may have serious consequences
in the future. Elements 5–7 take this into account by giving an assessment
of the underlying quality of the model. Successful reuse of the model also
requires the model to have a sound basis, as well as requiring the model to
be portable (element 4).
Element 5 relates to the process of verification and so is concerned with
errors occurring in the model construction step. It is not possible, for most
models, to ensure that the model constructed will operate as intended in all
circumstances (i.e., to fully verify it) and so the model may contain errors
(Gass 1983, Tobias 1991). The probability of the model containing a particular
error is a product of the probability of the initial model containing the error,
which partly depends upon the choice of model (as well as the quality of
the model building), and the conditional probability of the error not being
discovered in the verification process, which also partly depends upon the
choice of model (as well as the quality of the verification process). There is
a trade-off here between the performance elements; the model is less likely
to contain errors if more resources are put into building and verifying the
model.
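The composition of probabilities described above can be sketched with purely illustrative numbers (the values below are assumptions for illustration, not figures from the chapter):

```python
# Illustrative only: the probability that a particular error survives into
# the final model is the product of the two probabilities described above.
p_error_introduced = 0.30        # initial model contains the error (depends on model choice)
p_missed_in_verification = 0.20  # error not discovered during verification
p_error_in_final_model = p_error_introduced * p_missed_in_verification
print(round(p_error_in_final_model, 2))
```

Putting more resources into building lowers the first factor, and more verification effort lowers the second, which is the trade-off noted above.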
36 Conceptual Modeling for Discrete-Event Simulation
Elements 6 and 7 assess how well the conceptual model matches the real
system. It is not sufficient for the model output simply to fit the historical
data (black box validation); confidence in the model mechanisms, whether on
a theoretical basis, by comparison with knowledge of the real system, or on
the basis of successful previous experience (white box validation), is also
important. It is possible for a model to fit the historical data well but to give poor
predictions if the basis of the model is incorrect, particularly if the condi-
tions for the period being predicted are very different to those in the past.
If the system being modeled is one that does not currently exist, validation
can consist only of an assessment of model credibility (Gass 1983), i.e., ele-
ment 7.
The effect of the choice of the model on the costs of the project is addressed
by elements 8–11. In some projects, a model already exists and the aim of
the project is to simplify it or to make it more complex. In this case the time
and cost of building the model becomes the time and cost of modifying the
model. Considerable effort can be required in order to simplify an existing
model (Rexstad and Innis 1985).
An assessment of model performance requires a measurement to be made
of each of the performance elements and this is far from straightforward. It
should be possible to evaluate elements 1, 6, 9, and 11 fairly easily in most
cases, although care is required in the interpretation of the measures used.
However, the remaining elements are hard to quantify, and a subjective
qualitative assessment may be all that is possible. The accuracy of the
results (element 2) may not be known until a long time after the project was
completed and may never be known for decision scenarios that were not
implemented. The ease of understanding and the probability of errors both
contain a human element, which makes a numerical evaluation difficult (ele-
ments 3 and 5). Similarly, the strength of the theory behind the model and
the credibility of its structure is subjective (element 7). A comparison of the
resources required to build and analyze alternative candidate models should
be those required if the model is built from scratch with no prior knowledge
of the system (elements 8 and 10). Such an assessment therefore ought ide-
ally to consist of measuring the resources used by independent modeling
teams of equal modeling experience and ability but this will not be feasible
in most instances. Meaningful measures for the extent to which the scope
and detail of the results match the problem requirements and particularly
the model portability are also likely to be difficult to derive (elements 1 and
4). An overall assessment of model performance requires a relative weight-
ing to be given to each of the elements and such a weighting will be subjec-
tive and will vary considerably from study to study. It may be possible, for a
particular study, to ignore some of the elements as insignificant in terms of
the overall performance. However, if a number of studies attempt to measure
at least some of the performance elements, the measurement procedure is
likely to improve.
Complexity, Level of Detail, and Model Performance 37
between the variables (Zeigler 1976). There can be many choices for the state
variables. Graphs of the interaction between the state variables were used to
measure the complexity of alternative fate models of toxic substances in a
lake by Halfon (1983a, 1983b). He used Bosserman’s (1982) c̄ measure, which
is the proportion of paths of length ≤ n (where n is the number of nodes)
that exist in the graph. The measure can be obtained using the adjacency
matrix A, which has aij = 1 if there is a connection from node i to node j and
0 otherwise. The matrix Ak, obtained by k multiplications of matrix A by itself
using Boolean algebra, has aij = 1 if and only if there exists a path from node
i to node j of length k. The c̄ measure is then given by the sum of all the ele-
ments in the matrices A, A2,…, An divided by n3 (the total number of elements
in these matrices).
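As a sketch (our own illustrative implementation, not Halfon's code), the c̄ measure can be computed directly from the adjacency matrix:

```python
def c_bar(adj):
    """Bosserman's c-bar: the proportion of the n^3 entries of
    A, A^2, ..., A^n (Boolean matrix powers) that are 1."""
    n = len(adj)
    power = [row[:] for row in adj]  # current Boolean power A^k, starting at k = 1
    total = 0
    for _ in range(n):
        total += sum(sum(row) for row in power)
        # Boolean product: entry (i, j) is 1 iff a path from i to j
        # that is one step longer exists via some intermediate node k
        power = [[1 if any(power[i][k] and adj[k][j] for k in range(n)) else 0
                  for j in range(n)]
                 for i in range(n)]
    return total / n ** 3

# Chain 1 -> 2 -> 3: two paths of length 1, one of length 2, none of length 3
print(c_bar([[0, 1, 0],
             [0, 0, 1],
             [0, 0, 0]]))  # 3 paths out of 27 possible entries
```

A fully connected graph with self-loops gives c̄ = 1, the maximum of the measure.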
A graph theory measure may not always be static. Neural networks are
typically defined in terms of nodes and connections, and many other adap-
tive systems models can be represented in this way (Farmer 1990), which
gives a natural graph of the model structure. In these models, a complexity
measure based on such a graph would change as the model runs as a result
of the connections changing.
In comparing models, differences in the complexity of the models may
be due to differences in the complexity of the calculations and so the graph
theory measures may be inappropriate or may need to be combined with
other measures.
An alternative approach to graph theory may be to use concepts from infor-
mation theory. Golay et al. (1989) used the following information entropy
measure as a complexity measure:
H = −∑_{i=1}^{n} p_i log₂(p_i)    (2.1)
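A minimal sketch of this entropy measure (our own illustration, applied to an arbitrary probability distribution):

```python
import math

def information_entropy(p):
    """H = -sum(p_i * log2(p_i)) over a probability distribution (Equation 2.1)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

print(information_entropy([0.5, 0.5]))  # 1.0 bit: two equally likely states
print(information_entropy([0.25] * 4))  # 2.0 bits: four equally likely states
```

The measure grows with the number of states and with how evenly probability is spread across them, which is why it can serve as a complexity measure.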
accuracy index was a validity measure of the fit of the model data against
the actual historical data. They were able to calculate both of these meas-
ures for 26 of the models and they also calculated a combined articulation
and accuracy measure called effectiveness. They found that the models with
the highest descriptive accuracy had low articulation (although the major-
ity of the 26 models had low articulation), i.e., the models with the greatest
validity tended to be those addressing the simpler problems. It is difficult
to draw concrete conclusions from this result as the amount of data is rela-
tively small but Costanza and Sklar hypothesized that, ultimately, greater
articulation necessitates less accuracy and that there might be a level of
articulation that maximizes effectiveness (which they considered to be the
best model). This assumes that greater articulation is desirable, in the sense
that a model with greater articulation provides more information about the
system. Often, however, the modeling objectives are quite specific and only
greater information relevant to the problem (i.e., within the experimental
frame) is a benefit.
Webster et al. (1984) viewed the selection of the level of detail as part of
the validation process and so the only measure they reported was the good-
ness of fit against actual data for the alternative timber harvesting models
that they compared. They considered the appropriate level of detail to be the
simplest model of adequate validity that is consistent with the expected sys-
tem relationships (ignoring the accuracy of results that can only be assessed
subsequently). They used three alternative methods to generate sample data
for three input variables in a simulation model (giving 27 alternatives in all):
mean value, regression, and a histogram of actual data. For one of the vari-
ables they found that the histogram method (which they considered the most
complex level) gave output of lower validity than the simpler methods. Four
of the models gave adequate validity and so they chose the simplest of these
as the final model.
Halfon’s studies (1983a, 1983b) compared the structure of alternative models
of a toxic substance in a lake at six levels of detail. This was done for a model
with six state variables and for a model with 10 state variables and repeated
in each case with and without internal recycling (giving four sets of results).
He compared the structures of the models, mainly using Bosserman’s (1982)
c̄ measure (described earlier), which was applied to the graphs of interactions
between state variables. The level of detail of the models was increased by
adding the physical processes in stages in a logical order. He found that, in
each case, adding the last few levels of detail only caused a small increase in
the number of connections. He argued that it was not worth including these
processes as they are unlikely to affect model behavior significantly and the
additional parameters add to the amount of uncertainty in the model. It is
reasonable to expect diminishing returns as complexity is added. However,
the actual performance of the models was not assessed to confirm this.
Halfon (1983b) also suggested displaying the comparisons of alternative
model structures as a Hasse diagram.
The lack of studies that have specifically sought to examine the effect of
level of detail or complexity on model performance means that even if the
expected relationships described at the beginning of this section are gen-
erally true, the nature of the relationships is unclear (linear, increasing
returns, decreasing returns, etc.) and the circumstances in which the rela-
tionships break down are not understood. The particular elements of model
complexity that have the greatest effect on each performance element have
also not been identified.
Consider, for example, the accuracy of model results. Generally a more
complex model is expected to be more accurate and as the model becomes
more complex the increase in accuracy of adding further complexity is likely
to reduce (assuming that additional detail is added in order of relevance), i.e.,
decreasing returns. Certainly, if there is a mapping between the models so
that the more complex model can be reduced to the simpler model by a suit-
able choice of parameters, then the most accurate complex model must be at
least as accurate as the most accurate simple model. However, the choice is
often between models of different types or between models for which only
an approximate relationship exists. In this case, it is possible for the simpler
model to be more accurate, although a comparison of the complexity of the
models is more difficult. For example, empirical models are sometimes more
accurate than quasi-physically based models, which would generally be
considered to be more complex (Decoursey 1992). For some modeling (such
as physically based distributed parameter models), the input parameters
cannot be directly measured but must be inferred by calibrating the model
against the historical data (Allison 1979). This is called the inverse problem of
parameter identification and its nature means that there may be a wide range
of parameter values that give a good fit. In this case, the results of a model
should be a range of predictions rather than a single prediction (Brooks et
al. 1994) and a more complex model may give a wider range. The range will
depend on the number of parameters in the model and the extent to which
they are allowed to vary (i.e., the size of the parameter space).
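The inverse problem can be illustrated with a deliberately tiny sketch (entirely our own construction): a two-parameter model calibrated against a single historical observation, where many parameter pairs fit equally well and the prediction should therefore be reported as a range.

```python
# Model y = a + b * t, "calibrated" against one historical point y(1) = 2.0.
# Every (a, b) pair on the line a + b = 2 fits; predictions at t = 2 spread out.
fits = []
for a_tenths in range(0, 21):          # search a and b over [0, 2] in steps of 0.1
    for b_tenths in range(0, 21):
        a, b = a_tenths / 10, b_tenths / 10
        if abs((a + b) - 2.0) < 0.05:  # acceptable fit to the history
            fits.append(a + 2 * b)     # prediction at t = 2
print(min(fits), max(fits))            # a wide range, not a single value
```

Every calibrated pair reproduces the history, yet the predictions span the whole interval from 2.0 to 4.0, and enlarging the parameter space would widen the range further.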
There may also be occasions when a simpler model takes longer to build.
Garfinkel (1984) pointed out that in modeling a large system, which is con-
sidered to consist of a large number of subsystems, there is a much greater
choice of simple models (which just model a few subsystems thought to be
important) than complex models and it may take longer to choose between
the alternative simple models than it would have taken to build a model of
the whole system.
Also, a simple model, by incorporating only some of the system elements,
may allow the identification of system relationships that are obscured in a
more complex model and so gives a greater understanding of the system. On
the other hand, a complex model may extend understanding by allowing the
investigation of the effect on the system of many more factors. The process
of identifying, building, and comparing models at different levels of detail
can greatly increase the understanding of the system. Such a process could
be used to link simple strategic models that are difficult to verify with more
detailed tactical models that can be tested against available data (Murdoch et
al. 1992). If the main purpose of the study is gaining an understanding of the
system then the benefits of building models at several levels of detail may be
well worth the additional effort involved.
In computer science, software metrics have been developed to control and
predict the performance of software projects (Demarco 1982). Attempts have
been made to predict the resources required for and the likely number of
errors in a piece of software from particular software attributes (such as
“complexity”). Fairly strong relationships have been found within particular
environments (for example, by Boehm [1981]) although none of these appear
to be generally applicable. A similar approach in simulation might help in
predicting the performance of alternative conceptual models.
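For example, Boehm's basic COCOMO model for "organic" projects predicts development effort from program size alone; the sketch below uses the constants of the basic organic mode and is an illustration of the metrics approach rather than a recommendation:

```python
def cocomo_organic_effort(kloc):
    """Basic COCOMO (organic mode): effort in person-months
    from program size in thousands of lines of code."""
    return 2.4 * kloc ** 1.05

print(round(cocomo_organic_effort(10), 1))  # roughly 27 person-months for 10 KLOC
```

An analogous relationship for simulation might map a conceptual model's size, connectedness, and calculational complexity onto build time and error counts.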
The discussion in this section indicates that the relationship between the
level of detail or complexity and model performance is more complicated
than some of the comments in the literature would suggest and the lack of
studies in this area means that the relationship is poorly understood. What is
required is a number of studies that measure the elements of the complexity
and performance of alternative models and one such experiment is described
in Section 2.8. This would provide data from which to develop empirical
relationships and may lead to a theoretical basis for the relationships.
1997a, 1997b), wheat simulation (Brooks et al. 2001), and manufacturing sys-
tems (Brooks and Tobias 2000) based on sensitivity analysis and detailed
analysis of the behavior and workings of the model. In each case the proc-
ess of simplification provided important insights into the system behavior
(Brooks and Tobias 1999). Innis and Rexstad (1983) listed and described 17
simplification techniques. These are specific techniques, some of which fall
under Zeigler’s (1976) categories, as well as techniques for replacing part of
the model with a different type. Innis and Rexstad (1983) also included tech-
niques for identifying which parts of the model might be suitable for simpli-
fication, techniques for reducing the number of model runs or run times and
techniques for improving the readability of the model code. They stated that
their list was not exhaustive, and it would appear that a general simplifica-
tion methodology does not exist.
Zeigler’s (1976) DEVS model formalism provides a framework within
which alternative discrete-event simulation models can be compared.
Addanki et al. (1991) proposed representing the alternative models as nodes
on a graph with the edges representing the changes in assumptions from
one model to another. Moving around the graph is an alternative way of
searching the space of models to which Addanki et al. (1991) applied arti-
ficial intelligence techniques. An approach applied to engineering models
has been to generate a database of model fragments and then to automate
the process of selecting and combining the fragments to produce the model
(Falkenheimer and Forbus 1991, Nayak 1992, Gruber 1993). Developments
have also taken place in variable resolution modeling, which allows the
level of detail of the model to be changed easily even while the model is
running (e.g., Davis and Hillestad 1993), and this may be a suitable environ-
ment within which to investigate the effect of level of detail.
one model is built by the same person, building and analyzing the first
model helps with the next.
As discussed in the previous sections, the complexity of a model is the
most common model characteristic related to performance in the literature,
and yet it is not defined clearly. Section 2.4 proposed that the overall com-
plexity of a model can be considered as a combination of its size (the number
of nodes or elements), its connectedness (the average number of connections
per element), and its calculational complexity (the complexity of the calcula-
tions making up the connections). The aim of the experiment was to examine
the effects of these characteristics and so the models were devised to differ
in these three aspects.
The models used are shown in Figure 2.1. Since they represent production
lines, the natural definition for the elements is machines and buffers with the
connections being the routes taken by the parts. Models A and B both have
eight machines and eight buffers in the same basic layout with model A having
more part routes (23 compared to 19) and hence higher connectedness. Model
C has five machines and five buffers laid out in the same way as a portion of
model A and differs from A mainly in size. Model D has only three machines
and three buffers but has the most complex calculations to determine the part
routes. Model D has high connectedness and calculational complexity.
The models were assigned at random to the students. The students were
quite inexperienced modelers, having received between 14 and 16 hours of
tuition, mainly consisting of hands-on experience together with some formal
teaching and demonstrations. The first stage of the experiment aimed
to compare how easy the models were to understand. The students were
each asked the same four written questions on aspects of the behavior of
the particular model assigned to them, and were provided with the model
description and selected model output. The second stage focused on model
building and the students were each timed as they built their model using
the WITNESS software (Lanner Group Ltd., Redditch, UK). The number
of errors in each model was subsequently determined (the students were
instructed to build the model as quickly as they could but not to test it). The
results are shown in Table 2.1.
Using analysis of variance (ANOVA), the differences between the mod-
els are statistically significant at the 5% level for build time (P = 0.032),
question 2 (P = 0.014), and question 3 (P = 0.022), but not for question 1,
question 4, the average mark for all questions, and the number of errors.
For build time calculational complexity appears to have the most effect
with model D taking considerably longer to build than the other models.
With a package like WITNESS, which is user-friendly and already contains
many of the constructs required, thinking time is the most important com-
ponent of the build time, and so it is the complex and less familiar com-
mands that are the most important. Observations also indicated that the
Figure 2.1
Process diagrams of the models used in the experiment: (a) models A, B, C, and D; (b) key. (Machines M, test stations T, and buffers B process part types x, y, and z; percentages show routing splits, including routes to SHIP and to scrap.)
Table 2.1
Results from the Experiment

        Aspects of Complexity                               Performance
        Size          Connectedness       Calculational  Understanding  Build Time    Errors
        (Number of    (Av. Connections    Complexity     (Average       (Av. Number   (Av. Number
Model   Elements)     per Element)                       Marks)         of Minutes)   per Model)
A       16            1.43                Medium         33%            52.5          2.63
B       16            1.19                Low            45%            44.4          2.25
C       10            1.40                Medium         39%            46.3          1.63
D        6            1.67                High           57%            62.4 (a)      2.89 (a)

(a) Includes two students who had not quite finished within the maximum time. Six of the 26 errors on this model are omissions in these two models that are probably due to lack of time.
aspects of the model that were easy to code were completed very quickly by
the students.
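The one-way ANOVA comparison used for these results can be sketched as follows; the per-student build times below are invented for illustration, not the study's data:

```python
def anova_f(*groups):
    """One-way ANOVA F statistic: between-group variance over within-group variance."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ssb = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ssw = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ssb / (k - 1)) / (ssw / (n - k))

# Hypothetical build times (minutes) for models A-D; compare F with the
# 5% critical value for (3, 28) degrees of freedom (about 2.95).
f = anova_f([50, 55, 52, 54, 51, 53, 50, 55],
            [43, 46, 44, 45, 43, 47, 44, 43],
            [45, 47, 46, 48, 45, 47, 46, 46],
            [60, 64, 61, 63, 62, 65, 60, 64])
print(round(f, 1))
```

With four groups of eight students the degrees of freedom are (3, 28), so an F above roughly 2.95 rejects equal mean build times at the 5% level.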
The questions were analyzed both by comparing the marks and by con-
sidering the reasoning process required to answer each question, which is
discussed in detail in Brooks (1996). Students were asked to give a reason for
their answer and considerable importance was given to this since the aim
was to assess understanding. Both the correct answer and correct reason
were required to score 1 mark. If either the answer or reason was only par-
tially correct then 1/2 mark was awarded. An incorrect answer or the correct
answer with an incorrect reason scored 0 marks. As stated above, the signifi-
cant differences between the models were on questions 2 and 3. Question 2
asked “Which machine(s) is the bottleneck?” and the average mark was much
higher for the model D participants (72%) than for the other models (19%,
31%, and 25% for A, B, and C, respectively). The small size of model D made
this question easier to answer because there are fewer elements to compare
to identify the bottleneck. In fact machine M2 is rarely in operation and so
this question only required comparing two machines. This also meant that
there were more acceptable reasons for the correct answer than for the other
models. Question 3 asked “Which buffer(s) were full at some time during
the period?” and could be answered by identifying blocked machines from
the output statistics. The average marks were much higher for models B and
D (81 and 72%, respectively) than for models A and C (25 and 44%, respec-
tively). Again this reflects the question being inherently easier for models B
and D since the blocked machines only sent parts to one buffer, whereas in
models A and C they sent parts to several buffers. Therefore, the difference
in marks seems to be a result of lower connectedness in the critical section
of the models.
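The marking scheme described earlier (1, 1/2, or 0 marks depending on the answer and its reason) can be written down directly; the string labels are our own:

```python
def mark(answer, reason):
    """Score an answer: each part is 'correct', 'partial', or 'wrong'."""
    if answer == "wrong" or reason == "wrong":
        return 0.0   # incorrect answer, or correct answer with incorrect reason
    if answer == "correct" and reason == "correct":
        return 1.0   # both answer and reason fully correct
    return 0.5       # either part only partially correct

print(mark("correct", "correct"), mark("correct", "wrong"), mark("partial", "correct"))
```

Requiring a correct reason as well as a correct answer is what lets the marks measure understanding rather than guessing.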
The marks were not statistically significant at the 5% level for questions
1 and 4. Question 1 (“How many parts were sent to SHIP in the period?”)
was expected to be harder for model D since the calculation is more com-
plex but, in fact, the average mark was similar to that for models A and B
perhaps again reflecting that the small size means that it is easier to iden-
tify the correct part of the model to focus on. Question 4 (“Estimate the %
increase in output if [a given machine] cycle time is reduced to 10,” where
the given machine was chosen not to be the bottleneck) was expected to be
easier for model D, but the marks were only slightly higher than for the other
models.
Overall the indication is that the difficulty in understanding is mainly
affected by size and connectedness with calculational complexity being
much less important, although this of course depends on the specific ques-
tion being considered. This is probably because the fine details can often be
ignored in understanding the system with just an appreciation of which ele-
ments influence each other being required.
Most of the model building errors for models A, B, and C occurred in
the input and output rules for assembly machines, which were relatively
complex commands that the students were less familiar with. The number
of errors therefore reflects the comparative occurrence of these commands
in the models, with models A and B having two assembly machines and
model C one (each error was counted separately including repeated errors).
Most of the errors for model D were either omissions or occurred in a
complex command unique to model D. Generally, the majority of errors
are likely to occur in the more complex aspects of the model, and so the
number of errors is expected to be most closely related to calculational
complexity.
The sample sizes here for each model (8 or 9) are small and the results will
depend to some extent on the type of models used and the questions asked.
The results can therefore only suggest possible relationships between model
attributes and performance and more work is required to investigate this
further.
2.9 Conclusions
The lack of research into the process of choosing the best model is sur-
prising given the importance of modeling in science. There are very few
studies that have made any quantitative assessment of the effect of differ-
ent model attributes on the modeling process. This probably stems from
the difficulty in measuring either suitable attributes or model performance
and also the effort required to build several alternative models. Different
models are most often compared by their level of detail or complexity
although such a comparison is usually only qualitative and level of detail
and complexity are usually not defined clearly. This chapter introduces
the more specific model characteristics of size, connectedness, and calcu-
lational complexity.
The lack of model comparisons has resulted in only vague guidelines to
aid the choice of model. The initial requirement is for a considerable number
of studies that compare, preferably quantitatively, some aspects of model per-
formance for alternative models. This chapter describes a small-scale study
of this type, which indicated that the difficulty in understanding the model
and the results is mainly caused by size and connectedness, whereas build
time is mainly related to calculational complexity.
A common piece of advice in conceptual modeling and choosing the level
of detail is to use past experience and so, at the very least, the quantitative
comparison of alternative models would provide a source of modeling expe-
rience from which to draw. Ultimately this approach could lead to the devel-
opment of general principles and hopefully to a methodology for choosing
the best model. A corresponding methodology for simplification is also
necessary.
Acknowledgments
Some sections of this chapter are based on Brooks, R. J., and A. M. Tobias.
1996. Choosing the best model: Level of detail, complexity and model per-
formance. Mathematical and Computer Modelling 24(4):1–14.
References
Addanki, S., R. Cremonini, and J. S. Penberthy. 1991. Graphs of models. Artificial
Intelligence 51:145–177.
Allison, H. 1979. Inverse unstable problems and some of their applications.
Mathematical Scientist 4:9–30.
Amaral, L. A. N., and B. Uzzi. 2007. Complex systems: A new paradigm for the inte-
grative study of management, physical, and technological systems. Management
Science 53(7):1033–1035.
Banks, J., and J. S. Carson. 1984. Discrete-Event System Simulation. Englewood Cliffs,
NJ: Prentice-Hall.
Blöschl, G., and R. Kirnbauer. 1991. Point snowmelt models with different degrees of
complexity: Internal processes. Journal of Hydrology 129:127–147.
Boehm, B. W. 1981. Software Engineering Economics. Englewood Cliffs, NJ: Prentice-
Hall.
Bosserman, R. W. 1982. Structural comparison for four lake ecosystem models. In
A General Survey of Systems Methodology: Proceedings of the Twenty-sixth Annual
Meeting of the Society for General Systems Research, ed. L. Troncale, 559–568.
Washington, DC.
Brooks, R. J. 1996. A Framework for Choosing the Best Model in Mathematical Modelling
and Simulation. PhD thesis, University of Birmingham, UK.
Brooks, R. J., D. N. Lerner, and A. M. Tobias. 1994. Determining a range of predic-
tions of a groundwater model which arise from alternative calibrations. Water
Resources Research 30(11):2993–3000.
Brooks, R. J., and S. Robinson. 2001. Simulation, with Inventory Control (author
C. Lewis), Operational Research Series. Basingstoke: Palgrave.
Brooks, R. J., M. A. Semenov, and P. D. Jamieson. 2001. Simplifying Sirius: Sensitivity
analysis and development of a meta-model for wheat yield prediction. European
Journal of Agronomy 14(1):43–60.
Brooks, R. J., and A. M. Tobias. 1999. Methods and Benefits of Simplification in
Simulation. In Proceedings of the U.K. Simulation Society (UKSIM 99), ed.
D. Al-Dabass and R. Cheng, 88–92. U.K. Simulation Society.
Brooks, R. J., and A. M. Tobias. 2000. Simplification in the simulation of manufactur-
ing systems. International Journal of Production Research 38(5):1009–1027.
Brooks, R. J., A. M. Tobias, and M. J. Lawrence. 1997a. A time series analysis of the
population genetics of the self-incompatibility polymorphism. 1. Allele fre-
quency distribution of a population with overlapping generations and variation
in plant size. Heredity 79:350–360.
Brooks, R. J., A. M. Tobias, and M. J. Lawrence. 1997b. A time series analysis of the
population genetics of the self-incompatibility polymorphism. 2. Frequency
equivalent population and the number of alleles that can be maintained in a
population. Heredity 79:361–364.
Bunge, M. 1963. The Myth of Simplicity: Problems of Scientific Philosophy. Englewood
Cliffs, NJ: Prentice-Hall.
Casti, J. L. 1979. Connectivity, Complexity, and Catastrophe in Large-Scale Systems.
New York: John Wiley and Sons.
Chaitin, G. J. 1975. Randomness and mathematical proof. Scientific American
232(May):47–52.
Costanza, R., and F. H. Sklar. 1985. Articulation, accuracy and effectiveness of
mathematical models: A review of freshwater wetland applications. Ecological
Modelling 27(1–2):45–68.
Courtois, P.-J. 1985. On time and space decomposition of complex structures.
Communications of the ACM 28(6):590–603.
Davis, P. K., and R. Hillestad. 1993. Families of models that cross levels of resolu-
tion: Issues for design, calibration and management. In Proceedings of the 1993
Winter Simulation Conference, ed. G. W. Evans, M. Mollaghasemi, E. C. Russell,
and W. E. Biles, 1003–1012. New York: IEEE.
Decoursey, D. G. 1992. Developing models with more detail: Do more algorithms give
more truth? Weed Technology 6(3):709–715.
Demarco, T. 1982. Controlling Software Projects: Management, Measurement and
Estimation. New York: Yourdon Press.
Durfee, W. K. 1993. Control of standing and gait using electrical stimulation: Influence
of muscle model complexity on control strategy. Progress in Brain Research
97:369–381.
Falkenheimer, B., and K. D. Forbus. 1991. Compositional modelling: Finding the right
model for the job. Artificial Intelligence 51:95–143.
Farmer, J. D. 1990. A rosetta stone for connectionism. Physica D 42:153–187.
Fenton, N. E. 1991. Software Metrics: A Rigorous Approach. London: Chapman and
Hall.
Fishwick, P. A. 1988. The role of process abstraction in simulation. IEEE Transactions
on Systems, Man and Cybernetics 18(1):19–39.
Flatau, M. 1995. Review Article: When order is no longer order—Organising and the
new science of complexity. Organization 2(3–4):566–575.
Flood, R. L., and E. R. Carson. 1993. Dealing with Complexity: An Introduction to the
Theory and Application of Systems Science, 2nd edition. New York: Plenum Press.
Fripp, J. 1985. How effective are models? Omega 13(1):19–28.
Garfinkel, D. 1984. Modelling of inherently complex biological systems: Problems,
strategies, and methods. Mathematical Biosciences 72(2):131–139.
Gass, S. I. 1983. What is a computer-based mathematical model? Mathematical
Modelling 4:467–472.
Gell-Mann, M. 1995. What is complexity? Complexity 1(1):16–19.
George, L. 1977. Tests for system complexity. International Journal of General Systems
3:253–258.
Golay, M. W., P. H. Seong, and V. P. Manno. 1989. A measure of the difficulty of sys-
tem diagnosis and its relationship to complexity. International Journal of General
Systems 16(1):1–23.
Palsson, B. O., and I. Lee. 1993. Model complexity has a significant effect on the
numerical value and interpretation of metabolic sensitivity coefficients. Journal
of Theoretical Biology 161:299–315.
Pidd, M. 2004. Computer Simulation in Management Science, 5th edition. Chichester:
John Wiley and Sons.
Rexstad, E., and G. S. Innis. 1985. Model simplification: Three applications. Ecological
Modelling 27(1–2):1–13.
Robinson, S. 2008. Conceptual modeling for Simulation Part I: Definition and require-
ments. Journal of the Operational Research Society 59:278–290.
Robinson, S., and M. Pidd. 1998. Provider and customer expectations of successful
simulation projects. Journal of the Operational Research Society 49:200–209.
Rosen, R. 1977. Complexity as a system property. International Journal of General
Systems 3:227–232.
Salt, J. D. 1993. Keynote address: Simulation should be easy and fun! In Proceedings
of the 1993 Winter Simulation Conference, ed. G. W. Evans et al., 1–5. New York:
IEEE.
Schruben, L. 1983. Simulation modelling with event graphs. Communications of the
ACM 26(11):957–963.
Schruben, L., and E. Yücesan. 1993. Complexity of simulation models: A graph
theoretic approach. In Proceedings of the 1993 Winter Simulation Conference, ed.
G. W. Evans et al., 641–649. New York: IEEE.
Sevinc, S. 1990. Automation of simplification in discrete event modelling and simula-
tion. International Journal of General Systems 18(2):125–142.
Shannon, R. E. 1975. Systems Simulation: The Art and Science. Englewood Cliffs, NJ:
Prentice-Hall.
Simon, H. A. 1964. The architecture of complexity. General Systems Yearbook 10:63–76.
Smith, D. E., and J. M. Starkey. 1995. Effects of model complexity on the performance
of automated vehicle steering controllers: Model development, validation and
comparison. Vehicle System Dynamics 24:163–181.
Stockle, C. O. 1992. Canopy photosynthesis and transpiration estimates using radia-
tion interception models with different levels of detail. Ecological Modelling
60(1):31–44.
Tilanus, C. B. 1985. Failures and successes of quantitative methods in management.
European Journal of Operational Research 19:170–175.
Tobias, A. M. 1991. Verification, validation and experimentation with visual inter-
active simulation models. Operational Research Tutorial Papers, The Operational
Research Society.
Ward, S. C. 1989. Arguments for constructively simple models. Journal of the Operational
Research Society 40(2):141–153.
Webster, D. B., M. L. Padgett, G. S. Hines and D. L. Sirois. 1984. Determining the
level of detail in a simulation model: A case study. Computers and Industrial
Engineering 8(3–4):215–225.
Weaver, W. 1948. Science and complexity. American Scientist 36(Autumn):536–544.
Zeigler, B. P. 1976. Theory of Modelling and Simulation. New York: John Wiley.
Zeigler, B. P. 1979. Multilevel multiformalism modeling: An ecosystem example.
In Theoretical Systems Ecology: Advances and Case Studies, ed. E. Halfon, 17–54.
New York: Academic Press.
Zeigler, B. P. 1984. Multifacetted Modelling and Discrete Event Simulation. London:
Academic Press.
3
Improving the Understanding
of Conceptual Modeling
Contents
3.1 Introduction
3.2 Study Objective
3.3 Data Collection
  3.3.1 Expert Project Data Collection
  3.3.2 Novice Projects Data Collection
3.4 Results
  3.4.1 Results for Expert
  3.4.2 Results for Novices
  3.4.3 Further Findings and Analysis
3.5 Discussion and Conclusions
Acknowledgments
References
3.1 Introduction
Conceptual modeling is a crucial stage of the simulation modeling process,
and yet it is poorly understood. Brooks and Robinson (2001) defined a conceptual
model as “a software independent description of the model that is to
be constructed.” Conceptual modeling therefore involves deciding the way
in which the virtual world of the simulation model should work (Section 2.1).
The conceptual model may be documented fully, such as in an annotated
system process flowchart, or it may only be documented partially, or even
not documented at all. In the absence of documentation, conceptual modeling
still takes place and the conceptual model comprises the combined
decisions of the project team in determining the way the model should work.
Conceptual modeling is a separate stage to model coding, which consists of
writing the computer code for the model (often using a simulation software
package). One aspect of conceptual modeling is deciding how much detail to
include in the model and Law (1991) considered that for simulation projects
The expert was asked to record the total number of hours spent each week
on different modeling topics. There was a desire to compare these results
with those of Willemain (1995) and so Willemain’s paper was used as a basis.
The expert preferred to use one of the alternative lists of topics (from Hillier
and Lieberman 1967) given in Willemain as follows (with the matching topic
according to Willemain given in brackets): Formulating the problem (context),
constructing a mathematical model (structure), deriving a solution (realiza-
tion), testing the model and solution (assessment), establishing controls over
solution (implementation), and implementing the solution (implementation).
Each week, the expert recorded the number of hours spent on each of these
topics.
The expert modeler was also interviewed each week and asked whether
and how the conceptual model had changed during the week and, if there
had been a change, about the process and reasons for changing the model.
General issues, for instance, the main task of the week and whether working
on one topic influenced the others were also discussed.
Table 3.1
Systems Modeled in the Novice Projects

Phase  Course(a)  No. Projects  Systems Modeled
1      UG         4             Food takeaway, post office, coffee shop, library book loan service points
1      PG         2             Restaurant, traffic crossing
2      UG         3             Convenience store, petrol station, library photocopiers

(a) UG = undergraduate, PG = postgraduate
whether the conceptual model had changed during the week. In this case,
a much more detailed list of topics was provided than the ones used by
Willemain (1995) so as to obtain more detailed data and to reduce the amount
of interpretation required by the students. In the subsequent analysis, the
topics were combined into our own preferred list of simulation tasks. The
topics were as follows (with the topic from our list in parentheses): identify
alternative potential projects (problem structuring), contact/interview with
the client (problem structuring), observe the system (problem structuring),
discuss with experts (problem structuring), set project objectives (problem
structuring), decide the model structure (conceptual modeling), model cod-
ing (model coding), collect data for the model (data collection and analysis),
parameter estimation and distribution fitting (data collection and analysis),
white box validation (verification and validation), black box validation (veri-
fication and validation), verification (verification and validation), experiment
with the model and analyze the result (experimentation), and report writing
(report writing).
The same data were collected in phase 2 but in an improved way. The
limitation of the method used in phase 1 is that the reliability of the data
depended on the accuracy of the students in recording the time spent and
also on how well they were able to match their tasks against the categories
provided. Also data were only recorded on a daily basis. To overcome these
drawbacks, in the phase 2 studies, the researcher (Wang Wang) sat in on
most of the student group meetings, observed their behavior and recorded
the relevant time herself in hourly intervals. Where group members con-
ducted individual work outside the meetings, they reported to the researcher
on what task they worked on and the time spent on that task. In addition, the
updated computer model was saved at the end of each group meeting so that
the changes to the model could be tracked. Collecting data in this way gives
more confidence in the reliability of the data. In both studies the hours were
not adjusted for the number of people doing each task because of the diffi-
culty in assessing the extra effort this represents. For example, two students
working together on coding the model for two hours was recorded as two
hours (rather than four).
3.4 Results
3.4.1 Results for Expert
The analysis of the data follows some of Willemain’s analysis by calcu-
lating the relative weights of the different topics, and showing a graphi-
cal representation of the topics over time. Figures 3.1 and 3.2 show these
results for the expert project, while Figure 3.3 shows the average weight
given to each topic in the 24 sessions in Willemain’s experiment measured
in number of lines in the transcripts. As Figure 3.1 shows, the expert spent
most time on modeling and testing the model. No time was spent by the
expert implementing the solution since this was carried out subsequently
by the client.

Figure 3.1
Proportion of time spent on each topic in the expert project. The topics are (with the matching
Willemain topic in parentheses): P (C) = formulating the problem (context), M (S) = constructing
a mathematical model (structure), S (R) = deriving a solution (realization), T (A) = testing
the model and solution (assessment), E (I) = establishing controls over solution (implementation),
I (I) = implementing the solution (implementation).

Figure 3.2
Timeline plot for the expert project. The topics are as in Figure 3.1. The data were collected weekly
over 10 weeks, which are shown by the vertical dashed lines.

Figure 3.3
Percentage of lines devoted to each topic in Willemain’s 24 experiments. (Redrawn from
Willemain, T. R., Operations Research, 43(6), 916–932, 1995.)
The timeline plot (Figure 3.2) shows the topics worked on during the
project. The expert project data were obtained on a weekly basis over the
10-week period of the project. Only the total number of hours spent on
the topics in each week was recorded. Since the precise timings during the
week are not known the plot spreads the topics evenly during each week.
If more than one topic was worked on during the week then this is shown
by the bars not being full height in the plot (a full height bar would reach
the horizontal line above on the plot). For example, in the second week the
expert spent a total of 10 hours working on the project, which consisted of
6 hours on formulating the problem (P) and 4 hours on constructing the
model (M). This is shown in the plot by the heights of the bars for P and M
being, respectively, 60 and 40% of a full height bar for each hour in a 10-hour
period (hours 4–13). This data collection was less detailed than Willemain’s
data obtained in a laboratory setting, where the protocol recorded what
was happening all the time. One consequence is that, where more than
one topic took place during the week, the order of and the interaction
between the topics are not known. There could have been a lot of switching
between the topics during the week or, on the other hand, the topics could
have been worked on completely separately one after the other. This pre-
vented a detailed analysis of the switching between topics as carried out by
Willemain. Nevertheless, the topic plots still give useful information about
the positions and sequence of the topics throughout the project. In particu-
lar, the extensive overlap between the topics does indicate a considerable
amount of alternation between the topics rather than a linear process. In
general, the topics were in the anticipated order with topics higher up on
the y-axis expected to be later.
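The even-spreading rule used for the timeline plots can be sketched in a few lines of Python. This is a hypothetical illustration of the calculation only; the function name is ours, and the example data are the week-2 figures quoted above, not the authors' actual plotting code:

```python
def topic_bar_heights(week_hours):
    """Spread a week's recorded hours evenly across the week: each topic's
    bar height is its share of the total hours worked that week."""
    total = sum(week_hours.values())
    if total == 0:
        return {}
    return {topic: hours / total for topic, hours in week_hours.items()}

# Week 2 of the expert project: 6 hours formulating the problem (P)
# and 4 hours constructing the model (M), 10 hours in total.
heights = topic_bar_heights({"P": 6, "M": 4})
print(heights)  # {'P': 0.6, 'M': 0.4}
```

Drawing the P and M bars at 60% and 40% of full height for each hour of the 10-hour period reproduces the week-2 pattern described above.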
Figure 3.4
Proportion of time spent on the topics in the six novice projects in phase 1. PS = problem structuring,
CM = conceptual modeling, DC = data collection, MC = model coding, VV = verification
and validation, EX = experimentation, RW = report writing.
Figure 3.5
Proportion of time spent on the topics in the three novice projects in phase 2. The topics are as
in Figure 3.4.
Figure 3.6
Timeline plot for one of the phase 2 novice projects. Topics are as in Figure 3.4.
the beginning. Another group had little understanding about warm-up and,
as a result, they had to go through the lecture notes first before they could
perform this task. Generally with these projects the groups have to collect
their own data, which is the reason for the high proportion of time on data
collection.
The timeline plot shown in Figure 3.6 is the same general format as
Figure 3.2. As already explained, most of the data for the phase 2 novice
projects was obtained by observation of the group by the researcher and was
recorded on an hourly basis. The overlapping topics in hours 1, 13, and 15 are
times when both topics were worked on during the hour. However, some of
the work was done individually by the group members and the total time
spent was just reported to the researcher. These data are therefore less detailed
with the precise interaction between the topics not known. The period from
hours 17 to 25 was not observed and instead the group members reported
spending 3 hours each on verification and validation, experimentation and
report writing. As with the expert plot (Figure 3.2), such data are shown by
spreading the topics evenly over the total period.
The plot in Figure 3.6 shows a fairly linear process. The novices
tended to complete one topic then move onto the next with not much overlap
and with very little returning to a previous topic. Most of the novice projects
were similar in this respect. Although the topic categories are different, this
is a quite different pattern to the expert (Figure 3.2) and also to the pattern in
Willemain’s (1995) experiments. Figure 3.2 shows much more overlap of top-
ics although, as previously explained, the precise pattern within each week
for the expert is not known. The overlap of the topics over several weeks
shows that there was more switching between topics for the expert than the
novices. Since there may also have been several iterations within each week
for the expert, this difference may be even more marked than is shown on
the graphs. Another comparison that can be made is that the expert started
model testing (verification and validation) much earlier than the novice
modelers who tended to leave it until after model coding was completed. As
for the expert project, the average position of the topics for the novices was
in the expected order with topics higher up the y-axis on Figure 3.6 expected
to be later.
Figure 3.7
Proportion of time spent on each topic in the expert project using a revised allocation to the
same topics as the novices. Topics are as in Figure 3.4.
systems they modeled were so simple that these tasks were straightforward
to perform. To investigate this further, the novice project reports were stud-
ied. This showed poor performance on validation and verification with this
task receiving the lowest marks on average in the assessment of the reports.
For example, two groups failed to distinguish between white box validation
and verification and out of nine projects investigated, only two groups per-
formed black box validation properly. Sometimes this was due to a lack of
planning. For instance, two groups didn’t consider collecting data for valida-
tion when planning data collection.
The conceptual modeling process can be considered in more detail based
on the discussions with the expert and observations of the phase 2 novice
groups. The expert developed the conceptual model at the beginning of
the project and documented it in a system flow diagram, which guided the
construction of the computer model. However, the novice groups devoted
little time to understanding how the systems actually worked. They did
discuss the process of the system, but rarely drew a diagram. After iden-
tifying the project they tended to go straight to collecting data with little
prior planning or consideration of the model structure. As a result, some
of the data collected proved not to be useful. This is inefficient particu-
larly as data collection is time consuming. Sometimes further discussions
on the system process occurred at the model coding stage with concep-
tual modeling and coding taking place together. Some groups only docu-
mented the conceptual model at the end in order to include a diagram in
the report.
In the expert study and both novice studies, the subjects were asked to
note any changes in the conceptual model each week and the reason. For the
expert project, there was one significant conceptual model alteration toward
the end of the project. This involved a scope reduction because the
collected data were not sufficient to support the model built. In one of the
novice groups, one student left in the middle of the project causing a change
that further research in this area will improve the success of simulation
and OR projects, and that the information can help in the training of novice
modelers.
Acknowledgments
This chapter is reproduced, with minor editing, from: Wang, W., and
R. J. Brooks. 2007. Improving the understanding of conceptual modelling.
Journal of Simulation 1(3): 153–158. © 2007 Operational Research Society Ltd.
Reproduced with permission of Palgrave Macmillan.
Some parts of this chapter are based on: Wang, W., and R.J. Brooks. 2007.
Empirical investigations of conceptual modeling and the modeling process.
In Proceedings of the 2007 Winter Simulation Conference, ed. S. G. Henderson,
B. Biller, M.-H. Hsieh, J. Shortle, J.D. Tew, and R.R. Barton, 762–770. Piscataway,
NJ: IEEE Computer Society Press.
References
Brooks, R. J. 2007. Conceptual modelling: Framework, principles, and future research.
Working paper no. 2007/011, Lancaster University Management School,
Lancaster, UK.
Brooks, R. J., and S. Robinson. 2001. Simulation, with Inventory Control (author
C. Lewis), Operational Research Series. Basingstoke: Palgrave.
Hillier, F. S., and G. J. Lieberman. 1967. Introduction to Operations Research.
San Francisco: Holden-Day.
Law, A. M. 1991. Simulation model’s level of detail determines effectiveness. Industrial
Engineering 23(10): 16–18.
Pidd, M. 2003. Tools for Thinking, Modelling in Management Science, 2nd edition.
Chichester: John Wiley and Sons.
Powell, S. G., and T. R. Willemain. 2007. How novices formulate models. Part I:
Qualitative insights and implications for teaching. Journal of the Operational
Research Society 58(8): 983–995.
Robinson, S. 1994. Simulation projects: Building the right conceptual model. Industrial
Engineering 26(9): 34–36.
Salt, J. D. 1993. Keynote address: Simulation should be easy and fun! In Proceedings
of the 1993 Winter Simulation Conference, ed. G. W. Evans et al., 1–5. New York:
IEEE.
Wang, W., and R. J. Brooks. 2007. Empirical investigations of conceptual modeling
and the modeling process. In Proceedings of the 2007 Winter Simulation Conference,
ed. S. G. Henderson, B. Biller, M.-H. Hsieh, J. Shortle, J.D. Tew, and R.R. Barton,
762–770. Piscataway, NJ: IEEE Computer Society Press.
Ward, S. C. 1989. Arguments for constructively simple models. Journal of the Operational
Research Society 40(2): 141–153.
Willemain, T. R. 1994. Insights on modeling from a dozen experts. Operations Research
42(2): 213–222.
Willemain, T. R. 1995. Model formulation: What experts think about and when.
Operations Research 43(6): 916–932.
Willemain, T. R., and S. G. Powell. 2007. How novices formulate models. Part II:
A quantitative description of behaviour. Journal of the Operational Research Society
58(10): 1271–1283.
Part II
Conceptual Modeling
Frameworks
4
A Framework for Simulation
Conceptual Modeling
Stewart Robinson
Contents
4.1 Introduction
4.2 A Framework for Developing a Conceptual Model
4.3 Understanding the Problem Situation
4.4 Determining the Modeling Objectives
4.5 Identifying the Model Outputs (Responses)
4.6 Identifying the Model Inputs (Experimental Factors)
4.7 Determining the Model Content: Scope and Level of Detail
  4.7.1 Determining the Model Scope
  4.7.2 Determining the Model Level of Detail
4.8 Identifying Assumptions and Simplifications
4.9 Identifying Data Requirements
4.10 Model Assessment: Meets the Requirements of a Conceptual Model?
4.11 Conclusion
Acknowledgments
References
4.1 Introduction
Chapter 1 set out the foundations of conceptual modeling for simulation.
It provided an understanding of current thinking on the topic and gave a
definition of a conceptual model. It also discussed the requirements for a
conceptual model: validity, credibility, utility, and feasibility (Chapter 1,
Section 5). Such discussions are useful for informing a simulation modeling
project, but they do not answer the question of how to develop a conceptual
model. That is the question addressed in this chapter whose key contribution
is to provide a framework for developing conceptual models for simulations
of operations systems (Wild 2002). This is something that is largely missing
from the current literature on simulation.
Figure 4.1
A framework for designing the conceptual model. The diagram links the problem situation
to the modeling and general project objectives, and these to the model content (scope and
level of detail); the conceptual model accepts experimental factors as inputs and provides
responses as outputs, which feed back to determine achievement of, or reasons for failure
to meet, the objectives. (Adapted from Robinson, S., Simulation: The Practice of Model
Development and Use, Wiley, Chichester, UK, 2004. With permission.)
modeling consists of five key activities that are performed roughly in this
order:
In Chapter 1 (Section 2), the problem situation at the Ford Engine Assembly plant
is described. Two models were developed: one for determining the throughput
of the plant, the other for investigating the scheduling of key components. In
order to illustrate the conceptual modeling framework, the development of a
conceptual model for the throughput problem is described. Details of the frame-
work as applied to the scheduling problem are available on request from the
author.
The reader is referred to the description of the problem situation at Ford in
Chapter 1. In this case there was a clear understanding of the problem among
the clients and domain experts; they were uncertain as to whether the required
throughput from the production facility as designed could be achieved.
The clients may not be able to provide a full set of objectives. This can
be the result of either their limited understanding of the problem situation,
or their limited understanding of simulation and what it can provide for
them. The latter might lead to the opposite problem, expecting too much
from the simulation work. Either way, the modeler should spend time educating
the client about the potential for simulation, what it can and cannot
do. The modeler should also be willing to suggest additional objectives
as well as to redefine and eliminate the objectives suggested by the cli-
ents. In this way the modeler is able to manage the expectations of the
clients, aiming to set them at a realistic level. Unfulfilled expectations are a
major source of dissatisfaction among clients in simulation modeling work
(Robinson 1998, Robinson and Pidd 1998).
As discussed above, the problem situation and the understanding of
it are not static. So too, the modeling objectives are subject to change.
Added to this, as the clients’ understanding of the potential of simula-
tion improves, as it inevitably does during the course of the study, so
their requirements and expectations will also change. This only adds to
the need for iteration between the activities in a simulation study, with
changes to the objectives affecting the design of the model, the experimen-
tation and the outcomes of the project. The two-way arrow in Figure 4.1
aims to signify the iteration between the problem situation and the mod-
eling objectives.
Figure 4.2 gives the modeling and general project objectives for the Ford through-
put model.
Organizational aim
The overall aim is to achieve a throughput of X units per day from the assembly line.
(Note: the value of X cannot be given for reasons of confidentiality.)
Modeling objectives
• To determine the number of platens required to achieve a throughput of X units per
day, or
• To identify the need for additional storage (and platens) required to achieve a
throughput of X units per day.
The second objective needs to be considered only if throughput cannot be achieved by
increasing platens alone.
General project objectives
• Time-scale: 30 working days.
• Flexibility: limited level required since extensive model changes beyond changes to
the data are not expected.
• Run-speed: many experiments may be required and so a reasonable run-speed is
important, but not at the forfeit of accuracy.
• Visual display: simple 2D animation. (The model is largely required for performing
experiments and obtaining results, communication through detailed graphics is not a
major need especially as the client is familiar with simulation. Therefore, the level of
visual display needs only to enable effective model testing and aid the diagnosis of
problems during experimentation.)
• Ease-of-use: simple interactive features will suffice since the model is for use by the
modeler.
Figure 4.2
The Ford throughput model example: Modeling and general project objectives.
In the first case, the responses can normally be identified directly from
the statement of the modeling objectives. For example, if the objective is to
increase throughput, then it is obvious that one of the responses needs to be
the throughput. For the second case, identification is a little more difficult,
but appropriate responses can be identified by a mix of the modeler’s past
experience, the clients’ understanding and the knowledge of the domain
experts. Taking the throughput example, reports on machine and resource
utilization and buffer/work-in-progress levels at various points in the model
would be useful for helping to identify potential bottlenecks. Quade (1988)
provides a useful discussion on identifying appropriate measures for the
attainment of objectives.
Once the required responses have been identified, consideration should
also be given to how the information is reported; this might impact on
the required content of the model. Options are numerical data (e.g., mean,
maximum, minimum, standard deviation) or graphical reports (e.g., time-
series, bar charts, Gantt charts, pie charts). These can be determined through
consultation between the simulation modeler, clients and domain experts.
Consideration should also be given to the requirements for model use as
outlined in the general project objectives.
Figure 4.3 shows the responses identified for the Ford throughput model. Daily
throughput is selected as the response to determine the achievement of the
Figure 4.3
The Ford throughput model example: Responses.
Although the general expectation is that the clients will have control over
the experimental factors, this is not always the case. Sometimes, it is useful to
experiment with factors over which there is little or no control, for example,
the customer arrival rate. Such experimentation can aid understanding of
the system or help plan for future events.
Where the objective of the model is, at least in part, to improve understand-
ing, the list of experimental factors may be more subtle. The modeler
needs to determine, with the clients and domain experts, what factors might
be most useful to help improve understanding.
Apart from determining the experimental factors, it is useful to identify
the range over which the experimental factors might be varied (e.g., the mini-
mum and maximum number of staff on a shift). The simulation model can
then be designed to accept this range of values, potentially avoiding a more
complex model that allows for a much wider range of data input. Methods
of data entry should also be considered, including direct entry through the model
code, model-based menus, data files, or third-party software (e.g., a spread-
sheet). The requirement depends upon the skills of the intended users of the
model and the general project objectives.
In the same way that the problem situation and modeling objectives are
not static, so the experimental factors and responses are subject to change
as a simulation study progresses. The realization that changing staff ros-
ters do not achieve the required level of performance may lead to the
identification of alternative proposals and, hence, new experimental fac-
tors. During experimentation, the need for additional reports may become
apparent. All this serves to emphasize the iterative nature of the modeling
process.
Figure 4.4 shows the experimental factors identified for the Ford through-
put model. Both of these factors are derived directly from the modeling
objectives.
Experimental factors
• The number of platens (maximum increase 100%)
• The size of the buffers (conveyors) between the operations (maximum increase
of 100%)
Figure 4.4
The Ford throughput model example: Experimental factors.
Unlike the first three components, resources are not modeled individually,
but simply as countable items. Some substitution is possible between using
resources and a more detailed approach using individual components. For
greater than the sum of the effects of removing them individually. Past expe-
rience will no doubt help in making such judgments. A cautious approach is
advised, keeping components in the model where there is some doubt over
their effect on validity.
Similarly, the effect on credibility also needs to be considered. It may be
that a component is not particularly important to the accuracy of the model,
but that its removal would damage the credibility of the model. In this case, it
should probably be included. Indeed, a wider scope (and more detail) may be
included in a model than is strictly necessary for validity, simply to increase
its credibility.
Consideration should be given to the issue of utility. The inclusion of a
component may significantly increase the complexity of a model or reduce
its run-speed. Both could reduce the utility of the model. The effect of each
component on feasibility should also be considered. It may be that the data
for modeling a component are unlikely to be available, or the complexity
of modeling a component would mean that the simulation study could not
meet its timescale.
A careful balance between validity, credibility, utility and feasibility must
be sought. Where any one (or more) of these is seen as vitally important for
a component, the component should be included in the model. If a component
appears to be of little importance to any of these, it can be excluded. In
performing Step 3 the model boundary may well become narrower as com-
ponents are excluded from the model. In Zeigler’s (1976) terms, Steps 1 and 2
are about identifying the base model (at least to the extent that it is known),
and Step 3 is about moving to a lumped model.
In order to work through these three steps, a meeting or sequence of
meetings could be arranged between the modeler, clients and domain
experts. This is probably more effective in bringing the differing expertise
together than holding meetings with smaller groups or relying
on telephone or electronic media. Step 2 could consist of a brainstorming
session, in which all parties identify potential model components with-
out debate about the need, or otherwise, to include them. It is expected
that there will be a number of iterations between the three steps before the
model scope is agreed.
The discussions about the scope of the model need to be recorded to
ensure that there is agreement over the decisions that are being made. The
records also provide documentation for model development, validation, and
reuse. A simple table format for documenting the model scope is suggested
(see Table 4.1). The first column provides a list of all the components in the
model boundary (Steps 1 and 2). The second column records the decisions
from Step 3, and the third column describes the reasoning behind the deci-
sion to include or exclude each component. Having such a record provides
a representation around which the modeler, clients and domain experts
can debate and reach an accommodation of views on what should be in the
model scope.
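The suggested table format translates directly into a simple data structure, which can be useful when the scope record is to be kept alongside the model code. A minimal sketch (the entries are taken from Table 4.1; the class and function names are our own):

```python
from dataclasses import dataclass

@dataclass
class ScopeDecision:
    component: str
    include: bool
    justification: str

# Entries in the style of Table 4.1
scope = [
    ScopeDecision("Engines", True, "Response: throughput of engines"),
    ScopeDecision("Platens", True, "Experimental factor"),
    ScopeDecision("Subcomponents", False, "Assume always available"),
]

def scope_rows(decisions):
    """Render the record as (component, include/exclude, justification) rows."""
    return [(d.component, "Include" if d.include else "Exclude", d.justification)
            for d in decisions]
```

Such a record can be versioned with the model, supporting documentation for development, validation, and reuse.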
Table 4.1
The Ford Throughput Model Example: Model Scope
Component Include/exclude Justification
Entities:
Engines Include Response: throughput of engines
Platens Include Experimental factor
Subcomponents Exclude Assume always available
Activities:
Line A Include Key influence on throughput
Head Line Include Key influence on throughput
Line B Include Key influence on throughput
Hot Test and Final Dress Exclude Limited impact on throughput as large buffer
between Line B and Hot Test
Queues:
Conveyors Include Experimental factor
Resources:
Operators Exclude Required for operation of manual processes,
but always present and provide a
standardized service. They cause no
significant variation in throughput.
Maintenance staff Include Required for repair of machines. A shortage of
staff would affect throughput
Table 4.1 shows the model scope for the Ford throughput model. This is shown
diagrammatically in Figure 4.5. The main opportunity for scope reduction comes
from the exclusion of the Hot Test and Final Dress areas.
Figure 4.5
The Ford throughput model example: Model scope shown as the shaded area.
level of detail for each entity, activity, queue and resource to be included in
the model. Table 4.2 provides a list of details that could be considered for each
component type. This is not intended to be an exhaustive list, as indicated
by the “other” category, but it does provide a useful starting point, although
restraint should be used in defining “other” details to avoid unnecessarily
long lists of clearly irrelevant details. Again, the reader may be able to think
of additional details that could be listed against each component type. These
can simply be added to those listed in Table 4.2.
The modeler, clients and domain experts can work through the details in
Table 4.2 for each component in the model scope, determining whether the
detail should be included or excluded, and also deciding on how each detail
should be modeled. In a similar fashion to the model scope, the decision on
whether to include a detail or not should be guided by its perceived effect
on the validity, credibility, utility and feasibility of the model. These deci-
sions might be made at a meeting between the modeler, clients and domain
experts. Decisions about the level of detail can be made with reference to
these requirements.
Prototyping (Powell 1995, Pidd 1999) is useful for reducing the judgmental
aspect of the decisions. In particular, the development of small computer
models to test ideas can aid decisions about the level of detail required for a
component. Indeed, prototyping can also aid decisions about model scope,
particularly through the use of high-level models in which sections of the
model can be sequentially included or excluded to determine their effect on
the responses.
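A prototype of this kind need not use full simulation software. As an illustrative sketch (all figures below are invented, not Ford data), a few lines of code can test whether a detail such as machine breakdowns has a material effect on the throughput response:

```python
import random

def throughput(hours, cycle_min, include_breakdowns,
               mtbf_min=300.0, repair_min=20.0, seed=1):
    """Crude single-machine prototype: count parts produced in `hours`,
    optionally modeling random breakdowns (exponential time between
    failures, fixed repair time)."""
    rng = random.Random(seed)
    t, produced = 0.0, 0
    horizon = hours * 60.0
    next_failure = rng.expovariate(1.0 / mtbf_min) if include_breakdowns else float("inf")
    while t + cycle_min <= horizon:
        if t >= next_failure:
            t += repair_min  # machine down for repair
            next_failure = t + rng.expovariate(1.0 / mtbf_min)
            continue
        t += cycle_min
        produced += 1
    return produced

with_bd = throughput(40, 1.5, include_breakdowns=True)
without_bd = throughput(40, 1.5, include_breakdowns=False)
# If the difference is small relative to the accuracy required,
# the breakdown detail could be excluded from the conceptual model.
```

Comparing the two runs gives a quick, quantitative basis for an include/exclude decision that would otherwise rest on judgment alone.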
Table 4.2
Template for Level of Detail by Component Type
Component Detail Description
Entities Quantity Batching of arrivals and limits to number of entities
Grouping so an entity represents more than one
item
Quantity produced
Arrival pattern How entities enter the model
Attributes Specific information required for each entity,
e.g., type or size
Routing Route through model dependent on entity type/
attributes, e.g., job shop routing
Other E.g., display style
Activities Quantity Number of the activity
Nature (X in Y out) E.g., representing assembly of entities
Cycle time
Breakdown/repair Nature and timing of breakdowns
Set-up/change-over Nature and timing of set-ups
Resources Resources required for the activity
Shifts Model working and break periods
Routing How entities are routed in and out of the activity
Other E.g., scheduling
Queues Quantity Number of the queue
Capacity Space available for entities
Dwell time Time entities must spend in the queue
Queue discipline Sequence of entities into and out of the queue
Breakdown/repair Nature and timing of breakdowns
Routing How entities are routed in and out of the queue
Other E.g., type of conveyor
Resources Quantity Number of the resource
Where required At which activities the resource is required
Shifts Working and break periods
Other E.g., skill levels, interruption to tasks
Table 4.3
The Ford Throughput Model Example: Model Level of Detail
Include/
Component Detail Exclude Justification
Entities:
Engines Quantity: produced. Model Include Response: throughput of
engines as an attribute of a engines
platen (full/empty) to
count engines produced
Arrival pattern Exclude Assume an engine block is
always available to be
loaded to the platen
Attribute: engine derivative Exclude No effect on machine cycles
and therefore no effect on
throughput
Routing Exclude Engines are only modeled as
an attribute of a platen
Platens Quantity: for Line A, Head Include Experimental factor
Line and Line B
Arrival pattern Exclude All platens are always present
on the assembly line
Attribute: full/empty Include Response: throughput of
Needed to count engines engines
produced as platen leaves
last operation on the line
Routing Exclude Routing determined by
process not platen
Activities:
Line A Quantity: quantity of Include Model individual machines as
machines for each each may have a significant
operation impact on throughput
Nature Exclude Subcomponents are not
modeled and so no assembly
is represented
Cycle time: fixed time Include Required for modeling
throughput. Assume no
variation in time for manual
processes.
Breakdown: time between Include Breakdowns are expected to
failure distribution have a significant impact on
throughput
Repair: repair time Include Breakdowns are expected to
distribution have a significant impact on
throughput
Set-up/change-over Exclude No set-ups in real facility
(Continued)
Queues:
Conveyors Quantity: 1 Include All conveyors are individual
Capacity Include Experimental factor
Dwell time: model as index Include Affects movement time and
time for platens so throughput
Queue discipline: FIFO Include Affects movement time and
so throughput
Breakdown/repair Exclude Failures are rare and so have
little effect on throughput
Routing: to next machine Include Routing of platens defines the
including routing logic to key interaction between
operations with more than system components
one machine
Type: accumulating Include Enables maximum utilization
conveyors of buffer space and so
improves throughput
Resources:
Maintenance Quantity Include Because there are fewer
staff maintenance staff than
machines, it is possible for
staff shortages to be a
bottleneck affecting
throughput
Where required: identify Include Required to allocate work to
machines that require maintenance staff
maintenance staff for repair
Shifts Exclude No work takes place outside
of on-shift time
Skill level Exclude Assume all staff can repair all
machines
Table 4.3 shows the level of detail for the Ford throughput model. Note that an
operation is the type of activity, while a machine is the equipment that performs
that operation. There is more than one machine for some operations.
Modeling assumptions
• Capacity of the buffer before hot test and final dress is sufficient to cause minimal
blockage to the assembly line from downstream processes.
• Manual operators are always present for manual processes and provide a standardized
service.
• An engine block is always available to be loaded to a platen.
• No work is carried out during off-shift periods, therefore shifts do not need to be
modeled.
• Conveyor breakdowns are rare and so have little impact on throughput.
• All staff can repair all machines.
Figure 4.6
The Ford throughput model example: Modeling assumptions.
Model simplifications
• Subcomponents are always available.
• No variation in time for manual processes.
Figure 4.7
The Ford throughput model example: Model simplifications.
One issue that is not discussed here is how to select appropriate simplifica-
tions. The identification of opportunities for simplification is largely a matter
of the experience of the modeler, although discussion between the modeler,
clients and domain experts may also provide ideas for simplification. Beyond
this, it is useful to make reference to a standard set of simplifications. A range
of simplification methods exists, such as aggregating model components,
replacing components with random variables, and excluding infrequent events.
These have been the subject of a number of publications (Morris 1967, Zeigler
1976, Innis and Rexstad 1983, Courtois 1985, Ward 1989, Robinson 1994).
Figures 4.6 and 4.7 list the assumptions and simplifications for the Ford through-
put model.
data (Pidd 2003). Contextual data are required for understanding the prob-
lem situation and as an aid to forming the conceptual model (e.g., a layout
diagram of the operations system and preliminary data on service times).
Data for model realization can be directly identified from the level of detail
table. Data for validation (e.g., past performance statistics for the operations
system, if it currently exists) need to be considered in the light of the model
that is being developed and the availability of data for the real system. Here,
we shall only consider data for model realization.
It is a fairly straightforward task to identify the data for model realization
from the level of detail table. This can be done with reference to the com-
ponents and their details that are to be included in the model. These data
split into two types: the experimental factors (inputs) and model parameters.
Experimental factors are varied during experimentation but require initial
values. Parameters are data that remain unchanged during experimentation.
Identifying the data from the level of detail table supports the idea that the
model should drive the data and not vice versa (Pidd 1999).
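This split can be made explicit in the way run inputs are organized. A sketch (the names and values below are placeholders, not actual Ford data):

```python
# Experimental factors: varied during experimentation, but given initial values
experimental_factors = {
    "platen_quantity": 20,
    "conveyor_capacity": 10,
}

# Parameters: data that remain unchanged during experimentation
parameters = {
    "cycle_time_min": 1.5,
    "repair_time_mean_min": 20.0,
}

def model_inputs(factor_overrides=None):
    """Combine fixed parameters with (possibly overridden) factor values
    to form the complete input set for one experimental run."""
    inputs = dict(parameters)
    inputs.update(experimental_factors)
    inputs.update(factor_overrides or {})
    return inputs
```

Each experimental run then changes only the factor overrides, while the parameter block stays untouched, which mirrors the distinction drawn in the text.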
Once the data for model realization are identified, responsibility for obtain-
ing the data should be allocated with clear direction over the time when the
data need to be available. Of course, some data may already be available,
other data may need to be collected and some may be neither available nor
collectable. Lack of data does not necessitate abandonment of the project.
Data can be estimated and sensitivity analysis can be performed to under-
stand the effect of inaccuracies in the data. Even where data are available or
can be collected, decisions need to be made about the sample size required
and care must be taken to ensure the data are sufficiently accurate and in the
right format. For a more detailed discussion on data collection see Robinson
(2004).
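Where a value has had to be estimated, a simple sweep around the estimate shows how much the response depends on it. A sketch (the response function below is a hypothetical stand-in for a full simulation run):

```python
def sensitivity(run_model, estimate, spread=0.5, points=5):
    """Evaluate the model response as an estimated input is varied
    +/- `spread` (as a fraction) around `estimate`."""
    lo, hi = estimate * (1 - spread), estimate * (1 + spread)
    step = (hi - lo) / (points - 1)
    values = [lo + i * step for i in range(points)]
    return [(v, run_model(v)) for v in values]

# Hypothetical response: throughput falling as mean repair time grows
results = sensitivity(lambda repair: 1000 / (1 + 0.05 * repair), estimate=20.0)
```

If the response barely moves across the sweep, the inaccuracy in the estimated datum matters little; if it moves substantially, more effort on that datum is warranted.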
If data cannot be obtained, it may be possible to change the design of the
conceptual model so that these data are not required. Alternatively, the mod-
eling objectives could be changed such that an alternative conceptual model
is developed that does not require the data in question. During data col-
lection it is almost certain that various assumptions will have to be made
about the data; these assumptions should be recorded along with those iden-
tified from the conceptual model. This all serves to increase the iteration in
the modeling process, with the conceptual model defining the data that are
required and the availability of the data defining the conceptual model. In
practice, of course, the modeler, clients and domain experts are largely cog-
nizant of the data that are available when making decisions about the nature
of the conceptual model.
Figure 4.8 shows the data that are required for the Ford throughput model. These
have been identified from the details of the included components in the level of
detail table (Table 4.3).
Data requirements
• Planned quantity of platens on each assembly line
• Machines: quantity for each operation, cycle time, time between failure distribution,
repair time distribution, routing rules (e.g., percentage rework after a test station)
• Conveyors: capacity, index time for a platen, routing rules (e.g., split to parallel
machines)
• Maintenance staff: quantity, machines required to repair
Figure 4.8
The Ford throughput model example: Data requirements for model realization.
the assumptions and simplifications. Ultimately the modeler and the clients
must have confidence in the conceptual model, reflected in the validity and
credibility of the conceptual model, respectively.
The utility of the conceptual model is “a perception, on behalf of the modeler
and the clients, that the conceptual model can be developed into a computer
model that is useful as an aid to decision-making within the specified con-
text” (Chapter 1, Section 5). Issues to consider are the ease-of-use, flexibility,
run-speed, visual display, and potential for model/component reuse. These
requirements are expressed through the general project objectives. All must
be of a sufficient level to satisfy the needs of the project. For instance, if the
model is to be used by the modeler for experimentation, then ease-of-use is of
less importance than if the model is to be used by the clients or a third party.
The final requirement, feasibility, is “a perception, on behalf of the modeler
and the clients, that the conceptual model can be developed into a computer
model with the time, resource and data available” (Chapter 1, Section 5). Can
the model be developed and used within the time available? Are the neces-
sary skills, data, hardware and software available? The modeler, clients and
domain experts need to discuss these issues and be satisfied that it is pos-
sible to develop and use the conceptual model as proposed.
It may be useful for the modeler to generate several conceptual model
descriptions and then to compare them for their validity, credibility, utility
and feasibility. The model that is perceived best across all four requirements
could then be selected for development.
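Such a comparison can be made a little more systematic with a simple scoring exercise, although the underlying judgments remain qualitative. A sketch (the scores, weights, and candidate names are invented):

```python
REQUIREMENTS = ("validity", "credibility", "utility", "feasibility")

def best_model(candidates, weights=None):
    """Pick the candidate conceptual model with the highest weighted score
    across validity, credibility, utility, and feasibility (each scored 1-5)."""
    weights = weights or {r: 1.0 for r in REQUIREMENTS}
    def score(c):
        return sum(weights[r] * c[r] for r in REQUIREMENTS)
    return max(candidates, key=score)

candidates = [
    {"name": "simple model", "validity": 3, "credibility": 3,
     "utility": 5, "feasibility": 5},
    {"name": "detailed model", "validity": 5, "credibility": 5,
     "utility": 2, "feasibility": 2},
]
chosen = best_model(candidates)
```

The weights offer a way of recording that, for a given study, one requirement (say, feasibility under a tight timescale) matters more than the others.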
All of the above is contingent on being able to express the conceptual model
in a manner that can be shared and understood by all parties involved in a
simulation study. In the terms of Nance (1994), this requires the expression of
the modeler’s mental conceptual model as a communicative model. The tables
derived in the conceptual modeling framework described above provide one
means for communicating the conceptual model; see Figures 4.2 through 4.4,
4.6, and 4.7, and Tables 4.1 and 4.3. Beyond this, diagrammatic representations
of the model are also useful (Figures 4.5 and 4.9), and possibly more beneficial
Figure 4.9
An illustrative process flow diagram of part of the Ford throughput conceptual model.
4.11 Conclusion
The conceptual modeling framework described above provides a series of
iterative activities for helping a modeler to design a conceptual model for
a specific problem situation. Each activity is documented with a table sum-
marizing the decisions made. The use of these tables (along with diagram-
matic representations of the model), provides a means for communicating
and debating the conceptual model with the clients and domain experts. As
a result, it provides a route to agreeing upon the nature of the simulation
model that is required to intervene in the problem situation.
In conclusion, we consider the question of whether there is a right con-
ceptual model for any specified problem. For two reasons, the answer is
“no.” First, we have identified conceptual modeling as an art. Albeit that the
framework above provides some discipline to that art, different modelers
will not come to the same conclusions. Any other expectation would be akin
to expecting an art class to paint exactly the same picture of the same subject.
There has to be room for creativity in any art, including conceptual mod-
eling. There are, of course, better and worse conceptual models. The four
requirements of a conceptual model (validity, credibility, utility, and feasibil-
ity) provide a means for distinguishing better from worse.
A second reason why there is no right conceptual model is that the
model is an agreement between more than one person (the modeler, clients,
and domain experts). Each has his/her own preferences for and perceptions
of what is required. These preferences and perceptions are expressed through
Acknowledgments
This chapter is reproduced, with minor editing, from: Robinson, S. 2008.
Conceptual modelling for simulation part II: A framework for conceptual
modelling. Journal of the Operational Research Society 59 (3): 291–304. © 2008
Operational Research Society Ltd. Reproduced with permission of Palgrave
Macmillan.
Some sections of this chapter are based on the following:
The Ford engine plant example is used with the permission of John Ladbrook,
Ford Motor Company.
References
Balci, O., and R.E. Nance. 1985. Formulated problem verification as an explicit require-
ment of model credibility. Simulation 45 (2): 76–86.
Balci, O., J.D. Arthur, and R.E. Nance. 2008. Accomplishing reuse with a simulation
conceptual model. In Proceedings of the 2008 Winter Simulation Conference, ed.
S.J. Mason, R.R. Hill, L. Mönch, O. Rose, T. Jefferson, and J.W. Fowler, 959–965.
Piscataway, NJ: IEEE.
Baldwin, L.P., T. Eldabi, and R.J. Paul. 2004. Simulation in healthcare management:
A soft approach (MAPIU). Simulation modelling practice and theory 12 (7–8):
541–557.
Checkland, P.B. 1981. Systems Thinking, Systems Practice. Chichester: Wiley.
Courtois, P.J. 1985. On time and space decomposition of complex structures.
Communications of the ACM 28 (6): 590–603.
Crapo, A.W., L.B. Waisel, W.A. Wallace, et al. 2000. Visualization and the process of mod-
eling: a cognitive-theoretic view. In Proceedings of the Sixth ACM SIGKDD
International Conference on Knowledge Discovery and Data Mining, ed. R. Ramakrishnan,
S. Stolfo, R. Bayardo, and I. Parsa, 218–226. New York: ACM Press.
Eden, C., and F. Ackermann. 2001. SODA: The principles. In Rational analysis for a
problematic world revisited, 2nd edition, ed. J.V. Rosenhead and J. Mingers, 21–41.
Chichester: Wiley.
Ferguson, P., W.S. Humphrey, S. Khajenoori, et al. 1997. Results of applying the per-
sonal software process. Computer 30 (5): 24–31.
Hills, P.R. 1971. HOCUS. Egham, Surrey: P-E Group.
Hodges, J.S. 1991. Six (or so) things you can do with a bad model. Operations research
39 (3): 355–365.
Innis, G., and E. Rexstad. 1983. Simulation model simplification techniques. Simulation
41 (1): 7–15.
Kotiadis, K. 2007. Using soft systems methodology to determine the simulation study
objectives. Journal of simulation 1 (3): 215–222.
Lehaney, B., and R.J. Paul. 1996. The use of soft systems methodology in the develop-
ment of a simulation of out-patient services at Watford General Hospital. Journal
of the operational research society 47 (7): 864–870.
Little, J.D.C. 1994. Part 2: On model building. In Ethics in modeling, ed. W.A. Wallace,
167–182. Amsterdam: Elsevier (Pergamon).
Morris, W.T. 1967. On the art of modeling. Management science 13 (12): B707–717.
Nance, R.E. 1994. The conical methodology and the evolution of simulation model
development. Annals of operations research 53: 1–45.
Nance, R.E., and C.M. Overstreet. 1987. Diagnostic assistance using digraph repre-
sentation of discrete event simulation model specifications. Transactions of the
society for computer simulation 4 (1): 33–57.
Pidd, M. 1999. Just modeling through: a rough guide to modeling. Interfaces 29 (2):
118–132.
Pidd, M. 2003. Tools for Thinking: Modelling in management science, 2nd ed. Chichester:
Wiley.
Pidd, M. 2004. Computer Simulation in Management Science, 5th ed. Chichester: Wiley.
Pooley, R.J. 1991. Towards a standard for hierarchical process oriented discrete event
diagrams. Transactions of the society for computer simulation 8 (1): 1–41.
Powell, S.G. 1995. Six key modeling heuristics. Interfaces 25 (4): 114–125.
Quade, E.S. 1988. Quantitative methods: uses and limitations. In Handbook of Systems
Analysis: Craft Issues and Procedures, ed. H.J. Miser and E.S. Quade, 283–324.
Chichester: Wiley.
Richter, H., and L. März. 2000. Toward a standard process: The use of UML for design-
ing simulation models. In Proceedings of the 2000 Winter Simulation Conference,
ed. J.A. Joines, R.R. Barton, K. Kang, and P.A. Fishwick, 394–398. Piscataway,
NJ: IEEE.
Robinson, S. 1994. Simulation projects: Building the right conceptual model. Industrial
engineering 26 (9): 34–36.
Robinson, S. 1998. Measuring service quality in the process of delivering a simulation
study: the customer’s perspective. International transactions in operational research
5 (5): 357–374.
Robinson, S. 1999. Simulation verification, validation and confidence: a tutorial.
Transactions of the society for computer simulation international 16 (2): 63–69.
Robinson, S. 2001. Soft with a hard centre: Discrete-event simulation in facilitation.
Journal of the operational research society 52 (8): 905–915.
Robinson, S. 2002. General concepts of quality for discrete-event simulation. European
journal of operational research 138 (1): 103–117.
Robinson, S. 2004. Simulation: The Practice of Model Development and Use. Chichester,
UK: Wiley.
Robinson, S., and M. Pidd. 1998. Provider and customer expectations of successful
simulation projects. Journal of the operational research society 49 (3): 200–209.
Ryan, J., and C. Heavey. 2006. Requirements gathering for simulation. In Proceedings
of the Operational Research Society Simulation Workshop (SW06), ed. J. Garnett, S.C.
Brailsford, S. Robinson, and S.J.E. Taylor, 175–184. Birmingham: Operational
Research Society.
Schruben, L.W. 1983. Simulation modeling with event graphs. Communications of the
ACM 26 (11): 957–963.
Som, T.K., and R.G. Sargent. 1989. A formal development of event graphs as an aid to
structured and efficient simulation programs. ORSA journal on computing 1 (2):
107–125.
Sterman, J.D. 2000. Business Dynamics: Systems Thinking and Modeling for a Complex
World. New York: Irwin/McGraw-Hill.
van der Zee, D.J. 2006. Building communicative models: A job oriented approach
to manufacturing simulation. In Proceedings of the Operational Research Society
Simulation Workshop (SW06), ed. J. Garnett, S.C. Brailsford, S. Robinson, and
S.J.E. Taylor, 185–194. Birmingham: Operational Research Society.
Ward, S.C. 1989. Arguments for constructively simple models. Journal of the operational
research society 40 (2): 141–153.
Wild, R. 2002. Operations Management, 6th ed. London: Continuum.
Zeigler, B.P. 1976. Theory of Modeling and Simulation. New York: Wiley.
5
Developing Participative Simulation
Models: Framing Decomposition
Principles for Joint Understanding
Contents
5.1 Introduction................................................................................................. 104
5.2 Literature Review: Seeking Discipline in Model Creation................... 106
5.3 On the Construction of a Modeling Framework.................................... 109
5.3.1 Model and Experimental Frame................................................... 109
5.3.2 Modeling Framework..................................................................... 109
5.3.3 Decomposition Principles.............................................................. 110
5.3.3.1 I External and Internal Entities...................................... 110
5.3.3.2 II Movable and Nonmovable Entities............................ 110
5.3.3.3 III Queues and Servers.................................................... 111
5.3.3.4 IV Intelligent and Nonintelligent Entities.................... 111
5.3.3.5 V Infrastructure, Flows, and Jobs.................................. 111
5.3.3.6 VI Modality: Physical, Information, and
Control Elements.............................................................. 111
5.3.3.7 VII Dynamics: Executing Jobs........................................ 112
5.3.4 Engineering the Framework: Framing
Decomposition Principles.............................................................. 112
5.3.4.1 Main Classes and Their Hierarchies............................. 112
5.3.4.2 Class Definitions: Agents................................................ 115
5.3.4.3 Relationships between Agents....................................... 116
5.3.4.4 Dynamics Structure: Agents Executing Jobs............... 117
5.4 Applying the Modeling Framework: Enhancing Participation........... 117
5.4.1 Case: Repair Shop........................................................................... 118
5.4.1.1 Introduction...................................................................... 118
5.4.1.2 Objectives of the Study.................................................... 118
5.4.1.3 System Description.......................................................... 118
5.4.1.4 Conceptual Modeling...................................................... 119
5.4.1.5 Model Coding................................................................... 119
5.1 Introduction
A few decades ago, Hurrion introduced the notion of visual interactive
simulation (Hurrion 1976, 1989). Its basic contribution lies in the fact that it
brings analysts and stakeholders together by means of an animated display.
As such, it facilitates joint discussion on model validation/verification
and, maybe even more important, on alternative and possibly better solutions
to the problem that the modeling and simulation project is meant to
solve (Bell and O’Keefe 1987, Bell et al. 1999). Refinement of the approach is
possible by building on principles of object-oriented design (Booch 1994). Object
orientation, originally developed for the early simulation language Simula™
(Dahl and Nygaard 1966), was reembraced as a metaphor for simulation modeling
in the 1990s. It offers a natural one-to-one mapping of real-world
concepts to modeling constructs (Glassey and Adiga 1990, Kreutzer 1993,
Roberts and Dessouky 1998).
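The one-to-one mapping might be sketched as follows (the class and attribute names are illustrative, not from any particular simulation library):

```python
class Part:
    """A real-world part mapped directly to a modeling construct."""
    def __init__(self, part_id):
        self.part_id = part_id

class Machine:
    """A real-world machine mapped directly to a modeling construct."""
    def __init__(self, name, cycle_time):
        self.name = name
        self.cycle_time = cycle_time
        self.queue = []          # parts waiting in front of the machine

    def accept(self, part):
        self.queue.append(part)

# Each real-world entity corresponds to exactly one object in the model
m = Machine("Head Line station", cycle_time=1.5)
m.accept(Part(1))
```

It is this direct correspondence between domain concepts and model objects that makes an object-oriented model easier for stakeholders to recognize and discuss.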
Visual interaction and object orientation set rough guidelines for build-
ing a “conceptual” model for simulation (Balci 1986). Typically, a concep-
tual model is meant to facilitate the joint search for better-quality solutions,
building on a common understanding of the problem and system at hand.
This implies that the conceptual model should be both transparent and complete
to all parties involved in the study.
Clearly, model visualization greatly popularized the use of modeling and
simulation for systems design. This popularity makes clear that facilitating
stakeholders’ involvement in the simulation study is crucial to the acceptance
of modeling and simulation as a decision support tool. It may be expected
that such “facilitation” is even more important nowadays as systems design
often involves multiple problem owners as in, for example, supply chains,
health-care chains, and transportation networks. Moreover, the complexity
of such systems makes problem owners’ participation in the search for
better solutions indispensable, given their role as domain experts. The challenge
for the analyst is to contribute to, and guide, this process, aiming to
build mutual trust and to foster the joint creation and acceptance of good
Figure 5.1
Modeling framework: Role in supporting simulation model development. (The figure relates the analyst and stakeholders to model building, supported by a modeling framework, i.e., a conceptual view on a domain, a diagramming technique, and a simulation tool.)
Second, and more specifically, we aim to turn the proposed modeling framework
for manufacturing simulation into an open architecture amenable to
improvements, extensions, and refinements.
(Pidd 1999, Law and Kelton 2000), domain related insights (Valentin and
Verbraeck 2005), and—last but not least—logic and libraries underlying sim-
ulation software (Kreutzer 1986). Let us consider these guidelines in somewhat
more detail, starting from a recent survey by Robinson (2006, 2008).
Robinson distinguishes between three basic approaches to simulation
model development: principles of modeling, methods of simplification, and
modeling frameworks. Here we characterize each of the approaches. For
related references please consult the work of Robinson. Principles of modeling
refer to the general case of conceptual modeling. Guiding principles include
the need for model simplicity, the advocated policy of incremental mode-
ling, and the good use of metaphors, analogies, and similarities in model
creation. Methods of simplification focus on the possibility of reducing model
scope and/or the level of detail for model elements, starting from their rel-
evance for model accuracy. Gains may, for example, be realized by combin-
ing model elements, leaving them out or adapting their attributes. Clearly,
these methods are helpful in model construction by pointing at possibilities
for model pruning. However, they do not address model creation in terms
of what is to be modeled. Modeling frameworks specify a procedural approach to detailing a model in terms of its elements, their attributes, and their relationships. Examples include the general case of systems representation and domain-related cases. The general case of systems representation envisions a conceptualization built on elementary system elements, i.e., components, together with their variables, parameters, and mutual relationships; see Shannon (1975). Such representations are reflected in basic diagramming techniques, for example, Petri Nets, Activity Cycle Diagrams (Pooley 1991), and Event Graphs (Schruben 1983). Domain-related cases refer primarily
to the military field, see Nance (1994). Outside this domain, examples are
scarce. Guru and Savory (2004) propose a framework for modeling physical
security systems. Also, our previous work on the modeling framework for manufacturing systems (van der Zee and van der Vorst 2005, van der Zee 2006a) may be included in this category.
Next to principles, methods, and frameworks, the analyst may be guided—or restricted—in his conceptual modeling efforts by the simulation software adopted for the project. Pidd (1998) distinguishes between several types of software, ranging from general-purpose languages to visual interactive modeling systems (VIMS). While the former category does not provide a conceptual notion or basis for modeling, the latter category is tailored toward simulation use, assuming model building to be based on an elaborate library of building blocks, which may be domain related to a certain degree. Further, we mention a specific class of simulation modeling tools based on elementary concepts like, for example, DEVS (Zeigler 1990) and Petri Nets (Murata 1989). Where VIMS offer contextually rich libraries, these tools force the user to build models from a small set of elementary components. However, whereas the logic underlying library setup for VIMS may be found in a pragmatic and evolutionary path, libraries for tools like
5.3.2 Modeling Framework
The construction of the modeling framework proceeds in two phases: (1) a domain analysis resulting in a set of decomposition principles, and (2) an engineering process in which the decomposition principles are “framed” in terms of high-level class definitions for the field; see Figure 5.2.
Figure 5.2
Construction of a modeling framework. (The figure shows a domain subjected to analysis, yielding decomposition principles, which framing turns into a modeling framework.)
Table 5.1
Decomposition Principles Underlying the Modeling
Framework for Manufacturing Simulation
I External and internal entities (system boundary)
II Movable and nonmovable entities
III Queues and servers
IV Intelligent and nonintelligent entities
V Infrastructure, flows, and jobs
VI Modality: physical, information, and control elements
VII Dynamics: executing jobs
5.3.3 Decomposition Principles
The structure for a model concerns two types of elements (Ören and Zeigler
1979):
Table 5.2
Framing Decomposition Principles in Class Definitions, Class Hierarchies, and
Class Relationships
(The table presents diagrammatic class definitions and relationships, including Agent and Job class hierarchies with internal and external agents, e.g., <A: Agent>, <C: Job>, <E: External agent – customer>, and <F: External agent – supplier>; building blocks such as input and output buffers, local intelligence, transformers, generators, and annihilators; controllers F(I|D), F(C), and F(M); and flows of goods, resources, data, and primary flow items across the system boundary.)
their input, processing conditions and the agents to whom the resulting out-
put should be sent.
In a manufacturing system, agents and flows are linked by jobs, which describe manufacturing activities (V; C). In our job-oriented worldview, each manufacturing activity is referred to as a job, which is the responsibility of a specific agent. In turn, a job concerns a comprehensive set of activities, i.e., transformations linking a set of flow items and agent resources. Note that a job definition is an intrinsic element of this set of flow items, which influences both the timing and the characteristics of jobs, see above.
It is common practice to think of agents in terms of the type of flow items
that are the subject of their jobs. In line with this practice, it is possible to define more specific classes of internal agents, where the type of flow item serves as
a parameter. For example, a workstation may be considered an internal agent
of a processor type handling goods. In a similar way control systems and
decision-makers may be defined as internal agents producing job definitions.
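The idea of parameterizing internal agent classes by the type of flow item they handle can be sketched in code. The following Python fragment is purely illustrative; the class and attribute names are our own and are not part of the framework:

```python
from dataclasses import dataclass, field

@dataclass
class FlowItem:
    """Base class for items that flow between agents."""
    name: str

@dataclass
class Goods(FlowItem):
    pass

@dataclass
class JobDefinition(FlowItem):
    pass

@dataclass
class InternalAgent:
    """An internal agent whose jobs handle one type of flow item."""
    name: str
    flow_item_type: type = FlowItem
    input_buffer: list = field(default_factory=list)

    def receive(self, item: FlowItem) -> None:
        """Accept a flow item of the agent's type into its input buffer."""
        if not isinstance(item, self.flow_item_type):
            raise TypeError(f"{self.name} only handles {self.flow_item_type.__name__}")
        self.input_buffer.append(item)

# A workstation is an internal agent of a processor type handling goods;
# a control system or decision-maker produces job definitions instead.
workstation = InternalAgent("workstation", Goods)
planner = InternalAgent("planner", JobDefinition)
workstation.receive(Goods("engine-7"))
```

The flow item type acts here as the parameter distinguishing otherwise identical agent classes, mirroring the text above.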
5.4.1.3 System Description
The activities of the company are driven by yacht owners' need for preventive or corrective maintenance of engines. This is reflected in an irregular pattern of engine arrivals at the shop. The initial activity at the shop is the inspection of the engine to determine the need for replacement parts. The replacement parts are ordered from the company's internal warehouse (Figure 5.3). Inspection also serves to make a first estimate of
Figure 5.3
Repair shop: System description. (The figure shows the repair station within the shop layout.)
the workload associated with an engine, i.e., the time needed to complete repair activities. Workload estimates are reported to the shop planner. The repair station consists of a number of identical and autonomous work cells. Each cell is capable of repairing one engine at a time. A planner is responsible for assigning repair jobs to work cells. He makes a new schedule on a weekly basis, currently applying a first-come, first-served (FCFS) policy. Work orders for new repair jobs are released by the planner to the shop at the start of each planning period and in response to feedback from the repair shop on jobs being completed. An alternative scheduling policy would be a shortest processing time (SPT) rule, whereas the planning frequency may be changed from weekly to daily.
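The effect of switching from FCFS to SPT sequencing can be illustrated with a small sketch; the workload figures below are invented for illustration and are not taken from the case:

```python
# Repair jobs in arrival order with the planner's workload estimates
# (hours); the figures are invented for illustration.
jobs = [("engine-1", 8.0), ("engine-2", 2.0), ("engine-3", 5.0)]

def mean_flow_time(sequence):
    """Mean time a job spends in the shop when jobs run back to back
    on a single work cell."""
    clock, total = 0.0, 0.0
    for _, workload in sequence:
        clock += workload
        total += clock
    return total / len(sequence)

fcfs = list(jobs)                              # first come, first served
spt = sorted(jobs, key=lambda job: job[1])     # shortest processing time first

print(mean_flow_time(fcfs))  # 11.0
print(mean_flow_time(spt))   # 8.0
```

As the sketch suggests, SPT reduces mean flow time relative to FCFS for the same job set, which is precisely the kind of scenario comparison the simulation study supports.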
5.4.1.4 Conceptual Modeling
As a first step in modeling we define a conceptual model for the repair
shop. For building the conceptual model we used the system description as
a starting point. In practice, other sources of information may also be relevant, for example, visual observations or drafts of the system under study, and domain knowledge. The conceptual model is set up according to the high-level definitions developed in our manufacturing domain modeling framework. Figure 5.4 displays agent definitions for two key subsystems: the planner and the repair station. Essentially, the planner is responsible
for two types of jobs: (1) the building of a schedule, and (2) the release of
job definitions. The first type of job is executed according to a prespecified
time interval, and uses messages reporting on arriving engines, the actual
schedule—as kept by the planner—and information on shop status as an
input. This is reflected in the definition of buffers. The second type of job
is triggered by the messages of the repair station reporting jobs’ comple-
tion. The repair station is responsible for a single type of job—the repair of
engines. The repair of an engine is allowed to start in a free work cell, if both
a job definition is received from the planner, and the engine identified in the
job definition is available.
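The planner's two job types, time-triggered schedule building and message-triggered release, might be sketched as follows; the Python names and the placeholder scheduling policy are assumptions for illustration only:

```python
class Planner:
    """Sketch of the planner agent with its two job types."""

    def __init__(self, period=7.0):          # weekly planning period (days)
        self.period = period
        self.schedule = []                   # the actual schedule, kept by the planner
        self.released = []                   # job definitions released to the shop

    def build_schedule(self, now, arrived_engines, shop_status):
        """Job type 1: executed according to a prespecified time interval."""
        self.schedule = sorted(arrived_engines)   # placeholder scheduling policy
        return now + self.period                  # next scheduling instant

    def on_job_completed(self, job_id):
        """Job type 2: triggered by a completion message from the repair station."""
        if self.schedule:
            self.released.append(self.schedule.pop(0))

planner = Planner()
next_run = planner.build_schedule(0.0, ["engine-2", "engine-1"], shop_status={})
planner.on_job_completed("engine-0")
print(planner.released, next_run)  # ['engine-1'] 7.0
```

The two methods correspond to the two triggering mechanisms described above: a prespecified time interval and incoming completion messages.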
5.4.1.5 Model Coding
For coding the repair shop, a class library is built containing the class definitions for flow items, agents, and jobs (Figure 5.5). These classes are the essential building blocks for the aggregate class RepairShop. In more complex shops it
may be worthwhile to introduce more aggregate classes representing hierar-
chical levels in modeling. Such classes help to improve model overview. All
classes are built starting from the basic class library of EM-Plant®* that covers
the class definitions contained in the folders MaterialFlow, InformationFlow,
UserInterface, and MUs.
Figure 5.4
Conceptual model: Agents Planning and RepairStation. (The figure depicts each agent's input and output buffers, local intelligence, transformer, and control queue, with flows of engines, repair jobs, and messages on job completion.)
Let us now discuss the implementation of the three main classes FlowItems,
Agents, and Jobs by giving a number of examples. Subsequently, we will
relate the classes and model dynamics by considering the internal structure
and workings of an agent:
Figure 5.5
Class library and class RepairShop.
Figure 5.6
Coded model: Agents Planning and RepairStation.
Model changes such as the number of repair stations are facilitated by many simulation languages by means of parameter settings of default building blocks. Implementation of the second and third scenarios may imply a somewhat greater appeal to the logic of the modeling framework, as they refer to nonstandard language features. Moreover, where the above scenarios do not directly involve the (control) structure of the system, others may do so. This may involve the distribution of job classes over the
agents or the number of agents involved. For example, where the default
shop model assumes one agent to be responsible for both release and
scheduling of the repair station, in an alternative setting there may be
two specialized agents each responsible for one task. This separation of
tasks resembles different levels of shop control. In a similar way, the tasks of, for example, workstations may be redistributed. For a major part, model flexibility with respect to the representation of control is the net result of a natural and explicit mapping of concepts: knowing where to look and where to make the change.
model structure and behavior, in terms of its basic elements and their work-
ings, which may make a difference in stakeholders’ model understanding
and their participation in decision support. This makes the identification and study of modeling frameworks and the underlying rules for model construction, i.e., decomposition principles, worthwhile.
In this chapter we review a modeling framework for modeling manufac-
turing systems as we proposed it in our earlier work. So far this framework
has been presented and applied without highlighting the way it has been
constructed. In this chapter we do so for two reasons:
the use of the modeling framework for creating transparent models relative to ad hoc approaches. Basically, ad hoc approaches may violate elementary decomposition principles, like, for example, the separation of control, information, and physical elements, and a well-defined notion of jobs and their dynamics.
Some interesting directions for future research include the detailing
and deepening of the engineering approach underlying modeling frame-
works and their application. For example, the approach may be related to
the concept of reference models (see, for example, Biemans 1990 for the
manufacturing field), and to principles of systems engineering and software engineering. Given the insights obtained into the engineering approach, the development of modeling frameworks for domains other than manufacturing may be considered. Another direction may concern the use of modeling frameworks in facilitating simulation models that assume higher
levels of user participation, such as gaming (van der Zee and Slomp 2005,
2009). Further, the notion of decomposition principles may be helpful in
setting up libraries of building blocks, or the validation of existing libraries
to support their conceptual renewal and improvement. Last but not least,
the modeling framework should be further validated by testing it on real-
world models.
Acknowledgment
This chapter is reproduced, with editing, from: van der Zee, D.J. 2007.
Developing participative simulation models: Framing decomposition prin-
ciples for joint understanding. Journal of Simulation, 1: 187–202. Reproduced
with permission from Palgrave.
Appendix Notation
For defining the modeling framework, we adopt the class diagrams as pro-
posed by Booch (1994). Class diagrams describe the class structure of a sys-
tem. They consist of two basic elements: object classes (for example, machines
and employees) and their mutual associations. Classes are described by their
name, attributes, and operations (Figure 5.A1).
Attributes are used to describe the state of the object belonging to a
class. A change of an attribute value corresponds to a state change for
the object. Operations refer to the services provided by a class. Operations
may change the state of an object (for example, withdraw one item from
Figure 5.A1
Class notation. (The notation covers classes and their relationships: association, using, and parameterization.)
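In code terms, the distinction between attributes (which hold state) and operations (services that may change state) can be illustrated with a made-up buffer class; it is our own example and not part of the notation:

```python
class Buffer:
    """Illustrative class: attributes hold state, operations provide services."""

    def __init__(self, capacity):
        self.capacity = capacity       # attribute: a fixed parameter
        self.items = []                # attribute: its value is the object's state

    def put(self, item):               # operation: changes the object's state
        if len(self.items) >= self.capacity:
            raise OverflowError("buffer full")
        self.items.append(item)

    def withdraw(self):                # operation: withdraw one item
        return self.items.pop(0)

b = Buffer(capacity=2)
b.put("engine")
print(b.withdraw())  # engine
```

Calling `put` or `withdraw` changes the value of `items`, which corresponds exactly to a state change of the object in the class-diagram sense.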
References
Balci, O. 1986. Credibility assessment of simulation results. In Proceedings of the 1986
Winter Simulation Conference, 209–222. Piscataway, NJ: IEEE.
Bell, P.C., and R.M. O’Keefe. 1987. Visual Interactive Simulation: History, recent
developments, and major issues. Simulation 49(3): 109–116.
Bell, P.C., C.K. Anderson, D.S. Staples, and M. Elder. 1999. Decision-makers’ per-
ceptions of the value and impact of visual interactive modeling. Omega: The
International Journal of Management Science 27: 155–165.
130 Conceptual Modeling for Discrete-Event Simulation
Nance, R.E. 1994. The conical methodology and the evolution of simulation model development. Annals of Operations Research 53: 1–45.
Ören, T.I., and B.P. Zeigler. 1979. Concepts for advanced simulation methodologies.
Simulation 32(3): 69–82.
Pegden, C.S., Shannon, R.E., and R.P. Sadowski. 1990. Introduction to Simulation Using
SIMAN. New York: McGraw-Hill.
Pratt, D.B., Farrington, P.A., Basnet, C.B., Bhuskute, H.C., Kanath, M., and J.H. Mize. 1994. The separation of physical, information, and control elements for facilitating reusability in simulation modelling. International Journal of Computer Simulation 4(3): 327–342.
Pidd, M. 1998. Computer Simulation in Management Science. Chichester: Wiley.
Pidd, M. 1999. Tools for Thinking: Modelling in Management Science. 2nd edition. Chichester: Wiley.
Pooley, R.J. 1991. Towards a standard for hierarchical process oriented discrete event diagrams. Transactions of the Society for Computer Simulation 8(1): 1–41.
Roberts, C.A., and Y.M. Dessouky. 1998. An overview of object-oriented simulation.
Simulation 70(6): 359–368.
Robinson, S. 2002. General concepts of quality for discrete-event simulation. European Journal of Operational Research 138: 103–117.
Robinson, S. 2006. Issues in conceptual modelling for simulation: Setting a research
agenda. In Proceedings of the Operational Research Society Simulation Workshop
(SW06), ed. J. Garnett, S. Brailsford, S. Robinson, and S. Taylor, 165–174.
Birmingham: Operational Research Society.
Robinson, S. 2008. Conceptual modelling for simulation Part I: Definition and requirements. Journal of the Operational Research Society 59: 278–290.
Schruben, L.W. 1983. Simulation modeling with event graphs. Communications of the
ACM 26(11): 957–963.
Shannon, R.E. 1975. Systems Simulation: The Art and Science. Englewood Cliffs, NJ:
Prentice Hall.
Taylor, S.J.E., and S. Robinson. 2006. So where to next? A survey of the future for discrete-event simulation. Journal of Simulation 1(1): 1–6.
Valentin, E.C., and A. Verbraeck. 2005. Requirements for domain specific discrete
event simulation environments. In Proceedings of the 2005 Winter Simulation
Conference, ed. M.E. Kuhl, N.M. Steiger, F.B. Armstrong, and J.A. Joines, 654–663.
Piscataway, NJ: IEEE.
Van der Vorst, J.G.A.J., Tromp, S., and D.J. van der Zee. 2005. A simulation environment
for the redesign of food supply chain networks: Integrating quality and logistics
modeling. In Proceedings of the 2005 Winter Simulation Conference, ed. M.E. Kuhl,
N.M. Steiger, F.B. Armstrong, and J.A. Joines, 1658–1667. Piscataway, NJ: IEEE.
van der Zee, D.J. 1997. Simulation as a tool for logistics management, PhD thesis,
University of Twente, The Netherlands.
van der Zee, D.J. 2006a. Modeling decision making and control in manufacturing simulation. International Journal of Production Economics 100(1): 155–167.
van der Zee, D.J. 2006b. Building communicative models: A job oriented approach
to manufacturing simulation. In Proceedings of the Operational Research Society
Simulation Workshop (SW06), ed. J. Garnett, S. Brailsford, S. Robinson, and S.
Taylor, 185–194. Birmingham: Operational Research Society.
van der Zee, D.J. 2009. Building insightful simulation models using formal approaches:
A case study on Petri Nets. In Proceedings of the 2009 Winter Simulation Conference,
ed. M.D. Rossetti, R.R. Hill, B. Johansson, A. Dunkin, and R.G. Ingalls, 886–898.
Piscataway, NJ: IEEE.
van der Zee, D.J., and J. Slomp. 2005. Simulation and gaming as a support tool for
lean manufacturing systems: A case example from industry. In Proceedings of the
2005 Winter Simulation Conference, ed. M.E. Kuhl, N.M. Steiger, F.B. Armstrong,
and J.A. Joines, 2304–2313. Piscataway, NJ: IEEE.
van der Zee, D.J., and J. Slomp. 2009. Simulation as a tool for gaming and training in operations management: A case study. Journal of Simulation 3(1): 17–28.
van der Zee, D.J., Pool, A., and J. Wijngaard. 2008. Lean engineering for planning
systems redesign: Staff participation by simulation. In Proceedings of the 2008
Winter Simulation Conference, ed. S.J. Mason, R.R. Hill, L. Moench, and O. Rose,
722–730. Piscataway, NJ: IEEE.
van der Zee, D.J., and J.G.A.J. van der Vorst. 2005. A modeling framework for supply chain simulation: Opportunities for improved decision-making. Decision Sciences 36(1): 65–95.
Womack, K., Jones, D., and D. Roos. 1990. The Machine that Changed the World. Oxford:
Maxwell Macmillan International.
Zeigler, B.P. 1976. Theory of Modelling and Simulation. New York: Wiley.
Zeigler, B.P. 1990. Object-Oriented Simulation with Hierarchical, Modular Models,
Intelligent Agents and Endomorphic Systems. London: Academic Press.
6
The ABCmod Conceptual
Modeling Framework
Contents
6.1 Introduction................................................................................................. 134
6.2 Overview and Related Work..................................................................... 135
6.3 Constituents of the ABCmod Framework............................................... 138
6.3.1 Overview.......................................................................................... 138
6.3.2 Exploring Structural and Behavioral Requirements................. 138
6.3.3 Model Structure.............................................................................. 143
6.3.3.1 Entity Structures and Entities........................................ 143
6.3.3.2 Identifiers for Entity Structures and Entities............... 145
6.3.3.3 Attributes.......................................................................... 146
6.3.3.4 State Variables................................................................... 149
6.3.4 Model Behavior............................................................................... 150
6.3.4.1 Activity Constructs.......................................................... 150
6.3.4.2 Action Constructs............................................................ 155
6.3.5 Input.................................................................................................. 157
6.3.6 Output.............................................................................................. 159
6.3.7 Data Modules.................................................................................. 162
6.3.8 Standard Modules and User-Defined Modules......................... 162
6.4 Methodology for Developing an ABCmod Conceptual Model........... 164
6.5 Example Project: The Bigtown Garage.................................................... 165
6.5.1 SUI Key Features............................................................................. 166
6.5.1.1 SUI Overview................................................................... 166
6.5.1.2 General Project Goals...................................................... 166
6.5.1.3 SUI Details........................................................................ 166
6.5.1.4 Detailed Goals and Output............................................ 167
6.5.2 ABCmod Conceptual Model......................................................... 168
6.5.2.1 High-Level Conceptual Model....................................... 168
6.5.2.2 Detailed Conceptual Model........................................... 170
6.6 Conclusions.................................................................................................. 177
References.............................................................................................................. 178
6.1 Introduction
The development of a meaningful conceptual model is an essential phase
for the successful completion of any modeling and simulation project. Such
a model serves as a crucial bridge between the generalities of the project
description and the precision required for the development of the simulation
program that ultimately generates the data that is required for resolving the
project goals. A conceptual model is a careful blending of abstraction and
pertinent detail.
In the realm of continuous time dynamic systems, conceptual model devel-
opment typically relies on the language of differential equations, which is
usually colored by the terminology that is specific to the domain in which the
underlying dynamic system is embedded (e.g., engineering, thermodynamics,
aerodynamics, etc.). However, when the system under investigation (SUI) falls in the realm of discrete-event dynamic systems (DEDS), there is, regrettably, no equivalent language that can adequately characterize behavior, because of the diversity and complexity that pervade this domain. The most straightforward means for conceptual modeling is therefore absent. The typical consequence, unfortunately, is a leap directly into the intricacies of some computer
programming environment with the unfortunate result that the program dis-
places the model as the object of discourse. Essential features of the model
quickly become obscured by the intricacies of the programming environment.
Furthermore, the resulting artefact (i.e., the simulation program) has minimal
value if a change in the programming environment becomes necessary.
In this chapter we outline the ABCmod conceptual modeling framework
(Activity-Based Conceptual modeling), which is an environment for devel-
oping conceptual models for modeling and simulation projects in the DEDS
domain. Its model building artefacts fall into two categories; namely, entity
structures and behavior constructs. These relate, respectively, to the structural
and the behavioral facets of the SUI. Care has been taken to ensure that all
aspects of the modeling requirements are included in a consistent and trans-
parent manner. In addition to structure and behavior, the framework includes
a consistent means for characterizing the inputs and the outputs of the SUI that
have relevance to the project goals. The conceptual modeling process within
this framework is guided by an underlying discipline but the overall thrust
is one of informality and intuitive appeal. The constituents of the framework
can be easily extended on an ad hoc basis when specialized needs arise.
The underlying concepts of the ABCmod framework have been continuously evolving, and earlier versions of this environment have been presented in the literature (Arbez and Birta 2007, Birta and Arbez 2007). The presentation
in this chapter incorporates several important refinements. Included here is
a clearer and more coherent separation between structural and behavioral
aspects of the model, as well as an approach for presenting the model at
both a high level of abstraction in addition to a detailed level. The latter
We refer to this view of an activity as the inclusive view. The activity construct
in the ABCmod framework encapsulates the inclusive view. This perspective
The inclusive activity notion has not been especially useful from the point of
view of simulation engine design where the focus is on logic flow, specifically
the management of lists of events that must take place in a correct temporal
sequence. Strategies have, nevertheless, evolved from this underlying notion.
The two most common are the activity-scanning approach (often called the
two-phase approach [Buxton and Laski 1962]) and an extension called the
three-phase approach (Tocher 1963). In both these approaches the four facets
of the inclusive view of an activity are implicitly recognized but are sepa-
rated and reconstructed into alternate constructs that are more useful from
a software perspective. Nevertheless the word “activity” is usually retained
but its meaning is often unclear and/or curious (e.g., it is not uncommon to read that “an activity is an event that …”).
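A schematic rendering of the three-phase logic may clarify how the A (clock advance), B (bound events), and C (conditional activities) phases interact; this sketch is ours, for illustration only, and is not the ABCmod machinery:

```python
import heapq
import itertools

def three_phase(initial_events, conditional_activities, horizon):
    """Run bound events (time, action) in temporal order and scan
    conditional activities (can_start, start) after each clock advance,
    until the event list is empty or the horizon is passed."""
    counter = itertools.count()                      # tie-breaker for equal times
    future = [(t, next(counter), act) for t, act in initial_events]
    heapq.heapify(future)
    clock = 0.0
    while future:
        clock, _, action = heapq.heappop(future)     # A phase: advance the clock
        if clock > horizon:
            break
        action(clock)                                # B phase: execute the bound event
        changed = True
        while changed:                               # C phase: scan conditional
            changed = False                          # activities to a fixed point
            for can_start, start in conditional_activities:
                if can_start():
                    start(clock)
                    changed = True
    return clock

log = []
events = [(1.0, lambda t: log.append(("arrive", t))),
          (3.0, lambda t: log.append(("depart", t)))]
conditions = [(lambda: log == [("arrive", 1.0)],
               lambda t: log.append(("serve", t)))]
three_phase(events, conditions, horizon=10.0)
print(log)  # [('arrive', 1.0), ('serve', 1.0), ('depart', 3.0)]
```

The separation into a time-advance phase, unconditional (bound) events, and condition-tested activities is exactly the reconstruction of the inclusive activity's facets that makes the approach convenient for simulation software.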
We note finally that a correspondence might be assumed between our
behavior diagram (see (iii) above) and the Activity Cycle Diagram (ACD)
that can be found in the modeling and simulation literature (e.g., Kreutzer
1986, Pidd 2004a). Both of these diagrams are formed from a collection of
life-cycle diagrams that are specific to an entity class. A simple example of
an ABCmod life-cycle diagram is given in Figure 6.1, which is intended to
show that an entity associated with this life-cycle diagram could flow either
to activity Act2 or activity Act3 upon completion of its engagement in activ-
ity Act1.
The rectangles in our life-cycle diagram represent activities (more cor-
rectly activity instances) as per the inclusive view outlined earlier. In the
ACD context, rectangles are often called “active states” (Kreutzer 1986, Pidd,
2004a) and at best (depending on the author), they encompass only parts (b)
and (c) of the inclusive activity’s four constituents. The circle in Figure 6.1
is intended simply to represent a delay (of uncertain length) encountered
by an entity instance that arises because the conditions for initialization of
a subsequent activity instance in which it will be engaged may not yet be
present. In the ACD context, the circle (usually called a “dead state” [Pidd
2004a]) often corresponds to a queue. Furthermore, there is frequently an
implicit suggestion that the ACD reflects structural properties of the SUI.
There is no structural implication associated with the ABCmod behavior
diagram.
Figure 6.1
ABCmod life-cycle diagram. (Activity Act1 is followed, after a possible delay, by either activity Act2 or activity Act3.)
Figure 6.2
A schematic view of the conceptual model for department store shoppers. (The figure shows customers moving among three departments, each with a merchandise area [resource group], a customer line [queue], and a service desk [resource].) (Based on Birta, L.G., and Arbez, G., Modeling and Simulation: Exploring Dynamic System Behavior, Springer, London, Fig. 4.1, p. 99, 2007. With kind permission of Springer Science and Business Media.)
ii. The service activity carries out a purposeful task and has duration;
i.e., it extends over an interval of time.
iii. Changes take place when the service function is completed (e.g., at time A3 = C2 the number of shoppers in merchandise Area 3
Figure 6.3
Behavior of three department store shoppers. (The figure shows timelines for Customers A, B, and C, each alternating shopping in merchandise areas, waiting in queues, and service at desks, marked by time points A0–A5, B0–B7, and C0–C3, with inter-arrival times between customers.) (Based on Birta, L.G., and Arbez, G., Modeling and Simulation: Exploring Dynamic System Behavior, Springer, London, Fig. 4.2, p. 107, 2007. With kind permission of Springer Science and Business Media.)
These features are directly reflected in one of the main ABCmod constructs
used to characterize behavior. This will become apparent in the discussion
of Section 6.3.4 below.
From the perspective of the requirements of a modeling and simulation
project, the description given above for the department store shoppers is
incomplete in several respects. Many details need to be provided; for exam-
ple, how is the set of merchandise areas that a particular customer visits
selected? What is the order of the visitations? And how many servers are
assigned to the service desks? Can a particular customer balk; i.e., not make
any purchase at one or more of the assigned merchandise areas and if so,
then under what circumstances? The information for dealing with these
questions is not provided in the descriptive fragment that is given but would
most certainly be necessary before a meaningful conceptual model could be
formulated. Indeed one of the important functions of the conceptual model-
ing process is to reveal the absence of such essential details.
Likewise several data models need to be determined. Included here would
be the characterization of customer arrival rates and service times at the
service desks, allocation of the shopping areas to be visited by the arriving
customers and the characterization of the duration of the browsing phase at
each merchandise area, etc. It is especially important to observe that these
various data models will provide the basis for generating events that give
rise to change. For example, the event associated with the end of a particular
customer’s browsing phase will generally (but not necessarily) result in that
customer’s relocation into the queue associated with the service desk of that
service area.
The intent of the discussion in this section has been to explore some of
the important facets of the modeling process within the DEDS domain, at
least insofar as they are reflected in the particular problem context that was
considered. This overview will serve as a foundation for the discussion that
follows.
entity has its origins in the more fundamental notion of an entity structure.
In this section we outline the relationship between these two notions and
explore their important features. A particularly significant outcome is insight
that emerges about the state variables of an ABCmod conceptual model.
Each of the entity structures within an ABCmod conceptual model serves
as a specification for one or more entities. Such a specification is an m-tuple
of attribute names together with a description of each attribute. An entity is
a named m-tuple of values where the name is derived from the underlying
entity structure and the values are assignments to the attributes of that entity
structure. Such an entity is said to be derived from the entity structure.
It follows then that one of the important initial steps in the development of
a conceptual model for any particular modeling and simulation project is the
identification of an appropriate collection of such entity structures; i.e., one
that accommodates the modeling requirements of the project. This collection,
in effect, defines the structure of the conceptual model being constructed.
As will become apparent in Section 6.3.4, the entities that are derived from
these entity structures are fundamental in behavior specification.
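The relationship between an entity structure (an m-tuple of attribute names) and the entities derived from it (named m-tuples of values) can be sketched concretely; the attribute names below are illustrative assumptions, not part of ABCmod:

```python
from collections import namedtuple

# The entity structure: an m-tuple of attribute names (here m = 2).
Tugboat = namedtuple("Tugboat", ["status", "location"])

# Entities derived from the structure: named m-tuples of attribute values.
tug0 = Tugboat(status="idle", location="berth-1")
tug1 = tug0._replace(status="towing")    # a state change yields new values

print(tug0.status, tug1.status)  # idle towing
```

The structure fixes which attributes exist; each derived entity is simply one assignment of values to those attributes.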
Each entity structure has two properties, which are called role and scope.
The notion of role is intended simply to provide a suggestive (i.e., intuitive)
link between the features of the SUI and the conceptual model building
environment provided by the ABCmod framework. The value assigned to
role reflects the model builder’s view of the entity structure in question, or
more correctly, the entity (or entities) that are derived from that entity struc-
ture. There are four basic alternatives (i.e., values for role) that align with a
wide variety of circumstances; namely the following:
• Resource: the entity provides a service
• Consumer: the entity seeks the services provided by Resource entities
• Queue: the entity maintains an ordered collection of other entities
• Group: the entity maintains an unordered collection of other entities
There is no reason to believe, however, that these four alternatives will nec-
essarily encompass all possible circumstances. Note, for example, that it is
often the case that an entity structure’s role may exhibit duality.
The ABCmod Conceptual Modeling Framework 145
The identifier for an entity structure appends type information to a name that is meaningful to the model builder. The general format for this identifier is:
Type: Name
For example, the entity structure identifier: “Resource Set[2]: Tugboat” would
indicate a resource entity structure called Tugboat from which two entities are
derived. Alternately, “Resource Unary: Tugboat” indicates a Resource entity
structure (called Tugboat) from which a single entity is derived. Continuing
with this example, we shall frequently use phrases such as “a Tugboat entity”
to imply an entity derived from an entity structure called Tugboat. Note
however that this reference does not reveal the role or scope of the underlying
entity structure.
The identifier for an entity has a format that reflects the properties of the
underlying entity structure. For the case where scope = Unary the unique
entity derived from the entity structure has the identifier X.Name where X is
one of R, C, Q, G (or some combination of these alternatives) depending on
the value of role; i.e., X = R if role = Resource, X = C if role = Consumer, X = RG
if role = Resource Group, etc. and Name is the name assigned to the underly-
ing entity structure. When the underlying entity structure has scope = Set[N],
we use X.Name[j] where j (0 ≤ j ≤ N−1) designates the jth entity derived from
the entity structure. When scope = Class, iX.Name is simply a reference to
some particular instance of the entity structure that is relevant to the context
under consideration. It does not serve as a unique identifier.
We note finally that within the ABCmod framework entity identifiers are
regarded as having global scope. This means that they can be referenced
from all behavior construct instances.
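The identifier conventions outlined above are mechanical enough to express in code. The sketch below assumes the role-to-prefix mapping given in the text (R, C, Q, G, or a two-letter combination such as RG); the function name is illustrative.

```python
# Sketch of the ABCmod entity-identifier convention described above.
def entity_identifier(prefix, name, scope="Unary", index=None):
    if scope == "Unary":
        return f"{prefix}.{name}"            # e.g., R.Tugboat
    if scope.startswith("Set"):
        return f"{prefix}.{name}[{index}]"   # e.g., R.Tugboat[1]
    if scope == "Class":
        # "i" marks a reference to some instance; not a unique identifier.
        return f"i{prefix}.{name}"           # e.g., iC.Customer
    raise ValueError(f"unknown scope: {scope}")
```

For example, `entity_identifier("R", "Tugboat", "Set[2]", 1)` yields the identifier of the second entity derived from a Resource Set[2] entity structure called Tugboat.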
6.3.3.3 Attributes
The identification of appropriate attributes for an entity is governed to a
large extent by the requirements that emerge in the process of characterizing
behavior. This characterization, within the ABCmod framework, is carried
out using a collection of “behavior constructs” that react to and manipulate entities. Inasmuch as entities reflect attribute values, it follows that the selection of the attributes themselves (in the formulation of the underlying entity structures) is the fundamental issue. Some important insight about the selection of appropriate attributes can be obtained by examining typical attribute requirements for several entity categories.
* In section 6.3.3.1 it was pointed out that there are situations where role may have either sequential or simultaneous duality. The composite value indicated here accommodates such a possibility. In principle, the form could have three components and the intent would be analogous.
We begin with an examination of consumer entity instances (cei’s); i.e., enti-
ties derived from an entity structure with role = Consumer and scope = Class.
In many circumstances, such cei’s can be viewed as flowing among the vari-
ous aggregate entities (Queue entities and Group entities) and the Resource
entities that exist within the model. An essential requirement therefore is to
track both the existence and the status of these entities to ensure that they
can be processed correctly by the rules that govern the model’s behavior. In
addition, there may be a particular trail of data produced by the cei’s that
is relevant to the output requirements that are implicit in the project goals.
These various requirements suggest typical attributes for entity structures
having scope = Class.
For example, the cei’s derived from a particular entity structure may have
properties or features that have direct relevance to the manner in which they
are treated by the rules of behavior. In this regard a possible attribute for
the entity structure could be “Size,” which may have one of three values
(SMALL, MEDIUM, or LARGE) or alternately, “Priority,” which may have
one of two values (HIGH or LOW).
Observe also that output requirements arising from the project goals often require data about the way that cei’s have progressed through the model. Frequently this requirement is for some type of elapsed
time measurement. For example, it may be required to determine the aver-
age time spent waiting for service at a particular resource entity by cei’s that
utilize that resource. An attribute introduced for this purpose could function
as a time stamp storing the value of time, t, when the waiting period begins.
A data value placed in a prescribed data set would then be computed as the
difference between the value of time when the waiting period ends and the
time stamp.
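The time-stamp mechanism just described can be sketched as follows. The attribute name StartWait and the list standing in for the prescribed data set are illustrative, not ABCmod notation.

```python
# Sketch of the elapsed-time measurement described above: a time-stamp
# attribute records when waiting begins, and the difference is placed in
# the prescribed data set when waiting ends.
waiting_times = []            # stands in for the prescribed data set

def begin_wait(cei, t):
    cei["StartWait"] = t      # time-stamp attribute (illustrative name)

def end_wait(cei, t):
    waiting_times.append(t - cei["StartWait"])

cei = {}
begin_wait(cei, 4.0)          # waiting period begins at t = 4.0
end_wait(cei, 9.5)            # waiting period ends at t = 9.5
```

The average waiting time required by the project goals is then simply the mean of the collected values.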
As previously suggested, a perspective that is frequently appropriate is
one where cei’s flow from Resource entity to Resource entity accessing the
services that are provided by them. At any particular point in time, however,
access to a particular Resource entity may not be possible because it is already
engaged (busy) or is otherwise not available (e.g., out of service because of a
temporary failure). Such circumstances are normally handled by connecting
the entity to an aggregate entity that is associated with the Resource entity
where they can wait until access to the Resource entity becomes possible.
The most common aggregate entity is a Queue entity (i.e., an entity derived
from an entity structure for which role = Queue). Connecting a cei to a Queue
entity corresponds to placing the cei in that Queue entity. From this observa-
tion it is reasonable to suggest two particular attributes for any Queue entity
structure within the model; namely, List and N. Here List serves to store the
cei’s that are enqueued in a Queue entity derived from that Queue entity
structure and N is the number of entries in that list.
It needs to be stressed that the above selection of attributes for character-
izing a Queue entity structure is intended simply to be suggestive and is not
necessarily adequate for all situations. In some cases, for example, it may
be appropriate to include an attribute that permits referencing the specific
Resource entity with which an entity (or entities) derived from the Queue
entity structure are associated.
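The List and N attributes suggested above for a Queue entity structure can be sketched as follows, assuming the typical first-in, first-out discipline; the class is illustrative only.

```python
# Sketch of a Queue entity with the attributes List and N described above.
class QueueEntity:
    def __init__(self):
        self.List = []        # the enqueued consumer entity instances
        self.N = 0            # the number of entries in List

    def insert(self, cei):
        self.List.append(cei)
        self.N += 1

    def remove(self):
        cei = self.List.pop(0)   # FIFO: remove from the front
        self.N -= 1
        return cei

q = QueueEntity()
q.insert("cei-1")
q.insert("cei-2")
first = q.remove()
```

Keeping N as an explicit attribute reflects the fact that behavior rules (e.g., preconditions) frequently test the queue length directly.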
The characterization of a Group entity structure is similar to that of a
Queue entity structure but there is an important difference. Consumer entity
instances are often placed into a Group entity as in the case of a Queue entity; however, there is no intrinsic ordering discipline. On the basis of the observations above, the attributes for a Group entity structure could reasonably
include List and N where List is the list of the cei’s connected to the Group
entity and N is the number of entries in that list. In some situations it may
be useful to include an attribute that allows some variation in the capacity
of the Group entity. This is very much context dependent and provides a
further illustration of the need to tailor the characterizing attributes of entity
structures to the specific requirements of a project.
Consider now a Resource entity. One perspective that could be taken is to
regard a cei that is being serviced by a Resource entity as being incorporated
into it. To support such a perspective the underlying Resource entity struc-
ture would have to have an attribute for this purpose (possibly called: Client).
In many circumstances it is relevant to have an attribute that reflects the sta-
tus of an entity that is derived from an underlying Resource entity structure.
Such an attribute might, for example, be called Busy where the implication is
that the assigned binary value indicates whether or not the Resource entity is
busy, i.e., is carrying out its intended function. When the status of a Resource
entity may assume more than two values, it may be convenient to introduce
an attribute called Status that can acquire these multiple values. For example,
Status could assume the values IDLE, BUSY, or BROKEN.
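A Resource entity with the Client and Status attributes described above can be sketched as follows; the class and method names are illustrative.

```python
# Sketch of a Resource entity whose status can take more than two values,
# together with the Client attribute for the cei incorporated into it.
IDLE, BUSY, BROKEN = "IDLE", "BUSY", "BROKEN"

class ResourceEntity:
    def __init__(self):
        self.Status = IDLE
        self.Client = None    # the cei currently being serviced, if any

    def start_service(self, cei):
        self.Client = cei
        self.Status = BUSY

    def end_service(self):
        cei, self.Client, self.Status = self.Client, None, IDLE
        return cei

r = ResourceEntity()
r.start_service("cei-7")
```

When only a busy/idle distinction is needed, the Status attribute collapses to the binary Busy attribute mentioned above.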
A tabular format is used for the specification of all entity structures in an
ABCmod conceptual model. The template for this specification is given in
Table 6.1 where Type is: {role} {scope} as outlined earlier.
As will become apparent in section 6.3.4, the behavior constructs that cap-
ture the behavior of the SUI react to and modify the attribute values that are
encapsulated in entities. A means for referencing these values is therefore
essential. Our convention in this regard is closely related to the convention
described above for identifying entity structures. In particular, the conven-
tion endeavors to clearly reveal the entity structure from which the entity in
question is derived. Consider an entity structure with scope = Class. By our
previously outlined convention, the identifier for an entity instance derived
from this entity structure has the generic form:
iX.Name
where X is the value of role and is one of (R, C, Q, G, YZ), where each of Y and Z can assume one of (R, C, Q, G) and Y ≠ Z. If Attr is an attribute of this entity, then we use

iX.Name.Attr

to reference its value. Similarly, when the underlying entity structure has scope = Set[N], the corresponding references are

X.Name[j]
X.Name[j].Attr

Table 6.1
Template for Specifying an Entity Structure

Type: Name
A description of the entity structure called Name

Attributes        Description
AttributeName1    Description of the attribute called AttributeName1
AttributeName2    Description of the attribute called AttributeName2
.                 .
.                 .
AttributeNamen    Description of the attribute called AttributeNamen
Table 6.2
Template for an Activity
Activity: Name
A description of the Activity called Name
Precondition Boolean expression that specifies the condition for initiation
Event SCS associated with initiation
Duration The duration (typically acquired from a Data Module)
Event SCS associated with termination
Table 6.3
Template for the Triggered Activity
Triggered Activity: Name
A description of the Triggered Activity called Name
Event SCS associated with initiation
Duration The duration (typically acquired from a Data Module)
Event SCS associated with termination
The initiation of one unit of behavior within the SUI is usually unrelated to the completion of some other unit, but there are circumstances when it may be closely related. One such situation occurs when one behavior unit directly follows upon completion of another without the need to “seize” a further resource. Our notion of a Triggered Activity provides the means for handling such situations.
As an example, consider a port where a tugboat is required to move a
freighter from the harbor entrance to an available berth where a loading
(or unloading) operation can immediately begin. Here the berthing and the
loading operations each map onto activity constructs but the latter is distinc-
tive because the required resource (i.e., the berth) is already available when
the berthing is completed and hence the loading can immediately begin. It
is because of this absence of a precondition that the loading operation maps
onto a Triggered Activity in our ABCmod framework.
Triggered Activity: The distinguishing feature of a Triggered Activity is that
its initiation is not established by a precondition but rather by an explicit
reference to it within the terminating event of some other activity construct.
Such a reference has the form: TA.Name where the “TA” prefix emphasizes
that Name is a reference to a Triggered Activity. Note that this shows that an
SCS can be more than simply a collection of specifications for state variable
changes inasmuch as it can also include a reference to a particular Triggered
Activity, which, in turn, serves to initiate an instance of that construct. The
template for the Triggered Activity is given in Table 6.3.
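The TA.Name mechanism can be sketched as a terminating event whose SCS both changes state and initiates the named Triggered Activity. The sketch below uses the port example from the text; the function names and log are illustrative.

```python
# Sketch of a Triggered Activity reference: the terminating SCS of the
# Berthing activity both makes ordinary state changes and initiates
# Loading (the TA.Loading reference), which has no precondition of its own.
log = []

def ta_loading(freighter):
    # Initiation of an instance of the Triggered Activity called Loading.
    log.append(f"Loading starts for {freighter}")

def berthing_terminating_scs(freighter):
    log.append(f"{freighter} is berthed")   # ordinary state change
    ta_loading(freighter)                   # the TA.Loading reference

berthing_terminating_scs("Freighter-3")
```

This illustrates the point made above: an SCS can include a reference that initiates a Triggered Activity instance, not merely assignments to state variables.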
We have previously indicated that an activity construct encapsulates a unit
of behavior within the SUI. The flow of this behavior, in the context of a
specific activity instance may, however, be subjected to an intervention that
disrupts the manner in which behavior unfolds. Such an intervention can
have a variety of possible effects; for example, (a) the initial (tentative) dura-
tion of the activity instance may be altered, (b) the duration may no longer
map onto a continuous time interval but may instead map onto two or more
disjoint intervals, possibly in combination with (a), (c) the behavior intrinsic
to the activity instance may be stopped and may never be resumed.
Two types of intervention are possible; namely, preemption and
interruption. We examine each of these in turn. Preemption typically occurs
in a situation where two (or more) activity instances require the same resource
that cannot be shared. Consider for example the circumstance where the
initiation of one activity instance called ActP disrupts the flow of another activity instance called ActQ: a resource that is required by both activity instances must be taken from ActQ and reassigned to ActP because ActP has higher priority access to the resource. The ABCmod presentation
of such a circumstance requires that ActQ be formulated as an Extended
Activity (see Table 6.4) with a preemption subsegment within its Duration
segment. A directive of the form “PRE.ActQ” in the starting SCS of ActP
initiates the preemption. This directive links directly to the preemption sub-
segment of ActQ where the consequences of the preemption are specified.
In other words, an activity instance can disrupt the duration of some lower
priority instance that is currently accessing the resource. There is however
an implication here that some entity (e.g., a consumer entity instance) that is
connected to the resource will be displaced. When this occurs, the comple-
tion of the service function for the displaced entity is suspended and con-
sequently the duration of the activity instance, from the perspective of the
displaced entity, becomes distributed over at least two disjoint time inter-
vals, or in the extreme case may never even be completed.
An interruption accommodates the impact that changes in the value of an
input variable can have on one or more of the activity instances within the
model. For example, in response to a change in value of an input variable (see Section 6.3.5), an activity instance may undergo a change in the manner in which it completes the task that was initially undertaken. An interruption can be treated as an event inasmuch as it is associated with a set of changes as reflected in an SCS within an interruption subsegment. The subsegment also provides the means for formulating the condition that defines the occurrence of the interruption.

Table 6.4
Template for the Extended Activity

Extended Activity: Name
A description of the Extended Activity called Name
Precondition     Boolean expression that specifies the conditions for initiation
Event            SCS associated with initiation
Duration         The duration (typically acquired from an attribute)
Preemption
  Event          SCS associated with preemption
Interruption
  Precondition   Boolean expression that specifies the conditions under which an interruption occurs
  Event          SCS associated with interruption
Event            SCS associated with termination
To accommodate the requirements involved in handling an intervention,
a more general activity construct is necessary. This construct is called an
Extended Activity.
Extended Activity: As its name suggests, this construct can accommodate
more general behavior and is the most comprehensive of the activity con-
structs. Its template is given in Table 6.4.
The notion of interruption is equally relevant to a Triggered Activity. This
gives rise to a generalization of the Triggered Activity construct that we call
an Extended Triggered Activity.
Extended Triggered Activity: Like its basic counterpart, the distinguishing
feature of an Extended Triggered Activity is that its initiation is not estab-
lished by a precondition but rather by an explicit reference to it within the
terminating event of some activity construct. The template for an Extended
Triggered Activity is given in Table 6.5.
Table 6.6 summarizes several important features of the various activity
constructs.
Table 6.5
Template for the Extended Triggered Activity
Extended Triggered Activity: Name
A description of the Extended Triggered Activity called Name
Event SCS associated with initiation
Duration The duration (typically acquired from an attribute)
Preemption
Event SCS associated with preemption
Interruption
Precondition Boolean expression that specifies the conditions under
which an interruption occurs
Event SCS associated with interruption
Event SCS associated with termination
Table 6.6
Features of the Activity Constructs

Feature             Activity   Triggered   Extended   Extended
                               Activity    Activity   Triggered Activity
Precondition        Yes        No          Yes        No
Starting Event      Yes        Optional    Yes        Optional
Duration            Yes        Yes         Yes        Yes
Intervention        No         No          Yes        Yes
Terminating Event   Optional   Optional    Optional   Optional
Table 6.7
Template for the Conditional Action
Conditional Action: Name
A description of the Conditional Action called Name
Precondition Boolean expression that specifies the condition for initiation
Event The associated SCS
In contrast to an activity construct, an action construct has no duration, i.e., it unfolds at a single point in time. There are two direct consequences of this feature; namely, an action construct has a single SCS and the concept of instances of an action construct is not meaningful.
There are two types of action construct and they are called the Conditional
Action and the Scheduled Action. Since action constructs correspond to events,
a fundamental requirement is the characterization of the condition that
causes the occurrence of the underlying event. In the case of the Conditional
Action we retain a parallel with the activity constructs and refer to this char-
acterization as the precondition for the Conditional Action. The template for
the Conditional Action has the form shown in Table 6.7.
The Conditional Action is frequently used to accommodate a circumstance
where the current state of the model inhibits a particular state change that needs
to take place. In effect, the need for a delay of uncertain length is thus intro-
duced. In this circumstance the Conditional Action serves as a sentinel that
awaits the development of the conditions that permit the state change to occur.
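The sentinel behavior of a Conditional Action can be sketched as a precondition that is re-evaluated whenever the model state changes, with the associated SCS firing once the precondition becomes true. The state variables below are illustrative.

```python
# Sketch of a Conditional Action serving as a sentinel: its precondition
# is tested whenever the state changes; when true, the event's SCS fires.
state = {"queue_n": 0, "served": 0}

def precondition(s):
    return s["queue_n"] > 0          # e.g., a waiting cei exists

def event_scs(s):
    s["queue_n"] -= 1                # remove the cei from the queue
    s["served"] += 1                 # and record the state change

def on_state_change(s):              # the "sentinel" evaluation
    while precondition(s):
        event_scs(s)

state["queue_n"] = 2                 # some other construct enqueues two cei's
on_state_change(state)
```

The delay of uncertain length mentioned above corresponds to the interval during which the precondition remains false.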
The Scheduled Action corresponds to a scheduled event and hence its
occurrence is autonomous in the sense that it depends only on time, t, and is
independent of the model’s state. Often the event in question is reoccurring
and the requirement therefore is to characterize the points in time (the “time
set”) when the underlying event occurs. The template for the Scheduled
Action is shown in Table 6.8.
As will become apparent in the discussion of Section 6.3.5, the Scheduled
Action provides the means for handling the notion of input within the
ABCmod framework.
Table 6.8
Template for the Scheduled Action
Scheduled Action: Name
A description of the Scheduled Action called Name
TimeSet Characterization of the points in time where the
underlying event occurs
Event The associated SCS
6.3.5 Input
Our particular interest now is with characterizing input within the context of
formulating an ABCmod conceptual model. The perspective that we adopt is
that the notion of input in the DEDS domain has three constituents; namely,
the following:
(a) environmental input variables (e-inputs)
(b) independent input variables
(c) input entity streams
Any particular ABCmod conceptual model may have many inputs; however,
there is no requirement for representation from all of these categories.
Consider a variable, u, that represents an input from either category (a) or
(b). This variable is, in fact, a function of time; i.e., u = u(t) and the essential
information about it is normally provided by a sequence of ordered pairs of
the form < (tk, uk): k = 0, 1, 2, … >, where tk is a value of time and uk = u(tk) (we assume that ti < tj for i < j). Each of the time values, tk, in this sequence identifies a point in time where there is a noteworthy occurrence in the input, u (e.g., a change in value). We refer to this sequence as the characterizing sequence for u and denote it as CS[u]; i.e.,

CS[u] = < (tk, uk): k = 0, 1, 2, … > (6.1)
The specifications that allow the construction of CS[u] are part of the data
modeling task associated with model development. In this regard, however,
note that there are two separate sequences that can be associated with CS[u].
These are:
CSD[u] = < tk: k = 0, 1, 2, … > CSR[u] = < uk: k = 0, 1, 2, … > (6.2)
which we call, respectively, the domain sequence for u and the range sequence
for u. It is almost always true that the domain sequence for u has a stochastic
characterization; i.e., a stochastic data model. Generally, this implies that if tj
and tj + 1 = tj + ∆j are successive members of CSD[u], then the value of ∆j is pro-
vided by a stochastic model. The range sequence for u may or may not have
a stochastic characterization.
From the perspective of developing inputs we assume that the data mod-
eling task has been completed. This, in particular, means that valid mecha-
nisms for creating the domain sequence and the range sequence for each
input variable are available.
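Under the assumption of exponentially distributed gaps ∆j (a common but not universal data model), the construction of a domain sequence can be sketched as follows; the function name and parameters are illustrative.

```python
import random

# Sketch: build the first n terms of CSD[u] from stochastic gaps
# delta_j ~ Exponential(mean_gap), starting at t0 = 0.
def domain_sequence(n, mean_gap, seed=0):
    rng = random.Random(seed)       # seeded for reproducibility
    t, times = 0.0, []
    for _ in range(n):
        times.append(t)
        t += rng.expovariate(1.0 / mean_gap)   # the gap delta_j
    return times

csd = domain_sequence(5, mean_gap=2.0)
```

A range sequence CSR[u] would be generated analogously from whatever (possibly deterministic) data model applies to the values uk.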
In some circumstances the input variable u(t) being considered falls in the
class of piecewise constant (PWC) time functions. An example of this case
is shown in Figure 6.4. Here u(t) could represent the number of electricians,
at time t, included in the maintenance team of a large manufacturing plant
that operates on a 24-hour basis but with varying levels of production (and
hence varying requirements for electricians). The behavior of the model over
the interval [tj, tj + 1) likely depends directly on the value uj = u(tj) hence the
representation of u(t) as a PWC function is not only meaningful but is, in fact,
essential. The characterizing sequence for u(t) as shown in Figure 6.4 is:

CS[u] = < (t0, u0), (t1, u1), (t2, u2), …, (t6, u6) >
Observe also that with the interpretation given above this particular input is
somewhat distinctive inasmuch as neither its domain sequence nor its range
sequence will likely have a stochastic characterization.
Figure 6.4
A piecewise constant time function. (From Birta, L.G. and Arbez, G., Modeling and Simulation: Exploring Dynamic System Behavior, Springer, London, Fig. 4.3, p. 115, 2007. With kind permission of Springer Science and Business Media.)

As an alternate possibility consider a case where u(t) represents the number of units of a particular product P requested on orders received (at times tη, tη + 1, … tj …) by an Internet-based distributing company (η = 0 if the first order arrives at the left boundary of the observation interval, otherwise η = 1 where t1 > t0). The characterizing sequence would be written as:

CS[u] = < (tη, uη), (tη + 1, uη + 1), …, (tj, uj), … > (6.5)
Note, however, that only the specific values uη = u(tη), uη + 1 = u(tη + 1), … uj = u(tj)
are relevant. In other words, representation of this particular input as a PWC
time function is not appropriate because the value of u between order times
has no meaning. Note also that the data model for this input would need to
provide a specification for both the domain sequence CSD[u] of order times
and the range sequence CSR[u] of order values as shown in (6.5). Both would
likely be in terms of specific probability distribution functions.
Consider now a variable s = s(t) that represents an input from category (c);
i.e., an input entity stream. Recall that the entities in question here would
necessarily be instances of some particular entity structure.* The character-
izing sequence s can be written as:
CS[s] = < (tη, 1), (tη + 1, 1), (tη + 2, 1), …, (tj, 1), … > (6.6)
Here each value in the domain sequence < tη, tη + 1, tη + 2, … tj … > is the arrival
time of an instance of the entity structure in question. Each element of the
range sequence has a value of 1; i.e., s(tj) = 1 for all j because we generally
assume that arrivals occur one at a time. As above η = 0 if the first arrival
occurs at the left boundary of the observation interval, otherwise η = 1. The
domain sequence is constructed from the arrival process associated with the
entity structure in question.
All three categories of input have a characterizing sequence and hence a
domain sequence. The impact of inputs from each of the categories is cap-
tured in the ABCmod framework by a Scheduled Action whose time set is
the domain sequence. It should be emphasized that it is only in limited cir-
cumstances that the domain sequence is deterministic; generally the values
in the domain sequence evolve in a stochastic manner.
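A Scheduled Action whose time set is the (stochastically evolving) domain sequence of an input entity stream can be sketched with a simple event list. This is an illustrative sketch, assuming unit-mean exponential interarrival times; the names are not ABCmod notation.

```python
import heapq
import random

# Sketch of a Scheduled Action for an input entity stream: its time set is
# the domain sequence of arrival times, and its event creates one consumer
# entity instance per arrival (s(t_j) = 1).
rng = random.Random(1)
arrivals = []
event_list = []                      # (time, action) pairs

def schedule_next_arrival(t):
    heapq.heappush(event_list, (t, "arrival"))

def arrival_event(t):
    arrivals.append({"structure": "Customer", "arrivalTime": t})  # new cei
    schedule_next_arrival(t + rng.expovariate(1.0))  # next time-set point

schedule_next_arrival(0.0)           # first arrival at the left boundary
while event_list and len(arrivals) < 3:
    t, _ = heapq.heappop(event_list)
    arrival_event(t)
```

Each firing of the event both creates a cei and extends the domain sequence, which is how a stochastic time set is typically realized.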
The salient features of the inputs for any ABCmod conceptual model are
summarized in a template. The format of this template is shown in Table 6.9.
The general format of the associated Scheduled Actions is shown in Table 6.10.
6.3.6 Output
The output of a simulation experiment can be identified with the information
that is either explicitly or implicitly required for achieving the goals of the
* The notion of an input entity stream carries the implication of transient existence; hence the
entity structures that we associate with this notion always have scope =Class.
Table 6.9
Template for Inputs

Inputs
Variable   Description                                  Scheduled Action
e-Inputs
u(t)       Description of the input variable u(t)       Name of the associated Scheduled Action
Independent Variables
u(t)       Description of the input variable u(t)       Name of the associated Scheduled Action
Input Entity Streams
s(t)       Description of the input entity stream       Name of the associated Scheduled Action
           that the input variable s(t) represents
Table 6.10
Templates for the Scheduled Actions for Inputs
Scheduled Action: uName
TimeSet t = tk ∈ CSD [u] as defined by DM.uDomain
Event Typically the assignment to the variable u of the value that it acquires at time
t = tk ∈ CSD [u] as prescribed by CSR[u], which is provided by a designated
data module; e.g., DM.uRange.
(a) Case where the Scheduled Action corresponds to the e-input variable, u(t)
Table 6.11
Template for Summarizing Outputs
Outputs
Simple Scalar Output Variables (SSOVs)
Name Description
Y Description of the simple scalar output variable Y
Trajectory Sets
Name Description
TRJ[y] Description of the time variable y(t)
Sample Sets
Name Description
PHI[y] Description of the sample variable y whose values populate the sample set PHI[y]
Table 6.12
Template for Summarizing Data Modules

Data Modules
Name                         Description                  Data Model
ModuleName(parameter list)   Statement of the purpose     Details of the mechanism that is invoked
                             of the data module called    in order to generate the data values
                             ModuleName                   provided by the data module called
                                                          ModuleName. Typically involves sampling
                                                          values from one or more distributions.
Typically modules are needed to carry out specialized operations that are
distinctive to the specific conceptual model being developed. These can be
freely defined wherever necessary to augment the ABCmod framework and
ease the conceptual modeling task. They are called User-Defined Modules
and they are summarized in a table whose template is given in Table 6.13.
Table 6.13
Template for Summarizing User-Defined Modules
User-Defined Modules
Name Description
ModuleName(parameter list) Purpose of the user-defined module called ModuleName
Figure 6.5
Bigtown Garage schematic (showing the entrance, parking lot, holding areas, and service bays).
The garage operates three shifts: the day, the evening, and the night, and currently there are two mechanics working during
each of these shifts. Although the two extra service bays currently provide
some passive utility (see description below) they do offer the possibility for
increased throughput if additional mechanics were hired for one or more of the shifts.
Vehicles arrive for service either because of routine maintenance require-
ments or because they require repair due to mechanical failure. The vehicles
scheduled for routine maintenance on any particular day can be assumed to
arrive at the beginning of the day shift; i.e., 8:00 a.m. When a vehicle arrives,
a work order is filled out. This summarizes the nature of the service require-
ment. The time of the vehicle’s arrival at the garage is also noted because
vehicles are serviced in order of their arrival but with due recognition of the
priority of police vehicles. In the case where vehicles have the same arrival
time stamp, the vehicles are serviced according to the order in which the
work orders were filled out.
The priority given to police vehicles implies that no ancillary vehicle is
moved into a service bay from the parking lot until there is no remaining
police vehicle waiting to be serviced.
A police vehicle in the parking lot is moved into a service bay if either (a)
there is at least one mechanic who is idle, or (b) there is at least one mechanic
carrying out a servicing task on an ancillary vehicle; where (b) is applied
only when there are no idle mechanics.
In the case of (a), there is at least one unoccupied service bay and the police
vehicle is moved into one of them. Both the choice of the bay (if there is more
than one that is empty) and the allocated mechanic are random selections.
In the case of (b), work on an ancillary vehicle is stopped thereby releasing
a mechanic to work on the police vehicle. If there is an empty service bay,
the police vehicle is moved there and the freed mechanic moves to that bay
to begin the servicing work. If, on the other hand, there is no empty service
bay,* then the ancillary vehicle in the service bay of the freed mechanic is
moved into one of four holding areas within the garage thereby releasing
a service bay for the police vehicle. It may occur that a group of displaced
ancillary vehicles is thus created. Work on these vehicles is resumed and
completed (provided there are no waiting police vehicles).
* This will only occur when there are more than two mechanics working during a shift, which is a situation that arises when solution options are explored in the simulation study.
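The priority rule for police vehicles described above reduces to a small decision function. The sketch below captures only the choice between cases (a), (b), and waiting; bay and mechanic selection (which the text states are random) is omitted, and the names are illustrative.

```python
# Sketch of the police-vehicle priority rule: case (a) uses an idle
# mechanic; case (b) preempts work on an ancillary vehicle, and is applied
# only when no mechanic is idle.
def allocate_police_vehicle(idle_mechanics, mechanics_on_ancillary):
    if idle_mechanics > 0:
        return "CASE_A"      # idle mechanic: move PV into an empty bay
    if mechanics_on_ancillary > 0:
        return "CASE_B"      # stop work on an ancillary vehicle
    return "WAIT"            # all mechanics busy with police vehicles

decision = allocate_police_vehicle(idle_mechanics=0, mechanics_on_ancillary=1)
```

Note that "WAIT" corresponds to the police vehicle remaining in the parking lot until a mechanic becomes available.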
of service (again separately for each of the two categories of vehicle); (c) the
average number of busy mechanics.
The city manager intends to explore alternate scenarios that correspond
to a deteriorating police car fleet. Three scenarios are of particular interest.
They correspond to the cases where the distributions of interarrival time
for breakdown repair for police vehicles are scaled so that their mean val-
ues are decreased first by 20%, then by 40%, and finally by 60% from their
current operational values. The effect of increasing the number of mechan-
ics working at the garage to 3 and 4 is also of interest for each of the three
scenarios.
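The experiment grid implied by these scenarios (interarrival means at 100%, 80%, 60%, and 40% of their current values, crossed with 2, 3, or 4 mechanics) can be enumerated directly. MP1 below is a placeholder; the actual mean is project data marked "TBD" in the parameter tables.

```python
# Sketch of the scenario grid described above: four scalings of the
# police-vehicle interarrival mean crossed with three staffing levels.
MP1 = 1.0   # placeholder for the current day-shift mean (actual value TBD)

scenarios = [(MP1 * scale, mechanics)
             for scale in (1.0, 0.8, 0.6, 0.4)
             for mechanics in (2, 3, 4)]
```

Each pair would parameterize one set of simulation runs in the study.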
Figure 6.6
Bigtown Garage structural diagram (showing the resource group RG.Bays, the group G.Holding, the queues Q.ParkedPV and Q.ParkedAV, and the consumer entity instances iC.PoliceVeh and iC.AncillaryVeh).
Notes
• Mechanics are not explicitly modeled. Instead, RG.Bays will have
the attribute RG.Bays.freeMechanics to indicate the number of idle
mechanics (that is, mechanics that are not currently servicing a
vehicle). The number of mechanics present during each of the shifts
is a model parameter.
Figure 6.7
Bigtown Garage behavioral diagram: (a) ServicePV; (b) ServiceAV.
Structural Components
Consumer Class: PoliceVeh
The police vehicles that need servicing.
Attributes   Description
state   Set to SERVICING when the vehicle is being serviced, NOTSTARTED when servicing has not yet started.
serviceType   Indicates type of service required; values are: BR for breakdown repair and RM for routine maintenance.
arrivalTime   The time at which the police vehicle arrived at the garage.
Constants (Continued)
Name   Role   Value
MeanAV_Night   Mean interarrival time of ancillary vehicles requiring breakdown repair during the night shift.   TBD
MeanBR_ServiceTime   Mean service time for breakdown repair.   TBD
MeanRM_ServiceTime   Mean service time for routine maintenance.   TBD
Parameters
Name   Role   Value
MeanPV_Day   Mean interarrival time of police vehicles requiring breakdown repair during the day shift.   MP1, 0.8*MP1, 0.6*MP1, 0.4*MP1
MeanPV_Evening   Mean interarrival time of police vehicles requiring breakdown repair during the evening shift.   MP2, 0.8*MP2, 0.6*MP2, 0.4*MP2
MeanPV_Night   Mean interarrival time of police vehicles requiring breakdown repair during the night shift.   MP3, 0.8*MP3, 0.6*MP3, 0.4*MP3
NumMechanics   Number of mechanics working at the garage.   2, 3, 4
Data Modules
Name   Description   Data Model
NumRMVehicles()   Gives the number of vehicles that arrive for routine maintenance. A fraction (FractionPV) of this number are police vehicles.   Uniform(NumRMMin, NumRMMax)
InterArrivalPV_BR()   Gives the interarrival times of police vehicles arriving for breakdown repair.   If Shift = DAY: Exponential(MeanPV_Day); if Shift = EVENING: Exponential(MeanPV_Evening); if Shift = NIGHT: Exponential(MeanPV_Night)
InterArrivalAV_BR()   Gives the interarrival times of ancillary vehicles arriving for breakdown repair.   If Shift = DAY: Exponential(MeanAV_Day); if Shift = EVENING: Exponential(MeanAV_Evening); if Shift = NIGHT: Exponential(MeanAV_Night)
ServiceTime(serviceType)   Gives the time to service a vehicle according to the value of serviceType.   If serviceType = RM: Exponential(MeanRM_ServiceTime); if serviceType = BR: Exponential(MeanBR_ServiceTime)
RMArrivals()   Gives the arrival times of vehicles for routine maintenance.   Every 24 hours starting at t = 0; i.e., t = 24k, k = 0, 1, 2, …
ShiftChangeTimes()   Gives the points in time when a shift change occurs.   Every 8 hours starting at t = 8; i.e., t = 8k, k = 1, 2, 3, …
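Read as conditional sampling rules, the shift-dependent data modules above can be sketched as follows; the function names mirror the table, while the mean values are placeholders for constants the model lists as TBD.

```python
import random

# Placeholder means (hours); the corresponding constants are listed as
# TBD (with parameters MP1-MP3) in the conceptual model.
MEAN_PV = {"DAY": 4.0, "EVENING": 5.0, "NIGHT": 6.0}

def interarrival_pv_br(shift, rng=random):
    """InterArrivalPV_BR(): exponentially distributed interarrival time
    whose mean depends on the current value of Shift."""
    return rng.expovariate(1.0 / MEAN_PV[shift])

def rm_arrivals(k):
    """RMArrivals(): routine-maintenance batches every 24 h from t = 0."""
    return 24 * k  # t = 24k, k = 0, 1, 2, ...

def shift_change_times(k):
    """ShiftChangeTimes(): a shift change every 8 h starting at t = 8."""
    return 8 * k  # t = 8k, k = 1, 2, 3, ...

print(rm_arrivals(2), shift_change_times(3))  # 48 24
```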
Input Components
Inputs
Variable   Description   Scheduled Action
Independent Variables
Shift   Reflects the current shift; values are: DAY (day shift), EVENING (evening shift), or NIGHT (night shift).   ShiftChange
Input Entity Streams
RMInput   Vehicles requiring routine maintenance.   RMArr
BRInputPV   Police vehicles requiring breakdown repair.   BRArrPV
BRInputAV   Ancillary vehicles requiring breakdown repair.   BRArrAV
Output Components
Outputs
Trajectory Sets
Name Description
TRJ[NumBusyMechanics] NumBusyMechanics = NumMechanics − RG.Bays.freeMechanics
Sample Sets
Name   Description
PHI[WaitServiceAV]   Each value is the time spent by some ancillary vehicle waiting for service to begin.
PHI[WaitServicePV]   Each value is the time spent by some police vehicle waiting for service to begin.
PHI[TotalTimeAV]   Each value is the elapsed time from arrival to completion of service for some ancillary vehicle.
PHI[TotalTimePV]   Each value is the elapsed time from arrival to completion of service for some police vehicle.
Derived Scalar Output Variables (DSOVs)
Name   Description   Output Set Name   Operator
AvgWaitSrvAV   Average time spent by ancillary vehicles waiting for service to begin.   PHI[WaitServiceAV]   MEAN
AvgWaitSrvPV   Average time spent by police vehicles waiting for service to begin.   PHI[WaitServicePV]   MEAN
AvgTotalTimeAV   Average elapsed time from arrival to completion of service for ancillary vehicles.   PHI[TotalTimeAV]   MEAN
AvgTotalTimePV   Average elapsed time from arrival to completion of service for police vehicles.   PHI[TotalTimePV]   MEAN
AvgBusyMech   Average number of busy mechanics.   TRJ[NumBusyMechanics]   MEAN
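Note that the MEAN operator means different things for the two kinds of output set: for a sample set (PHI) it is an ordinary arithmetic mean over the recorded values, while for a trajectory set (TRJ) it is naturally read as a time-weighted average of a piecewise-constant trajectory. A sketch of both readings (the function names are illustrative, not ABCmod syntax):

```python
def sample_mean(values):
    """MEAN over a sample set such as PHI[WaitServicePV]."""
    return sum(values) / len(values)

def trajectory_mean(points, t_end):
    """MEAN over a trajectory set such as TRJ[NumBusyMechanics]:
    points is a list of (time, value) pairs, each value holding until
    the next recorded time; the result is the time-weighted average."""
    total = 0.0
    for (t0, v), (t1, _) in zip(points, points[1:] + [(t_end, None)]):
        total += v * (t1 - t0)
    return total / (t_end - points[0][0])

# Two mechanics busy for the first 5 hours of a 10-hour window, one after:
print(trajectory_mean([(0, 2), (5, 1)], 10))  # 1.5
```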
Behavioral Components
Time units: hours
Observation interval: t = 0 corresponds to 8:00 a.m.; steady state study, hence
right-hand boundary to be determined by experimentation.
User Defined Modules
Name   Description
GetAVBeingServiced(iC.AncillaryVeh)   Returns TRUE if an ancillary vehicle is being serviced in RG.Bays (i.e., iC.AncillaryVeh.state = SERVICING). Furthermore, sets iC.AncillaryVeh to reference an AncillaryVeh entity being serviced (random selection).
ServiceBayAvailablePV()   Returns TRUE if an additional police vehicle can be accommodated in RG.Bays. This occurs if either there is an idle mechanic (RG.Bays.freeMechanics ≠ 0) or there is an ancillary vehicle being serviced in RG.Bays. Note that a free mechanic implies a free bay.
AVMidService(iC.AncillaryVeh)   Returns TRUE if an ancillary vehicle is in mid-service in RG.Bays (i.e., iC.AncillaryVeh.state = MIDSERVICE). Furthermore, sets iC.AncillaryVeh to reference an AncillaryVeh entity in mid-service (random selection).
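The predicate logic of ServiceBayAvailablePV() can be sketched directly; the following is a hypothetical Python rendering (ABCmod itself is a notation, not executable code):

```python
def service_bay_available_pv(free_mechanics, ancillary_states):
    """Sketch of UM.ServiceBayAvailablePV(): a police vehicle can be
    accommodated if a mechanic is idle (a free mechanic implies a free
    bay), or if some ancillary vehicle currently being serviced can be
    preempted out of its bay."""
    return free_mechanics != 0 or "SERVICING" in ancillary_states

# No idle mechanic, but an ancillary vehicle in service can be preempted:
print(service_bay_available_pv(0, ["SERVICING", "MIDSERVICE"]))  # True
```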
Initialize
RG.Bays.freeMechanics ← NumMechanics
Shift ← DAY
Activity: ServicePV
Servicing a police vehicle.
Precondition UM.ServiceBayAvailablePV() AND Q.ParkedPV.n ≠ 0
Event iC.PoliceVeh ← SM.RemoveQue(Q.ParkedPV)
SM.Put(PHI[WaitServicePV], t − iC.PoliceVeh.arrivalTime)
IF RG.Bays.freeMechanics ≠ 0 // Free bay exists in RG.Bays
Decrement RG.Bays.freeMechanics
ELSE // Need to preempt service on ancillary vehicle
UM.GetAVBeingServiced(iC.AncillaryVeh)
PRE.ServiceAV(iC.AncillaryVeh)
(Continued)
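The event part of ServicePV combines an ordinary "seize" with a preemption branch. A rough transliteration of its steps into Python (the helper structures below stand in for ABCmod's SM, UM, and PRE operations and are not part of the framework):

```python
def service_pv_event(t, parked_pv, free_mechanics, serviced_av, wait_samples):
    """Sketch of the ServicePV event: remove the first waiting police
    vehicle, record its waiting time, then either claim a free mechanic
    (a free mechanic implies a free bay) or preempt service on an
    ancillary vehicle to free a bay."""
    vehicle = parked_pv.pop(0)                       # SM.RemoveQue(Q.ParkedPV)
    wait_samples.append(t - vehicle["arrivalTime"])  # SM.Put(PHI[WaitServicePV], ...)
    preempted = None
    if free_mechanics != 0:            # free bay exists in RG.Bays
        free_mechanics -= 1
    else:                              # preempt service on an ancillary vehicle
        preempted = serviced_av.pop()  # UM.GetAVBeingServiced + PRE.ServiceAV
    return vehicle, free_mechanics, preempted

waits = []
veh, free, pre = service_pv_event(10.0, [{"arrivalTime": 8.5}], 0, ["AV-1"], waits)
print(waits, free, pre)  # [1.5] 0 AV-1
```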
6.6 Conclusions
A meaningful conceptual model is essential for a successful simulation
study. The development process is driven by the goals that have been iden-
tified for the modeling and simulation project and focuses on capturing
the structural and behavioral features of the SUI that are relevant to the
achievement of those goals. Furthermore, the process itself serves as a vehicle that allows all project stakeholders to participate in the identification of
these structural and behavioral features. The model that evolves serves as
the blueprint for the development of the program code for carrying out the
simulation study. The ABCmod framework outlined in this chapter pro-
vides an environment designed specifically to facilitate the achievement
of these fundamental objectives of the conceptual modeling task. It has
been extensively used for several years in a senior undergraduate/junior
graduate course where students carried out group projects. The wide range of nontrivial student projects that have been completed provides a convincing body of evidence that the framework achieves its intended purpose very effectively.
Dealing with detail and complexity is an essential requirement of any conceptual modeling environment. This is accommodated in the ABCmod conceptual modeling framework through a two-stage hierarchical approach.
The initial high-level stage identifies both the modeling artifacts that map onto SUI objects relevant to the model development process and the modeling constructs that capture the behavioral features of these artifacts. These correspond, respectively, to the entity structures and the behavior constructs pertinent to the model. The second stage specifies an appropriate level of detail for the entity structures and behavior constructs that have been identified.
The high-level model is presented using a graphical format. This includes
structural diagrams and a collection of life-cycle diagrams that show how
entities move among the behavior constructs. The detailed level model is
presented using tables with predefined formats. The text-based tabular for-
mat provides the important advantage of accommodating arbitrary complex-
ity. In particular, straightforward mechanisms are provided for dealing with
the disruption of entity flow through an activity (e.g., either interruption or
preemption).
The complexity inherent in large systems is best handled by adopting an interacting-subsystem perspective, often organized hierarchically. Such a perspective is naturally reflected in the conceptual modeling process. Extensions to the ABCmod framework that will conveniently accommodate such a hierarchical perspective are currently under way. A
software tool that supports the creation of ABCmod conceptual models is
also currently under development.
7
Conceptual Modeling Notations and Techniques
Contents
7.1 Introduction................................................................................................. 179
7.1.1 Uses of Conceptual Modeling....................................................... 180
7.2 Conceptual Modeling Frameworks, Notations, and Techniques........ 182
7.2.1 KAMA Conceptual Modeling Framework................................. 183
7.2.1.1 KAMA Method................................................................ 184
7.2.1.2 KAMA Notation............................................................... 186
7.2.1.3 KAMA Tool....................................................................... 191
7.2.2 Federation Development and Execution Process (FEDEP)....... 191
7.2.3 Conceptual Models of the Mission Space (CMMS).................... 195
7.2.4 Defense Conceptual Modeling Framework (DCMF)................ 197
7.2.5 Base Object Model (BOM)............................................................. 199
7.2.5.1 Model Identification......................................................... 200
7.2.5.2 Conceptual Model Definition......................................... 200
7.2.5.3 Model Mapping................................................................ 202
7.2.5.4 Object Model Definition.................................................. 203
7.2.5.5 BOM Integration............................................................... 203
7.2.6 Robinson’s Framework................................................................... 204
7.3 A Comparison of Conceptual Modeling Frameworks.......................... 205
Acknowledgments............................................................................................... 207
References.............................................................................................................. 207
7.1 Introduction
Conceptual modeling is a tool that provides a clear understanding of the tar-
get domain or problem. In the simulation system development life cycle, con-
ceptual models should be captured early based on project objectives defining
what is intended and then should serve as a frame of reference for the subse-
quent development phases. The conceptual model can be interpreted as part
of a problem-specification process and defined as a simplified representa-
tion of the real system having the following features: (a) includes structural
neither well understood nor [clearly] expressed,” he suggests that formal problem structuring methods should be utilized (Robinson 2007b). Balci and Nance (1985) define a methodology for problem formulation in simulation system development. Mojtahed et al. (2005) take this approach one step further and treat the conceptual analysis phase as a knowledge engineering activity.
Problem definition in simulation system development is a part of the
requirements analysis phase; therefore conceptual models are the products
of this phase. Specifications, assumptions, and constraints related to the
domain of interest should be included in the conceptual model. These speci-
fications may include the entities, tasks, actions, and interactions among the
entities, which will form a basis for the design phase (DMSO 1997). Pace (1999)
describes the conceptual model as a bridge between requirements analysis
and design phases. Since the boundaries of these phases cannot be sharply
defined, there is confusion over whether the conceptual model is a product
of the user or the designer (Haddix 1998). In order to reduce this confusion,
Haddix defines a conceptual model as “the ultimate definition of the require-
ments” and uses another term, conceptual design, to mean “initial descriptions of the system’s implementation.” However, SISO (2006a) disagrees with
these definitions in its BOM (Base Object Model) standard, stating that the
BOMs are defined to “provide an end-state of a simulation conceptual model
and can be used as a foundation for the design of executable software code
and integration of interoperable simulations.”
Being a product of the requirements analysis phase, conceptual models
should be independent of the software design and implementation decisions
(Sheehan 1998, Pace 1999a, IEEE 2003). This aspect of the conceptual model
is based on a software development viewpoint. Johnson (1998) introduces a
slightly different aspect of the conceptual model as providing a “simulation-
neutral view of the real world.” He suggests that the simulation system–
specific attributes, even if they are not related to the design phase, should be kept out of a conceptual model. Thus, the conceptual model should include the definitions of a simulation system such that it can be realized by different simulation implementations.
It is an established practice in the software engineering field to initiate
verification and validation activities as early as possible in the software
development life cycle. Software requirements specification is used for
ensuring that the developers are producing what the customer really wants.
Similarly, early validation of a simulation system is essential for the success
of a simulation system development project. Conceptual models can be used
as a basis for verification, validation and accreditation activities (Sargent
1987, Haddix 1998). Sargent underlines that the conceptual model should be
structured enough to provide means for validation. However, a more thor-
ough validation will be possible using experimentation after the simulation
system has been completed. Any defects found during verification and
validation activities should be corrected by revisiting the prior phases
including the conceptual modeling phase. Hence, conceptual modeling is
not a one-shot process but rather an iterative one that should be performed
in many cycles throughout a simulation system development study (Balci
1994, Willemain 1995).
models and a tool for supporting the process and the notation (Karagöz and Demirörs 2007, Karagöz 2008). It was developed as part of a research project performed in collaboration with academia, industry, and the military, and the framework was validated through case studies (Karagöz 2008) and real-life simulation system development projects (Karagöz et al. 2008).
Figure 7.1
Flow diagram for the KAMA method: acquire knowledge, define context, develop content diagrams, then verify and validate the model (iterating until the model is verified), with the Sponsor, Modeller, SME, and Reviewer roles participating. (Based on Karagöz, N.A., A Framework for Developing Conceptual Models of the Mission Space for Simulation Systems, PhD thesis, Middle East Technical University, Department of Information Systems, 2008.)
Figure 7.2
KAMA metamodel elements. (From Karagöz, N.A., A Framework for Developing Conceptual Models of the Mission Space for Simulation Systems, PhD thesis, Middle East Technical University, Department of Information Systems, 2008.)
task flow, entity ontology, entity relationship, entity state, command hierar-
chy, and organization structure diagrams.
A sample mission space diagram presented in Figure 7.4, which looks sim-
ilar to a UML use case diagram, shows the high-level missions of a package
in a simulation system. Three roles have been specified that are responsible
for realizing the missions. Roles may stand for real-life people, such as the commander in our sample, or actively participating entities, such as
a sensor or a platform. The Perform Mine Hunting mission includes the Detect
Mines mission and is extended by two different missions. The extending
missions are (a) Hunt Mine With Unmanned Undersea Vehicle (UUV) and
(b) Hunt Mine With Acoustic Mine. These missions share the same objec-
tive but use different techniques or tools. UUVs are used to destroy mines
by remote operations. Acoustic mines are used to trigger and destroy other
mines by making them explode using acoustic waves. For each mission to be
Figure 7.3
KAMA package hierarchy. (From Karagöz, N.A., A Framework for Developing Conceptual Models of the Mission Space for Simulation Systems, PhD thesis, Middle East Technical University, Department of Information Systems, 2008.)
Figure 7.4
Example mission space diagram. (From Karagöz, N.A., A Framework for Developing Conceptual Models of the Mission Space for Simulation Systems, PhD thesis, Middle East Technical University, Department of Information Systems, 2008.)
Figure 7.5
Example task flow diagram. (From Karagöz, N.A., A Framework for Developing Conceptual Models of the Mission Space for Simulation Systems, PhD thesis, Middle East Technical University, Department of Information Systems, 2008.)
arrival of the Mine Hunting Order input. All of these tasks except for “Detect
mine” are realized by the Commander role as specified in the Mission Space
diagram; therefore, the “realizes” relation is shown on the diagram only for the “Detect mine” task.
The shaded tasks denote the existence of task flow diagrams, which
include the details of these tasks. The Mine Information output, which is
partly produced by the Detect Mine task and then updated by the Identify
Mine task, is used as an input to the “Determine estimated effective hunting
method” task. The Hunt Mine task is an extension point with extensionId equal to 5. The extensionId information is not shown on the diagram but
recorded as an attribute of the task. The two extending missions that are
shown in Figure 7.4 extend the Perform Mine Hunting mission depending
on the selected hunting method. A Mine Hunting Report is produced as a
result of the execution of the Hunt Mine task. The variations in the task flow
diagram are represented with decision points, which may have any number
of outgoing control flows. However, the guard conditions shown on the outgoing control flows must not contradict each other.
Figure 7.6
FEDEP high-level process flow (including the steps: 1, define federation objectives; 3, design federation; 4, develop federation; 6, execute federation and prepare outputs). (Reproduced from IEEE Computer Society, IEEE 1516.3, Recommended Practice for High Level Architecture (HLA) Federation Development and Execution Process (FEDEP), 2003.)
model is defined as “the document that describes what the federation will
represent, the assumptions limiting those representations, and other capa-
bilities needed to satisfy the user’s requirements. Federation conceptual
models are bridges between the real world, requirements, and design.”
In order to comply with these definitions, this step of the FEDEP process
begins with developing federation scenarios that are based on the federation
requirements. The relationships between the activities of this step, the consumed inputs, and the produced outputs are depicted in Figure 7.7. Federation scenarios define the boundaries of conceptual modeling activities. Authoritative
information sources should be identified prior to scenario construction. A
federation scenario includes “the types and numbers of major entities that
must be represented by the federation, a functional description of the capa-
bilities, behavior, and relationships between these major entities over time,
Figure 7.7
Perform conceptual analysis steps: 2.1, develop (federation) scenario (inputs: federation objectives, existing scenarios, authoritative domain information); 2.2, develop federation conceptual model; 2.3, develop federation requirements (inputs: federation objectives, developed federation scenario, developed federation conceptual model). (Reproduced from IEEE Computer Society, IEEE 1516.3, Recommended Practice for High Level Architecture (HLA) Federation Development and Execution Process (FEDEP), 2003.)
Based on these findings, the objectives of DCMF were defined as “to cap-
ture authorized knowledge of military operations, to manage, model and
Figure 7.8
DCMF process: Main phases. (Reproduced from Mojtahed, V., Lozano, M.G., Svan, P., Andersson, B., and Kabilan, V., Technical Report, FOI-R—1754—SE, 2005.)
readable and unambiguous format. This can be done by using methods like
SPO (Subject-Predicate-Object), 5Ws (Who-What-Where-When-Why), and
KM3, which are explained in Mojtahed et al. (2005). These analyses will
result in an ontology, which consists of the context of the domain, the defini-
tions of terms and their relationships and interactions.
The Knowledge Modeling (KM) phase focuses on the semantic analysis and modeling of the information. Although the previous KR phase may produce usable artifacts, building a common general model at the right level of
abstraction requires further study. Different models can be generated based
on the same set of data. These models should be suitable for future use and
reuse. In order to provide this facility, the DCMF proposes using knowledge
components that represent smaller knowledge parts. This approach pro-
vides flexibility and increases the reusability and composability of conceptual
models. Knowledge modeling also involves the merging of these knowledge
components or conceptual models; therefore it will be a good idea to store
these artifacts in a knowledge repository.
The last phase of the DCMF process is Knowledge Use (KU), which deals
with the actual use of the artifacts produced as a result of the previous
phases. DCMF suggests using effective mechanisms that provide different
visualizations of the knowledge for various users. These users may include
the sponsor, consumer, producer and controller. The original intent of a
knowledge component and any changes made to it should be recorded for
an effective usage mechanism.
KM3 is at the same time a specification, a tool and a language. KM3 is a
specification for the creation of generic and reusable conceptual models. It is
a tool for structuring knowledge in the form of generic templates. It is a com-
mon language that enables different stakeholders to cooperate in developing conceptual models. KM3 follows an activity-centric approach and represents activities
as KM3 actions. KM3 specification includes both static and dynamic descrip-
tions. The static descriptions are specified by the attributes of an object
whereas the dynamic descriptions are specified by the inclusion of rules into
the object descriptions. All changes to model elements are described by rule
definitions, which specify the conditions under which an action starts and
ends. A rule is composed of an activity role and an atomic formula. Atomic
formulas can be combined conjunctively (AND-Connections) or disjunctively (OR-Connections) to create complex formulas.
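Such a rule condition can be sketched as atomic formulas combined conjunctively (AND) or disjunctively (OR); the helper names below are illustrative and not part of the KM3 specification.

```python
# Illustrative rendering of a KM3-style rule: atomic formulas combined
# conjunctively (AND) or disjunctively (OR) into a complex condition
# that determines when an action starts or ends.
def atomic(name, predicate):
    """An atomic formula: a named predicate over the model state."""
    return {"name": name, "check": predicate}

def conjunction(formulas):
    """AND-connection: every atomic formula must hold."""
    return lambda state: all(f["check"](state) for f in formulas)

def disjunction(formulas):
    """OR-connection: at least one atomic formula must hold."""
    return lambda state: any(f["check"](state) for f in formulas)

# An action may start when the entity is ready AND a resource is free:
start_rule = conjunction([
    atomic("entity_ready",  lambda s: s["ready"]),
    atomic("resource_free", lambda s: s["free"] > 0),
])
print(start_rule({"ready": True, "free": 1}))  # True
```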
Figure 7.9
BOM composition: pattern of interplay, state machine, entity type, and event type components; entity type and event type mappings; HLA object classes, HLA interaction classes, and HLA data types; and a lexicon of definitions. (Reproduced from Base Object Model (BOM) Template Specification, SISO-STD-003-2006, 2006a.)
may be represented, the entity type and event types defined in the concep-
tual model. This definition closely matches the FEDEP steps related to conceptual modeling and provides a description of what the simulation
component, simulation or federation “will represent, the assumptions limit-
ing those representations and other capabilities needed to satisfy the user’s
requirements” (IEEE 2003).
The pattern of interplay template component is used to identify the sequences
of actions necessary for fulfilling the pattern of interplay that may be repre-
sented by a BOM. In addition to the main course of events, the variations and
exceptions are also represented as pattern descriptions. A pattern of interplay
may be composed of many pattern actions each of which includes the sender
entity, the receiver entity and optionally the variations and exceptions.
The state machine template component is used to identify the behavior
states of a conceptual entity that are required to support one or more pat-
terns of interplay. BOM DIF defines the State Machine Table for describing
one or more state machines. A state machine table includes the name of the
state machine, the conceptual model entities that support the states defined,
and the behavior states that are supported by a conceptual entity. A name
and exit condition is defined for each state. Each exit condition identifies an
exit action and the next state upon satisfying the exit action.
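Such a state machine table can be sketched as a mapping from each state to its exit condition and successor state; the "waiter" entity and its states below are hypothetical illustrations, not part of the BOM standard.

```python
# Hypothetical state machine table in the spirit of the BOM description:
# each state maps to its exit condition (exit action) and next state.
WAITER_STATES = {
    "Idle":           ("TableAssigned", "TakingOrder"),
    "TakingOrder":    ("OrderPlaced",   "WaitingForFood"),
    "WaitingForFood": ("FoodReady",     "Serving"),
    "Serving":        ("FoodDelivered", "Idle"),
}

def next_state(state, action):
    """Advance the entity only if the action satisfies the current
    state's exit condition; otherwise remain in the same state."""
    exit_action, successor = WAITER_STATES[state]
    return successor if action == exit_action else state

print(next_state("Idle", "TableAssigned"))  # TakingOrder
```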
The entity type template component provides a mechanism for describing
the types of entities. It is used to identify the conceptual entity types required
to support the patterns of interplay and executing the various state machines.
An entity type may play the role of a sender or receiver in a pattern of inter-
play or may be associated with a state machine. The entity type is identified
by a name and associated characteristics. An example entity type may be a
“waiter” having the “name” and “assigned tables” as characteristics.
The event type is used to “identify the type of conceptual events used to
represent and carry out the actions, variations, and exceptions within a pattern of interplay.” The two types of BOM events are BOM Triggers and BOM
Messages, which represent undirected and directed events, respectively. In an
undirected event the sender of the event is known but the receiver is not spec-
ified, so that any entity that has interest may receive the event. In a directed
event both the sender and receiver entities are specified. A BOM trigger is an
undirected event that may occur as a result of a change in the state of an entity
and affects other entities that have interest in such observable changes. For a
BOM trigger, the source entity and the trigger condition are known, but the
target entities cannot be identified. A BOM Message is a directed event that
identifies both of the source and target entities. A Message is an event type
with a target entity, and a trigger is an event type with a trigger condition.
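The distinction reduces to whether a receiver is named. A minimal sketch (the class and field names are illustrative, not BOM DIF syntax):

```python
from dataclasses import dataclass

@dataclass
class BomTrigger:
    """Undirected event: the source and trigger condition are known,
    but no target entity is identified; any interested entity may
    receive it."""
    source: str
    trigger_condition: str

@dataclass
class BomMessage:
    """Directed event: both the source and target entities are
    specified explicitly."""
    source: str
    target: str
    content: str

trig = BomTrigger(source="Mine", trigger_condition="state == DETECTED")
msg = BomMessage(source="Commander", target="Platform", content="hunt mine")
print(type(trig).__name__, msg.target)  # BomTrigger Platform
```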
elements of the Object Model Definition. The two types of mapping sup-
ported are Entity Type Mapping and Event Type Mapping.
The entity type mapping is used to map entity types and their associated
characteristics to class structures. An entity type is mapped into an HLA
object class or HLA interaction class, and characteristics of an entity type are
mapped to HLA attributes or HLA parameters.
An event type is mapped into an HLA object class or HLA interaction
class. Source characteristics, target characteristics, content characteristics,
and trigger condition of an event type are mapped to HLA attributes or HLA
parameters.
These mappings are means for transforming conceptual model elements
into object model elements.
describes the depth of the model. The scope of the model can be described in
terms of the entities, activities, queues and resources. The level of detail for
these components can be determined by the judgment and past experience
of the modeler, analysis of preliminary data and prototyping. During this
process various assumptions and simplifications may be made. These are
recorded and classified as high, medium, or low depending on their impact
on the model responses.
Robinson demonstrates this framework with a modeling application at
Ford Motor Company engine assembly plant (Robinson 2007b). He proposes
assessing the model by checking the validity, credibility, utility and feasibil-
ity of the model. Robinson also points out the importance of expressing the
modeler’s mental model as a communicative model and states the useful-
ness of diagrammatic representations for this purpose. He lists some of the
possible diagrammatic notations, however, does not impose any of them for
use with his framework. Robinson defines the conceptual modeling as art
and states that the framework brings some discipline to that art. The artistic
characteristic of conceptual modeling, combined with the different perspec-
tives of the modelers and the domain experts make it impossible to define an
absolutely right conceptual model. Therefore, Robinson suggests a concep-
tual modeling framework should provide a means for communicating and
debating on conceptual models rather than aiming to develop a single best
conceptual model.
Table 7.1
Conceptual Modeling Approaches
Approach   Method/Process   Notation   Tool Support
FEDEP   Includes a process definition intended for HLA   No specific notation is imposed   No tool support
CMMS (FDMS)   Process definition does not include detailed guidance   Common lexicon is defined. Data Interchange Format is defined.   No tool support
DCMF   Includes a process definition   Includes KM3 notation for representing conceptual models   Existing UML modeling tools and ontology tools can be used
BOM   Process definition does not include detailed guidance   Includes a text-based syntax and semantics definition. UML may also be used.   BOMworks tool has been developed (BOMworks 2009)
Robinson   Includes a process definition   No specific notation is imposed, but diagrammatic notations are suggested   No specific tool is imposed. Existing graphical modeling tools can be used
KAMA   Includes a process definition   UML-based graphical notation is defined   A graphical modeling tool has been developed (Karagöz et al. 2005)
is active ongoing work on the tool support. DCMF complies with the three
parameters of the framework definition with a focus on the KA activities of
the conceptual modeling phase. Robinson does not define a specific notation
for conceptual modeling but proposes using diagrammatic representation
techniques, and does not mention tool support. KAMA defines a
notation specific to the conceptual modeling domain and includes a detailed
process definition. The KAMA tool can be used for developing, sharing and
verifying conceptual models.
All of these frameworks have some common limitations. It is difficult to
provide a generic framework that is appropriate for all types of problem
domains, because of their distinct requirements and objectives. Metamodel-based
notations may offer a solution to this problem by means of modifiable
metamodels. However, in such a case the modelers should thoroughly
analyze the tradeoff between a best-fit metamodel and a more general meta-
model that allows more flexibility and reusability.
Conceptual Modeling Notations and Techniques 207

Conceptual models represented in diagrammatic notations are known to
provide better understanding and communication; however, as these diagrams
become complicated, these advantages are lost and cognitive issues
arise (Kılıç et al. 2008). Diagrams with dynamically adjusted abstraction
levels, or multidimensional viewing features, may be utilized to overcome
these issues. The different perspectives of the conceptual modelers and the
domain experts make it almost impossible to define the absolutely right
conceptual model, which may also be considered a cognitive issue.
Acknowledgments
The section on the KAMA framework is mostly reproduced from: Karagöz,
N.A. 2008. A framework for developing conceptual models of the mission
space for simulation systems, PhD thesis, Middle East Technical University,
Department of Information Systems.
Some sections of this chapter are derived from the following resources:
References

Aysolmaz, B. 2007. Conceptual model of a synthetic environment simulation system developed using extended KAMA methodology. Technical Report 2006–2007: 2–17. Informatics Institute, Middle East Technical University.
Balci, O. 1994. Validation, verification, and testing techniques throughout the life cycle of a simulation study. Annals of Operations Research 53: 121–173.
Balci, O., and R.E. Nance. 1985. Formulated problem verification as an explicit requirement of model credibility. Simulation 45 (2): 76–86.
Borah, J. 2007. Informal simulation conceptual modeling: Insights from ongoing projects. In Proceedings of the Simulation Interoperability Workshop. www.sisostds.org
208 Conceptual Modeling for Discrete-Event Simulation
David Haydon
Contents
8.1 Introduction................................................................................................. 212
8.2 Software Project Life Cycle....................................................................... 212
8.3 Requirements.............................................................................................. 213
8.3.1 Contents of the Requirements Document................................... 215
8.3.2 Purpose of the Development......................................................... 215
8.3.3 Stakeholders..................................................................................... 216
8.3.4 Study Objectives............................................................................. 216
8.3.5 Overview of the System to be Modeled....................................... 217
8.3.6 System Perspective......................................................................... 217
8.3.7 General Requirements and Constraints...................................... 217
8.3.8 Specific Requirements.................................................................... 217
8.3.9 Summary of the Requirements Phase......................................... 218
8.4 Design........................................................................................................... 218
8.4.1 Contents of the Design Document............................................... 218
8.4.2 Purpose of the Development......................................................... 219
8.4.3 Stakeholders..................................................................................... 219
8.4.4 System Perspective......................................................................... 219
8.4.5 Overview of the System to be Modeled....................................... 219
8.4.6 Method of Analysis........................................................................ 219
8.4.7 Simulation Structure...................................................................... 220
8.4.8 Detailed Design............................................................................... 221
8.4.9 Inputs and Outputs........................................................................ 221
8.4.10 Summary of the Design Phase......................................................222
8.5 Implementation...........................................................................................222
8.6 Verification ..................................................................................................222
8.7 Validation.....................................................................................................223
8.8 Example of the Methodology....................................................................223
8.8.1 Requirements.................................................................................. 224
8.8.2 Simulation Structure...................................................................... 224
8.1 Introduction
This chapter presents a methodology for the design and implementation of
a discrete-event simulation model. It is not the only way to implement such
a model, nor is it necessarily the best—but it is in use and it works. (It is also
used for other types of study, although the general approach is modified to
suit the method of analysis to be used.)
The methodology has been developed over a number of years through trial
and error. Ideas have been culled from a variety of sources. Some have been
tried, found not to be useful and have been dropped. Other ideas have been
found useful and have been kept. Others have been modified or parts of
them used. The resulting methodology covers all aspects of the design and
development of a simulation model—from requirements through design and
development to testing. Just as this methodology has been constructed from
pieces taken from various sources, it is suggested that the reader take those
elements of this approach that are useful to them and incorporate them into
the reader’s own approach.
This approach is consistent with BS EN ISO 9001, a quality assurance
standard, although the details of the required procedures and documentation
have been omitted for simplicity.
8.3 Requirements
This section considers the requirements phase of the software project life
cycle. We discuss the purpose of the Requirements Document and outline
its contents.
The purpose of the requirements phase is to document the requirements
for the model. This may sound obvious but it is important to consider why
the model is required. In the commercial world, models are developed to
help address a specific problem and that will define the timescales and bud-
get of the model development. It will also define the accuracy required from
the project and hence the accuracy required from the model. The “Why?”
is therefore to address some specific problem to the required accuracy and
within the given timescales and budget. This also has implications for the
Figure 8.1
The iterative waterfall software life-cycle model. (The figure shows the phases Bid, Requirements, Design, Implement, Increment, and Use, together with Project management, Verification, and Validation.)
correct the analyst’s understanding of the system and how it works. Since
the Requirements Document is intended as a definitive statement of what
the model will do it is important that the individual requirements be
clearly identified—and ideally uniquely numbered for ease of reference.
Requirements should be identified as mandatory, desirable, or optional,
defined as follows:
The words shall, should, and may, respectively, can be used in the formal
statements of requirement to help differentiate between the requirement
types.
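As a small illustration, the shall/should/may convention can even be checked mechanically. The function and requirement identifiers below are hypothetical and not part of the methodology; they simply show the mapping between requirement types and modal verbs that the text describes.

```python
# Conventional modal verb for each requirement type, as described in the text.
MODAL_BY_TYPE = {
    "mandatory": "shall",
    "desirable": "should",
    "optional": "may",
}

def check_requirement(req_id: str, req_type: str, text: str) -> bool:
    """Return True if the requirement text uses the modal verb
    conventionally associated with its type."""
    modal = MODAL_BY_TYPE[req_type]
    return f" {modal} " in f" {text.lower()} "

print(check_requirement("R-012", "mandatory",
                        "The model shall report message dwell time"))  # True
print(check_requirement("R-013", "optional",
                        "The model shall support animation"))          # False
```

A check of this kind is no substitute for review, but it helps keep the wording of uniquely numbered requirements consistent with their declared types.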
The requirements phase does not include design decisions. All elements of
the design must be left until the design phase.
• Which scenarios
• Which C2 decisions
• What information will be required to make those decisions
• How should the information be generated and displayed
8.3.3 Stakeholders
This section should contain a list of the stakeholders, where a stakeholder
is a person or organization with an interest in the development or use of
the model. A stakeholder may be a person or organization with information
about the system to be modeled.
customer can confirm that the study objectives are correct (assuming that the
document is reviewed by the customer).
The study objectives include timescales, available budget, and required
accuracy.
per subsystem. The specific requirements detail the aspects of the real-world
system that are to be included in the model. The detail of some specific
requirements may be included as annexes, for example, details of particular
algorithms to be used or the data formats of external data sources.
8.4 Design
This section considers the design phase of the software project life
cycle. We discuss the purpose of the Design Document and outline its
contents.
The purpose of the design phase is to document the design of the model.
The design is “how” the requirements are to be met. The Design Document
provides the following:
The Design Document is not normally provided to the customer. The customer
is concerned with what the model does, not how it does it.
• Detailed Design
• Inputs and Outputs
8.4.3 Stakeholders
This is essentially a repeat of the stakeholder list contained in the
Requirements Document. It is a useful reference to help identify source(s)
of information that can be used to resolve design problems.
The system overview is contained in Ref.…. It is repeated here for ease of refer-
ence but Ref.… remains the definitive statement.
tank becomes full. The system (or that part of it) would change state, say,
from “being filled” to “full,” and we would expect the flow to be stopped.
Simulation activities are the conditional activities that take place when cer-
tain conditions are met. Activities can only start at the time of a simulation
event. This is because if the conditions have not been met at a given time
then the system state is such that the activity cannot start. The conditions
will remain unfulfilled until there is a change in the state of the system.
Since the system state only changes when there is an event, the conditions
can only be met when there is an event. The design of the simulation is the
process of selecting a set of simulation entities that can generate all of the
required simulation events.
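The principle described here, that conditional activities need only be attempted at event times because the state can change only when an event occurs, can be sketched as a minimal three-phase style loop. This is an illustrative sketch only, not the chapter's design or any particular package; all names are invented, and the tank example follows the text above.

```python
import heapq

class ThreePhaseSim:
    """Minimal sketch: conditional activities are attempted only at event
    times, because the system state (and hence the conditions) can change
    only when an event occurs."""
    def __init__(self):
        self.now = 0.0
        self.events = []          # heap of (time, seq, handler) scheduled events
        self.conditional = []     # zero-argument callables tried at each event
        self._seq = 0

    def schedule(self, delay, handler):
        heapq.heappush(self.events, (self.now + delay, self._seq, handler))
        self._seq += 1

    def run(self):
        while self.events:
            self.now, _, handler = heapq.heappop(self.events)  # advance the clock
            handler()                                          # execute the event
            for activity in self.conditional:                  # try conditionals
                while activity():  # repeat while its condition holds
                    pass

# Illustrative use: a tank filled by scheduled events; the conditional
# activity "stop the flow" can fire only once the tank becomes full.
state = {"level": 0, "flow_stopped": False}

def add_water():
    state["level"] += 1

def stop_flow_if_full():
    if state["level"] >= 3 and not state["flow_stopped"]:
        state["flow_stopped"] = True
        return True
    return False

sim = ThreePhaseSim()
for t in (1, 2, 3):
    sim.schedule(t, add_water)
sim.conditional.append(stop_flow_if_full)
sim.run()
print(state["flow_stopped"])  # True
```

Note that `stop_flow_if_full` is never polled between events; it is re-examined only when an event has just changed the state, which is exactly the argument made in the text.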
The approach to designing the simulation structure is to list all of the
required events—including virtual events that do not exist in the real world
but are required by the model. Then, for each event, list all of the entities that
are involved with that event. Activity Cycle Diagrams (ACD) are a useful
tool in identifying events, entities, and activities. If every event were
associated with one and only one entity, then the structure would be
complete: every entity is included in the simulation structure.
In practice, most events will involve multiple entities. Designing the sim-
ulation structure is a matter of selecting a subset of entities that cover all
of the events while minimizing the number of events associated with more
than one of the selected entities. The selection is done by trial and error,
guided by skill and experience. (We did say that it has not been possible
to find a method for designing the optimum simulation structure!) The
process can sometimes be simplified by identifying subsets of the system
where a single entity can be selected and then removing that subset from
consideration.
This approach has the advantage of providing a systematic approach to
the problem of designing the simulation structure. It does not necessarily
provide a simple solution to the problem but it does ensure that nothing is
overlooked.
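The covering step described above can be illustrated with a simple greedy heuristic. This is my illustration, not the chapter's method: the text is explicit that the selection is done by trial and error guided by skill and experience, and a greedy choice need not be optimal. The event and entity names are hypothetical.

```python
def select_entities(events_to_entities):
    """Greedy sketch of the covering step: pick entities until every
    required event is associated with at least one selected entity.
    (One illustrative heuristic; not guaranteed to minimize the number
    of events shared by multiple selected entities.)"""
    uncovered = set(events_to_entities)
    selected = []
    # Invert the mapping: which events does each entity take part in?
    entity_events = {}
    for event, entities in events_to_entities.items():
        for entity in entities:
            entity_events.setdefault(entity, set()).add(event)
    while uncovered:
        # Choose the entity covering the most still-uncovered events.
        best = max(entity_events, key=lambda e: len(entity_events[e] & uncovered))
        selected.append(best)
        uncovered -= entity_events[best]
    return selected

# Hypothetical mapping from required events to the entities involved in them.
mapping = {
    "message_arrives":   ["message"],
    "processing_starts": ["message", "staff", "workstation"],
    "processing_ends":   ["message", "staff"],
    "shift_change":      ["staff"],
}
print(select_entities(mapping))  # ['message', 'staff']
```

The listing of events and their entities, which the sketch takes as input, is the systematic part of the approach; the choice among covering subsets remains a matter of judgment.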
8.5 Implementation
Implementation is the process of turning the design into a working model.
It depends on the simulation package in use and is outside the scope of this
chapter.
8.6 Verification
Verification is the process of testing that the model as implemented con-
forms to the design. Model verification can be undertaken by using simple
data sets for which the expected results can be easily calculated. These data
sets are likely to exercise limited areas of the model. Data sets should be
selected that, between them, cover all of the functionality of the model. The
results of runs using these data sets should be compared with the expected
results.
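As an illustration of verifying against easily calculated expected results, consider a deterministic single-server model simple enough to check by hand. This is a hypothetical sketch, not the chapter's model; the data sets and numbers below are invented for the example.

```python
def single_server_completion_times(arrivals, service_time):
    """Deterministic single-server model: each job starts when it has
    arrived and the server is free. Simple enough that the expected
    results can be calculated by hand, as verification requires."""
    free_at = 0.0
    done = []
    for arrival in arrivals:
        start = max(arrival, free_at)
        free_at = start + service_time
        done.append(free_at)
    return done

# Simple verification data set: arrivals every 5 minutes, 4-minute service.
# By hand: no queueing, so each job finishes 4 minutes after it arrives.
print(single_server_completion_times([0.0, 5.0, 10.0], 4.0))  # [4.0, 9.0, 14.0]

# A second data set that does queue: arrivals every 2 minutes.
# By hand: 4, 8, 12 -- each job after the first waits for the server.
print(single_server_completion_times([0.0, 2.0, 4.0], 4.0))   # [4.0, 8.0, 12.0]
```

Each data set exercises a limited area of behavior (no queueing, then queueing); together they cover the basic functionality, and the run results can be compared directly with the hand calculations.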
Verification of the complete functionality of the model can be performed
by combining selected simple data sets and checking that the results are
consistent with the results of the component runs. (Note that interactions
between model elements would typically increase waiting times and reduce
the combined throughputs.) The complexity of the test data can be gradually
increased until the data sets are similar to those that will be used for the
project.
The final verification tests are soak tests: a series of tests with inputs much
higher than those that would be used in a study, and a series of very long
runs. Both series of tests are intended to ensure that the model behaves
properly under extreme conditions.
The results of verification testing should be documented in the Verification
and Validation Log.
Conceptual Modeling in Practice: A Systematic Approach 223
8.7 Validation
Validation tests that the model is fit for purpose:
Due to the complexity of the message handling, it has been decided that the
design project will require analytical support and that a simulation model is
the most cost-effective way to provide that support.
8.8.1 Requirements
Following a Requirements Analysis, the following key requirements have
been identified:
• Messages
• Manual processes
• Automatic processes
• Staff
• Centre systems
• External systems
• Workstations
However, messages are the only one of the above entities that is involved
in all of the main activities. Also, the message dwell time (how long a
message remains in the Centre) is one of the main outputs required. Thus the
main simulation entity should be the message. (This leaves shift changes and
dynamic reallocation to be handled separately.)
performed by Centre staff and Centre and external systems. This would
answer the above disadvantages as follows:
The resulting design is by no means the only design that could have been
produced, and if the key requirements had been different, a different design
may have been developed. Just as the structure of the model should mir-
ror the structure of the real-world system, it should also mirror the study
requirements.
8.9 Summary
In this chapter we have shown how software engineering principles can be
applied to conceptual modeling for discrete-event simulation models. Those
principles have been modified to reflect the qualitative differences between
simulation models and other types of software applications.
The methodology starts with the requirements phase, which addresses
the “Why?” question. Why do we need a model and what should it do?
The “Why?” question applies to every phase of the model development. The
output from the requirements phase is the Requirements Document. The
Requirements Document has two uses: as an input to the design phase and
as a clear statement of what the model is intended to do that can be reviewed
and approved by the customer.
The requirements phase is followed by the design phase. The output from
the design phase is the Design Document, which documents the design,
including why it was decided to use a simulation model and a discussion of
the design of the simulation structure. Design is followed by implementa-
tion, and then by verification and validation.
The methodology described in this chapter is not the only conceptual
modeling methodology, nor is it necessarily the best. But it has been used in
the development of many simulation models in a commercial environment
and has proven useful.
Part III
Michael Pidd
Contents
9.1 Introduction................................................................................................. 231
9.2 Problem Structuring................................................................................... 233
9.2.1 Complementarity............................................................................234
9.2.2 Informal Problem Structuring: Critical Examination............... 236
9.3 Formal Problem Structuring Methods.................................................... 238
9.4 Soft Systems Methodology........................................................................ 240
9.4.1 The Overall Approach of SSM...................................................... 240
9.4.2 Understanding a Perceived, Real-World Problem Situation..... 243
9.4.3 Power-Interest Grids....................................................................... 244
9.4.4 Root Definitions.............................................................................. 245
9.5 Using Root Definitions............................................................................... 246
9.5.1 Root Definitions for the CaRCs..................................................... 247
9.5.2 Root Definitions for the Simulation Study.................................. 249
9.6 Using Root Definitions to Support Conceptual Modeling................... 250
Acknowledgments............................................................................................... 251
References.............................................................................................................. 251
9.1 Introduction
This chapter argues that simulation analysts should carefully consider the
context for their technical work before starting to build a simulation model.
It is common for analysts to complain that, though their work was excellent,
it was never used or implemented because of what they refer to, somewhat
dismissively, as “organizational politics.” Rather than dismiss such politics,
which some people regard as part of any organization, it is better to use
methods that help an analyst to understand how the power and interests of
different stakeholders can affect the outcome of their work. That is, analysts
need to develop skills that enable them to accommodate to organizational
The programs for simulation events such as the annual Winter Simulation
Conference (http://www.wintersim.org/) also focus on the same topics,
though they also include reports of work in particular application domains
such as manufacturing, health care, aerospace, and criminal justice.
When starting to work in any application domain, a simulation analyst
needs to take her understanding of modeling, statistics, and computing
and bring it to bear in the domain. When starting work in a domain
that is wholly new to them, all simulation analysts experience some
confusion in which they are unsure what level of detail is required and
what should be included or excluded from the model and the modeling.
Making Sure You Tackle the Right Problem 233
Obviously, this becomes easier with experience in the application domain,
as the analyst learns which features are important. However, there is also
a danger that an analyst with long experience in an application domain
starts to take things for granted that, later, turn out to be important.
Hence it is important that analysts carefully consider what elements
should be included in their study—no matter how familiar they are with
the domain.
This chapter presents, briefly, some of the main ideas in problem structur-
ing and discusses how they can be useful in conceptual modeling. It intro-
duces an informal approach to problem structuring and lists some of the
formal approaches advocated in the literature and then continues with a
more detailed exposition of soft systems methodology (SSM). Finally, it uses
a real-life simulation study conducted for a UK police force to show how
aspects of problem structuring methods (PSMs) can be useful in practice.
Figure 9.1
Two views of problem structuring.
the world and what they are hoping to achieve from the project. It continues
through the project right up to implementation, supposing that this occurs,
but is the main focus of the early stages of a typical simulation project. Pidd
and Woolley (1980) report that though OR analysts were concerned to prop-
erly structure the problems they were tackling, there was no real evidence of
them using formal methods to help with this.
The second use of the term relates to tackling “wicked problems” (Rittel
and Webber 1973). These are characterized by clashing objectives, a shortage
of data, and multiple stakeholders who may have very different opinions
from one another on what is desirable. Such wicked problems are, in essence,
unsolvable, in the sense of complete resolution or improvement. However,
it is usually still possible to make progress in their resolution by
structuring the interrelated issues in such a way that stakeholders
can hold an intelligent debate about what might be done. The formal PSMs
described in Rosenhead and Mingers (2001) are techniques and approaches
that can be used to structure such debate and discussion. This type of prob-
lem structuring is a deliberate contrast with the idea of problem solving,
since there is no assumption that problems can be solved in any permanent
sense; rather, the aim is to enable stakeholders to make progress when faced
with wicked problems. This second use of the term problem structuring is
now more common than the first and can be seen as an attempt to introduce
procedural rationality (Simon 1972, 1976) into tackling wicked problems.
That is, this form of problem structuring provides a systematic way
to collect information, to debate options, and to find some acceptable way
forward.
9.2.1 Complementarity
In recent years, it has become clear that the same methods developed for
structuring wicked problems can also serve as a preliminary to formal
modeling; that is, they can help with the first type of problem structuring. It
might be argued that this amounts to overkill if the simulation project is very
simple and straightforward. However, it is not at all unusual for what seems
like a simple simulation project to become more and more complex as work
proceeds. This can happen for many reasons as stakeholders become aware
that work is underway and, not unreasonably, wish to have their voice heard.
Hence, the argument of this chapter is that conducting formal problem struc-
turing is valuable in almost all simulation projects.
The subject of this book is conceptual modeling, which is the process
of understanding what might be included in a model and representing
this in a way that is relatively independent of the simulation software
to be used. Some writers (e.g., Robinson 2004, 2008) insist that the con-
ceptual model must always be independent of the software being used,
but that seems too stringent a requirement given the inclusive nature of
much simulation software. Pidd (2009) defines a model as “an external
and explicit representation of part of reality as seen by the people who
wish to use that model to understand, to change, to manage, and to con-
trol that part of reality.” Since it is only part of reality, a model will always
be a simplification; some things will be included, others will be excluded.
Before deciding what should be in and what should be out, it makes sense
to consider the context of the proposed work and this is the role of PSMs
in simulation.
When problem structuring approaches are used in combination with
analytical approaches such as computer simulation, it is sensible to regard
the two approaches as complementary (Pidd 2004). It is, though, impor-
tant to realize that such complementary use is based on the mixing of
paradigms and methodologies (Mingers and Gill 1997) and that care is
needed when doing so. Detailed discussions of this complementary use
can be found in Pidd (2004), which reports on a research network involv-
ing both academics and practitioners in the UK established to consider
the difficulties and challenges. Kotiadis and Mingers (2006) discuss some
of the challenges faced when attempting to link PSMs with "hard" OR,
specifically with discrete-event simulation modeling in health care, and
are optimistic about such complementarity. Pidd (2009) compares and
contrasts formal PSMs with more classical management science techniques,
including simulation.
From this point on the term problem structuring applies to the use of sys-
tematic approaches to help diagnose a problem and understand the main
issues as a prelude to detailed simulation modeling. The aim is to find
ways to implement John Dewey’s maxim (quoted in Lubart 1994): “A prob-
lem well put is half solved.” It seems as if he had in mind that a poorly
posed problem will be very hard, if not impossible, to solve—as expressed
in the title of this chapter: making sure that you tackle the right prob-
lem. There can, of course, be no guarantee of this, but problem structuring
approaches can help reduce the risk of working on the wrong problem.
As with simulation modeling itself, users of PSMs grow more expert in
their use as their experience develops. There is, though, no silver bullet,
no magic formula that will guarantee the correct diagnosis of a problem
in such a way that the right simulation model is built and that this is used
appropriately.
These make a very good starting point for considering the main aspects of a
problem for which a simulation approach is being considered.
The first question in the verse revolves around what. Of course, there are
many different questions that could be asked, which begin with what. The
most obvious and one for which there is rarely a straightforward answer
without working through all six questions is, “What’s going on?” or “What’s
the problem we need to work on here?” It is perhaps better to ask, “What are
the main issues that concern people?” In a manufacturing simulation, these
might include some or all of cost reduction, uniform high quality, integrat-
ing work centers, or reducing stocks. In a simulation of a call center, they
might include some or all of meeting performance targets for answering
calls, establishing equipment needs, designing a call routing system, and
determining a shift pattern. Note that these issues are rarely independent
and may be in conflict with one another. At the early stage of a simulation
project, it is important to simply identify these issues and to keep them in
mind as part of the development of a conceptual model.
The second question starts with why. Perhaps the most common variants
on this are to ask, “Why are these issues important?” “Why do particular
people think these are important?” and “Why is this important now?” Of
course, the latter two questions spill over into the who and when questions.
It is not unusual for problems to be known, but not tackled. Sometimes there
is good reason for this—there are just more important things to be done,
or people have found workarounds that have been good enough. It is very
common for answers to the why questions to become more subtle and com-
plex as the work proceeds. Hence it is best to regard problem structuring as
something that goes on throughout a project.
Experienced modelers know that they sometimes only have a real appre-
ciation of the problem they are tackling when the work is complete. It was
this realization that led Pidd and Woolley (1980) to conclude that this form of
problem structuring is characterized by four features:
With this in mind, the third informal question asks when and concerns
the time dimension. Typical examples might be: “Is this a once-off problem
or one that recurs?” or “Has this been a problem for some time but only
recently become important enough for action?” or “When will the model be
needed?” or “When will the changes to the systems need to be implemented
and properly working?” The first two relate to the earlier why questions
and the latter two give some idea of the resources that will be needed to
do the work and of the level of detail that can be achieved in the model. If
the model needs to be built and tested in a couple of weeks, it is unlikely to
include much detail.
The fourth informal question asks how. The first common example asks:
“How am I going to model this?” referring to the technical approach that
may be needed. The second common example asks: “How did all of this start
to emerge?” Clearly this and the other five “honest working men” are close
relatives or friends, and in this form it relates closely to the who and when
questions. But it also relates to the what question in facing up to how things
are done at the moment or how people might envisage things to operate in
the future. This depends both on the analyst’s reflection and deliberation
and also on the opinions of the people who are interviewed at this stage of
the work.
Fifth, we can ask the where questions. Often these are less important when
taken at face value, for the location of the system of interest may be obvious.
However, even this should not be taken for granted. Location can be very
important now that instantaneous electronic communication around the
world is available at low cost. Tasks that once had to be located in one par-
ticular place may now be located elsewhere in the world. Examples include
Figure 9.2
An overview of soft systems methodology. (Adapted from Checkland, P.B. and Holwell, S., Systems Modelling: Theory and Practice, John Wiley & Sons Ltd., Chichester, UK, 2004. Used with permission.) The figure connects a perceived real-world problem situation, models of selected concepts of purposeful activity from the perspective of declared worldviews, a "comparison" (structured debate about change), the seeking of accommodations enabling action to be taken, and action to improve.
• Boundaries: Some things are inside the system; others are not and
constitute the environment of the system. Note, though, that the
boundary may not be obvious. For example, in a call center, is the
location from which someone calls to be part of the model?
• Components: There is more than a single element within the bound-
ary. A boundary that contains nothing is not a system and nor is a
boundary that contains a single element.
• Internal organization: The elements are organized in some way or
other and are not just chaotic aggregations.
• Behavior: The system is recognized as such because it displays behavior that stems from the interaction of its components; that is, this behavior does not arise from the individual components alone.
• Openness: The system boundary is permeable in both directions and there is communication and interaction across the boundary.
This view of a human activity system, for which Checkland prefers the
term holon, is somewhat wider than the classic engineering view of a sys-
tem as something designed to achieve a purpose in that it incorporates
the idea of human activity and human intent, recognizing that these are
crucial to success. The stacked rectangles in Figure 9.2, labeled as “mod-
els of selected concepts of purposeful activity from the perspective of
declared worldviews,” do not imply that such models, or human activity
systems, actually exist or even could exist. These are conceptualizations
that serve to illustrate how things might ideally exist, and the idea is to
understand what action might be taken, by those involved, to improve
things.
The large cloud represents a perceived, real-world problem situation. In
many SSM studies, this is the starting point of the work and this is likely
to be the case if the SSM is used as a prelude to detailed modeling, pos-
sibly using simulation. The term real-world problem situation is carefully
chosen. The word perceived is used because a study always begins with
a recognition that something needs to be done; that is, some situation is
unsatisfactory now or a system needs to be designed or reconfigured for
the future. Since there are often different stakeholders (including the client
and analyst), the perceptions of those people matter and different stake-
holders may perceive things rather differently. However, SSM is not pri-
marily intended for philosophical use, but for the world of action and in
which something must be done. Hence, this is a real-world problem that
needs to be tackled.
In this chapter, the main focus is the use of PSMs in conceptual modeling.
That is, Type I problem structuring (Figure 9.1), which is a prelude to more
formal mathematical or computer representations of a system of interest.
Hence, it focuses on the role of SSM in understanding how stakeholders view
Making Sure You Tackle the Right Problem 243
[Figure: a 2×2 grid with axes of increasing power and increasing interest; labeled quadrants include the crowd and the context setters, with notes such as "monitor" and "keep informed".]
Figure 9.3
A power-interest grid.
during and after the project. It should be clear that the players are crucial,
since they have high power and interest. However, other stakeholders must not be ignored; a stakeholder analysis is always profitable and need not take long.
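The grid's four categories can be captured in a small sketch. The numeric scores and the 0.5 cut-off below are illustrative assumptions, not part of the chapter; the stakeholder names and their quadrants follow the case study discussed in this section.

```python
def classify(power: float, interest: float, threshold: float = 0.5) -> str:
    """Place a stakeholder in one quadrant of a power-interest grid.

    Scores are assumed to lie in [0, 1]; the 0.5 cut-off is an
    illustrative assumption.
    """
    if power >= threshold:
        return "players" if interest >= threshold else "context setters"
    return "subjects" if interest >= threshold else "crowd"

# Illustrative scores for the CaRC stakeholders described in the text.
stakeholders = {
    "admin branch": (0.2, 0.3),          # crowd
    "police authority": (0.8, 0.3),      # context setter
    "members of the public": (0.1, 0.8), # subjects
    "CaRC operators": (0.2, 0.9),        # subjects
    "senior officers": (0.9, 0.9),       # players
}
grid = {name: classify(p, i) for name, (p, i) in stakeholders.items()}
```

The point of such a sketch is only to make the classification explicit and repeatable; the judgment about each stakeholder's power and interest remains a matter for the analyst.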
Following investigation of the problem situation, Figure 9.2 shows that
an SSM study requires the construction of models of purposeful activity
from the declared worldviews. Two aspects of this merit discussion here.
First, it is important to realize what is meant by a model in SSM, since this
is not the same as a simulation model. A model in SSM is something that
captures the essential activity needed in an idealized implementation of
the system of interest. These are usually developed from root definitions, a concept discussed later in this chapter. Second, note the reference to declared worldviews. The aim of the social and political analysis is
to understand the different worldviews of the people and groups involved
in the problem situation. SSM takes for granted that there may be different
worldviews—that is, people may legitimately disagree about the ends and
means of a study. The different viewpoints are teased out and represented
in root definitions.
stakeholders may have of the study itself. The main stakeholders in this
study were these:
• The admin branch of the police force who had asked for help from
the simulation team. The admin branch is best regarded as the
crowd in terms of their power and interest, since though they set up
the study, the outcome does not directly affect them, and they have
limited ability to change things, except through other people.
• The police authority, which is a governance structure that, in the UK,
has responsibility for ensuring that the police force is accountable
to the government and to the population. In terms of the power-
interest grid, the police authority is best regarded as a context setter,
since it is accountable for expenditure and performance yet has no
detailed interest in the working of the CaRCs.
• Members of the public clearly have a great interest in the perfor-
mance of the CaRCs, but have no real direct power to do anything
about them and they are best regarded as subjects whose interests
need to be protected.
• The operators who worked in the CaRCs, answering calls and deciding what resources were needed to resolve a situation. These are also subjects, since they have a great interest in working conditions but little direct power to affect the outcome.
• The senior officers who are responsible for the operation and per-
formance of the CaRCs, who are best regarded as the players, since
they do have power to change things as well as having very major
interests in those changes.
Thus, seen in these terms, the CaRCs are a system that takes calls from the
public and provides an appropriate and timely response for the benefit of
the public who see such a response as necessary. The CaRC is run by the
police force using staff and officers who operate within defined budgets
using available technology.
As a slight contrast, discussions with the senior officers who manage the
CaRCs may lead to a root definition something like the following.
• Customers: it is possible that the managers of the CaRCs might see the
police force itself as the customer, since the CaRCs are part of respon-
sive policing, in which appropriate resources should be deployed to
incidents in a timely manner. This does not mean that these officers
would ignore the needs of the public, but they may have different
customers in mind.
• Actors: it seems likely that managers would regard the staff and
officers of the CaRCs as the principal actors.
• Transformation: as mentioned in the discussion of customers, the
transformation might be to turn information from the public into
responsive policing.
• Weltanschauung: in the light of the previous elements, a worldview
that makes sense is that the police force must engage in responsive
policing.
• Ownership: since the CaRCs are funded through the police budget, it
is clear that the owner is the police force itself.
• Environmental constraints: the CaRCs must operate within defined
budgets, using available technology and responding in such a way
as to provide an appropriate level of service.
Thus, in these terms, the CaRCs are needed to support responsive policing
and are organized so as to provide a good responsive service, operated by
staff and officers within budget and technology constraints and owned by
the police force.
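The CATWOE elements above lend themselves to a simple record structure. The sketch below is illustrative only; the field values paraphrase the managers' root definition given in the text, and the class itself is not part of SSM.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CATWOE:
    """One stakeholder group's root-definition elements."""
    customers: str
    actors: str
    transformation: str
    weltanschauung: str
    ownership: str
    environmental_constraints: str

# The senior officers' (managers') view, paraphrased from the text.
managers_view = CATWOE(
    customers="the police force itself, as part of responsive policing",
    actors="the staff and officers of the CaRCs",
    transformation="turn information from the public into responsive policing",
    weltanschauung="the police force must engage in responsive policing",
    ownership="the police force, which funds the CaRCs through its budget",
    environmental_constraints="defined budgets, available technology, "
                              "and an appropriate level of service",
)
```

Recording each stakeholder group's CATWOE in the same structure makes the legitimate differences between worldviews easy to lay side by side.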
Acknowledgments
This chapter is based on an advanced tutorial delivered at the 2007 Winter
Simulation Conference and published as Making sure you tackle the right
problem: Linking hard and soft methods in simulation practice. In Proceedings
of the 2007 Winter Simulation Conference, ed. S.G. Henderson, B. Biller,
M.-H. Hsieh, J. Shortle, J.D. Tew, and R.R. Barton, 195–204. 9–12 December,
Washington, DC.
References
Baldwin, L.P., T. Eldabi, and R.J. Paul. 2004. Simulation in healthcare management: A soft approach (MAPIU). Simulation Modelling Practice and Theory 12: 541–557.
Checkland, P.B. 1981. Systems Thinking, Systems Practice. Chichester: John Wiley & Sons Ltd.
Checkland, P.B. 1999. Systems Thinking, Systems Practice: Includes a 30-Year Retrospective. Chichester: John Wiley & Sons Ltd.
Checkland, P.B., and J. Poulter. 2006. Learning for Action: A Short Definitive Account of Soft Systems Methodology, and Its Use for Practitioners, Teachers and Students. Chichester: John Wiley & Sons Ltd.
Checkland, P.B., and J. Scholes. 1999. Soft Systems Methodology in Action, 2nd edition. Chichester: John Wiley & Sons Ltd.
Checkland, P.B., and S. Holwell. 2004. "Classic" OR and "soft" OR: An asymmetric complementarity. In Systems Modelling: Theory and Practice, ed. M. Pidd. Chichester: John Wiley & Sons Ltd.
Conklin, J. 2002. Dialog mapping. http://cognexus.org/index.htm (accessed March 2010).
Den Hengst, M., G.-J. de Vreede, and R. Maghnouji. 2007. Using soft OR principles for collaborative simulation: A case study in the Dutch airline industry. Journal of the Operational Research Society 58: 669–682.
Eden, C., and F. Ackermann. 1998. Making Strategy: The Journey of Strategic Management. London: Sage Publications.
Gunal, M.M., S. Onggo, and M. Pidd. 2007. Improving police response using simulation. Journal of the Operational Research Society 59: 171–181.
Harrell, C.R., B.K. Ghosh, and R.O. Bowden. 2004. Simulation Using ProModel, 2nd edition. New York: McGraw-Hill Professional.
Howick, S. 2003. Using system dynamics to analyze disruption and delay in complex projects for litigation: Can the modeling purposes be met? Journal of the Operational Research Society 54: 222–229.
Kelton, W.D., R.P. Sadowski, and D.T. Sturrock. 2004. Simulation with Arena. New York: McGraw-Hill.
Kolb, D.A. 1983. Problem management: Learning from experience. In The Executive Mind, ed. S. Srivastva. San Francisco, CA: Jossey-Bass.
Kotiadis, K. 2007. Using soft systems methodology to determine the simulation study objectives. Journal of Simulation 1: 215–222.
Kotiadis, K., and J. Mingers. 2006. Combining PSMs with hard OR methods: The philosophical and practical challenges. Journal of the Operational Research Society 57: 856–867.
Law, A.M. 2006. Simulation Modeling and Analysis, 4th edition. New York: McGraw-Hill.
Lee, J., and K.C. Lai. 1991. What's in design rationale? Human–Computer Interaction 6 (3–4): 251–280.
Lehany, B., and R.J. Paul. 1996. The use of SSM in the development of a simulation of out-patients at Watford General Hospital. Journal of the Operational Research Society 47: 864–870.
Lewis, P.A.W., and E.J. Orav. 1989. Simulation Methodology for Statisticians, Operations Analysts, and Engineers, Volume 1. Pacific Grove, CA: Wadsworth & Brooks/Cole.
Lubart, T.I. 1994. Creativity. In Thinking and Problem Solving, 2nd edition, ed. R.J. Sternberg. London: Academic Press.
Mingers, J., and A. Gill. 1997. Multimethodology. Chichester: John Wiley & Sons Ltd.
Mingers, J., and S. Taylor. 1992. The use of soft systems methodology in practice. Journal of the Operational Research Society 43 (4): 321–332.
Paul, R., and B. Lehany. 1996. Soft modeling approaches to simulation model specifications. In Proceedings of the 1996 Winter Simulation Conference, 8–11 December, Hotel Del Coronado, Coronado, CA. Baltimore: Association for Computing Machinery.
Pidd, M. 2004. Computer Simulation in Management Science, 5th edition. Chichester: John Wiley & Sons Ltd.
Pidd, M. 2010. Tools for Thinking: Modelling in Management Science, 3rd edition. Chichester: John Wiley & Sons Ltd.
Pidd, M., ed. 2004. Systems Modelling: Theory and Practice. Chichester: John Wiley & Sons Ltd.
Pidd, M., and R.N. Woolley. 1980. A pilot study of problem structuring. Journal of the Operational Research Society 31: 1063–1069.
Rittel, H.W.J., and M.M. Webber. 1973. Dilemmas in a general theory of planning. Policy Sciences 4: 155–169.
Robinson, S. 1994. Successful Simulation: A Practical Approach to Simulation Projects. London: McGraw-Hill.
Robinson, S. 2004. Simulation: The Practice of Model Development and Use. Chichester, UK: John Wiley & Sons Ltd.
Robinson, S. 2008. Conceptual modelling for simulation part I: Definition and requirements. Journal of the Operational Research Society 59 (3): 278–290.
Rosenhead, J.V. 1989. Rational Analysis for a Problematic World: Problem Structuring Methods for Complexity, Uncertainty and Conflict. Chichester: John Wiley & Sons Ltd.
Rosenhead, J.V., and J. Mingers (eds.). 2001. Rational Analysis for a Problematic World Revisited. Chichester: John Wiley & Sons Ltd.
Sachdeva, R., T. Williams, and J. Quigley. 2007. Mixing methodologies to enhance the implementation of healthcare operational research. Journal of the Operational Research Society 58: 159–167.
Shaw, D., A. Franco, and M. Westcombe (eds.). 2007. Special issue: Problem structuring methods II. Journal of the Operational Research Society 58: 545–700.
Simon, H.A. 1972. Theories of bounded rationality. In H.A. Simon (1982), Models of Bounded Rationality: Behavioural Economics and Business Organization. Cambridge, MA: MIT Press.
Simon, H.A. 1976. From substantive to procedural rationality. In H.A. Simon (1982), Models of Bounded Rationality: Behavioural Economics and Business Organization. Cambridge, MA: MIT Press.
Wilson, B. 1990. Systems: Concepts, Methodologies, and Applications, 2nd edition. Chichester: John Wiley & Sons Ltd.
10
Using Soft Systems Methodology in
Conceptual Modeling: A Case Study
in Intermediate Health Care
Kathy Kotiadis
Contents
10.1 Introduction............................................................................................... 256
10.1.1 The Conceptual Modeling Processes....................................... 256
10.1.2 The Use of SSM in Knowledge Acquisition and
Abstraction................................................................................... 258
10.2 The Case Study: Intermediate Health Care.......................................... 259
10.2.1 A Brief Look at SSM in General................................................ 260
10.2.1.1 Rich Pictures............................................................... 261
10.2.1.2 Analyses One, Two, and Three................................ 261
10.2.1.3 The Purposeful Activity Model............................... 262
10.2.2 Applying SSM to the IC Health System................................... 263
10.2.2.1 Knowledge Acquisition Using Analyses One,
Two, and Three........................................................... 263
10.2.2.2 Knowledge Acquisition Using Rich Pictures........ 265
10.2.2.3 Abstraction Using CATWOE, Root Definition,
3 Es, and PAM............................................................... 266
10.2.2.4 Determining the Simulation Study Objectives..... 268
10.3 Using SSM to Determine the Simulation Objectives........................... 269
10.3.1 Can SSM be Adapted?................................................................ 270
10.3.2 What are the Benefits of using SSM?....................................... 272
10.4 Summary.................................................................................................... 273
Acknowledgments............................................................................................... 274
References.............................................................................................................. 275
10.1 Introduction
This chapter explores how soft systems methodology (SSM), a problem
structuring method, can be used to develop an understanding of the prob-
lem situation and determine the simulation study objectives based on the
experience gained in a real life simulation study in health care. Developing
an understanding of the problem situation and determining the model-
ing objectives are two of the four phases of Robinson’s (2004) conceptual
modeling. Robinson (2004) divides conceptual modeling into the following
phases:
Robinson’s (2004) phases can also be used to describe the output of the
conceptual modeling processes. There are two main processes involved in
conceptual modeling: knowledge acquisition and abstraction (Kotiadis and
Robinson 2008). This chapter explores how SSM contributes to these pro-
cesses in order to get an understanding of the problematic situation and
determine the study objectives.
This chapter is divided into four main sections. The introduction, with its remaining subsections, forms the first section. The first section explores the
conceptual modeling processes that the case study contributes toward and
reflects on the appropriateness of SSM to conceptual modeling by looking at
what others have said when they used SSM in their simulation study. The sec-
ond section explores the case study, which is broken down into subsections.
These subsections include a description of the problem and the motivation
for using SSM, a brief description of SSM in general and how it was applied
to this case study in terms of knowledge elicitation and abstraction. However, special attention is paid to how SSM was conducted and extended to determine the simulation study objectives, and the section concludes with a set of
guidelines. The third section provides a discussion about the opportunity to
further adapt SSM to determine the study objectives and the benefits of the
proposed approach. The final section provides a summary of the chapter.
[Figure: in the problem domain, abstraction (with simplifications) produces the conceptual model; model design and coding then yield the computer model in the model domain.]
Figure 10.1
Artifacts of conceptual modeling. (From Kotiadis, K. and Robinson, S., Proceedings of the 2008 Winter Simulation Conference, Institute of Electrical and Electronic Engineers, Inc., Miami, FL, 2008. With permission.)
main tools used to assist the modeler in knowledge acquisition (finding out
about the problem situation): drawing rich pictures and analyses one, two,
and three. Pidd (2007) provides an in-depth discussion on the latter. The
CATWOE, root definition, and performance measures (three Es) are tools
that contribute toward the process of constructing the PAM, which is an SSM
model. The PAM contributes toward the process of abstraction. The remain-
der of this subsection provides some reflections on these tools.
[Figure: the purposeful activity model. Activities supporting the transformation process: 1. Determine local needs; 2. Assess current provision; 3. Know the funding; 4. Decide and set up services if necessary; 5. Determine SEEC for each service; 6. Produce information about IC services; 7. Assess each patient; 8. Determine if they meet SEEC for service; 9. Refer patient to services; 10. Assign patient to service waiting list; 11. Provide service. Performance measurement activities: A. Define measures of performance; B. Determine monitoring activities; C. Monitor capacity and resources; D. Monitor funding; E. Monitor DoH guidelines; F. Monitor SEEC; G. Monitor if people admitted to IC services pass SEEC; H. Determine if processes need improving; I. Determine if new services are needed; J. Determine if more resources are needed; K. Monitor referral; L. Determine if SEEC need modifications; M. Take action.]
Figure 10.2
The purposeful activity model of the IC system. (From Kotiadis, K., Journal of Simulation, 1, 215–222, 2007. With permission.)
Using Soft Systems Methodology in Conceptual Modeling 265
the analysis of the roles we were able to understand what action some key
stakeholders were prepared to undertake within the system. This also meant
knowing who to persuade to organize meetings with other stakeholders to
obtain information. Undertaking the social analysis enabled a better under-
standing of the stakeholders within the system and enabled the modeler to
align herself to the culture of this health and social care system and through
interaction gain access to information and insights. Also some behavior was
directly or indirectly included in the simulation model.
their operational functions. SSM was able to deal with this because it enabled
action research to take place. More specifically, the process of action research
comprises enquiry, diagnosis, action planning, action/intervention, evaluation, and learning (Hart and Bond 1995). The stakeholders, who were aware
of this lack of system integration, were interested in action research, which
means they were willing to take action to improve the system during the
study and not just as a result of the findings of the study. Therefore, it was
sensible to aim at building a simulation model of a future integrated system
rather than of the current situation and use SSM to determine what was con-
sidered by the stakeholders to be a desirable and feasible future system.
In terms of this research the three measures of performance, or the 3 Es, are efficacy, efficiency, and effectiveness.
The criteria, Ethicality and Elegance (Section 10.2.1), were not defined in this
study, as they were not considered to add to the evaluation of our system.
The measures of performance were broken down into a number of activi-
ties and incorporated in the monitoring activities part of the PAM of the IC
system (activities A–M in Figure 10.2). The reader should note that this is not
common practice in stand-alone SSM studies.
The Root Definition, CATWOE, and the 3 Es guided the construction of
the activity model that aims to show the transformation process T (activities
1–11 in Figure 10.2). The process of building the activity model “consists of
assembling the verbs describing the activities that would have to be there
in the system named in the RD and structuring them according to logical
dependencies” (Checkland, 2001, p. 77). Checkland (1999a, p. A26, Figure. A6)
provides a set of guidelines in constructing the PAM. Information supplied
by the stakeholders or observed during the first SSM stage (finding out about
the problem situation), was used to determine the activities essential to the
SSM transformation process.
The core PAM of the IC system can be seen in the top part of Figure 10.2 (activities 1–11) and the right-hand section of this (activities 7–11) is closest to the
computer model. Therefore the PAM is a simplification of the system descrip-
tion, but also with further abstraction provides the model description. More
specifically, the PAM includes all the main IC operational activities (simpli-
fication/reduction in the level of detail in the conceptual model from that of
the system description), but also describes in a focused way what actually
takes place in the computer model (reduction in the scope of the conceptual model from that of the system description). Therefore, through this level of abstraction, the conceptual model can be derived from this simple representation of the IC system.
study objectives as they relate to the process of building the PMM. The
remaining PMM activities were logically grouped into the following ques-
tions that formed the simulation study objectives:
If SSM had not been deployed, no doubt the first objective regarding capac-
ity would have been derived, since it is a typical simulation study question
(Davies and Roderick 1998, Jun et al. 1999). In this study, capacity is examined by including all the places/beds available for each IC service in the whole system simulation model and monitoring queues.
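That capacity check can be sketched in a few lines: each service holds a fixed number of places/beds, and the waiting list is what the model monitors. The service name and bed count below are invented for illustration, not the study's actual configuration.

```python
from collections import deque

class ICService:
    """An IC service with a fixed number of places/beds and a waiting list."""

    def __init__(self, name, beds):
        self.name = name
        self.beds = beds
        self.occupied = 0
        self.waiting = deque()  # queue length is the capacity signal

    def admit(self, patient):
        """Admit the patient if a bed is free, otherwise queue them."""
        if self.occupied < self.beds:
            self.occupied += 1
            return True
        self.waiting.append(patient)
        return False

    def discharge(self):
        """Free a bed and, if anyone is waiting, admit the next patient."""
        self.occupied -= 1
        if self.waiting:
            self.waiting.popleft()
            self.occupied += 1

# Illustrative run: two beds, three patients, so one patient must wait.
rehab = ICService("residential rehab", beds=2)
for patient in ["p1", "p2", "p3"]:
    rehab.admit(patient)
```

In a full discrete-event model this logic sits behind arrival and length-of-stay distributions; the sketch only shows where the queue statistics come from.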
The second and third questions, however, are more original and can be
attributed to the use of SSM. To answer the second question the model emu-
lates the decision making process using a rule base that determines the
service each patient should be sent to, based on a large number of patient
characteristics (attributes). At the end of a run one can see if a particular
patient or group of patients entered the service that they had actually entered
in real life. The model is able to answer the third question by determining
whether there is an appropriate service for each level of IC need by exam-
ining if there are gaps in the services mix. For example, it can be used to
examine the effects of adding a new service or removing an existing service
(Kotiadis 2006).
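The rule base described above can be sketched as an ordered list of (predicate, service) pairs over patient attributes. The attribute names and rules below are invented for illustration; the study's rule base drew on a much larger set of patient characteristics.

```python
# Ordered rule base: the first predicate that matches a patient's
# attributes determines the service. Rules and attributes are illustrative.
rules = [
    (lambda p: p["needs_nursing"] and not p["lives_alone"], "home-based rehab"),
    (lambda p: p["needs_nursing"], "residential rehab"),
    (lambda p: p["mobility"] == "low", "day hospital"),
]

def route(patient, default="no suitable service"):
    """Return the IC service this patient should be sent to."""
    for predicate, service in rules:
        if predicate(patient):
            return service
    # No rule fired: a possible gap in the services mix.
    return default

chosen = route({"needs_nursing": True, "lives_alone": True, "mobility": "low"})
```

Comparing `route`'s output with the service each patient actually entered supports the second question, while patients falling through to the default expose gaps in the services mix, which bears on the third.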
author’s knowledge, is not a step in stand alone SSM studies but could be a
useful extension when used in simulation studies as there is a clear opportu-
nity to map out the operational activities when done with stakeholders that
may not easily distinguish the difference between the two. Another ben-
efit of including the strategic level activities is that these provide the system
owners perspective, which can lead to a PMM and subsequently objectives
more aligned to their needs.
In this study, greater emphasis was placed on the performance criteria
than is usually placed in other SSM studies (Checkland and Scholes 1999,
Wilson 2001, Winter 2006). The core PAM (strategic and operational) activi-
ties were also linked with an extended model of performance criteria that
are referred to as the PMM, which is to this author’s knowledge again unre-
ported as a step in the SSM literature. This stage was largely internalized
and emerged after a series of discussions and reflections when there was a
reasonable correspondence between the two, i.e., the PMM activities would
satisfy the needs of the strategic and operational activities. Based mainly
on the experience gained from this study, it is proposed that the following
generic guidelines can be used by others to construct the PMM and arrive at
the simulation study objectives:
1. Find out how the performance criteria developed relate to the real-
life situation. Reflect on how each activity, supporting the transfor-
mation process in the PAM, can be evaluated.
2. Break down the performance criteria into specific monitoring activi-
ties, which are activities that involve observing and recording infor-
mation. Where possible these activities should be in the format
“monitor….”
3. Consider what action might be taken based on each of the moni-
toring activities or their combinations. Where possible record this
action in the format “determine if….”
4. Where possible, try to list the monitoring activities first and then link them according to logical dependencies to the "determine if" activities. Similar to the core PAM, circle each activity in the PMM and, if helpful, assign each a letter of the alphabet (rather than a number, as used in the core PAM).
5. Consider each of the performance measurement activities and deter-
mine which can be evaluated in a simulation model. These selected
performance measurement activities can form simulation study
objectives, but if necessary group these activities and relabel them to
form simulation study objectives.
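Steps 2–4 above amount to linking lettered "monitor…" activities to the "determine if…" activities they feed. A minimal sketch, with activity labels taken from Figure 10.2 but dependency links that are illustrative assumptions:

```python
# "Monitor ..." activities (step 2) and the "determine if ..." activities
# (step 3) they feed, lettered as in the PMM (step 4). The links shown
# are illustrative, not the study's exact dependencies.
monitoring = {
    "C": "monitor capacity and resources",
    "D": "monitor funding",
    "F": "monitor SEEC",
}
determine = {
    "J": ("determine if more resources are needed", ["C", "D"]),
    "L": ("determine if SEEC need modifications", ["F"]),
}

def inputs_for(letter):
    """List the monitoring activities a 'determine if' activity depends on."""
    _, deps = determine[letter]
    return [monitoring[d] for d in deps]

# Step 5: the activities that can be evaluated in a simulation model
# become candidate study objectives (possibly grouped and relabeled).
candidate_objectives = [description for description, _ in determine.values()]
```

Laying the PMM out this way makes the logical dependencies explicit and makes step 5, selecting and grouping activities into study objectives, a simple filtering exercise.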
However, the PMM and process to derive the PMM can be further adapted
or modified in order to better support the particular needs of a simulation
study.
10.4 Summary
This chapter set out to explain how SSM, a problem structuring method, can
be used to develop an understanding of the problem situation and determine
the simulation study objectives based on the experience gained in a real life
simulation study in health care. Developing an understanding of the prob-
lem situation is the output of the conceptual modeling process of knowledge
elicitation and determining the simulation study objectives is part of the con-
ceptual modeling process of abstraction that leads to the computer model.
Figure 10.3 depicts the relationship in general of the SSM tools and some of
the artifacts of conceptual modeling discussed in this chapter. The following
paragraph provides an explanation of Figure 10.3.
The SSM tools that can be of use to the simulation modeler are (a) rich
picture drawing, (b) analyses one, two, and three, (c) CATWOE and root
definition(s), (d) the performance measures (3 Es), and (e) the PAM. The SSM
tools a, b, and c can help structure the process of knowledge acquisition and
the output of these tools can be produced with the stakeholders and provides an agreeable view of the problematic situation. In Figure 10.3 there are arrows going in both directions to represent the output of the process being deposited in the stakeholders' minds as well as in the SSM tools. The PMM is another
tool, not listed as an SSM tool, as it is an extension to the usual SSM approach
(3 Es) that provides the opportunity to abstract the simulation study objec-
tives and to some extent the inputs and outputs. In this chapter guidelines are
provided on how to go about constructing the PMM. In addition to the PMM
extension, the PAM is also constructed in a particular way; the PAM lists
activities that are broken down to strategic and operational level activities.
The computer model content can be abstracted at a high level from the opera-
tional level activities. However the objectives, inputs and outputs, derived
from the PMM, also inform the construction of the computer model.
Using SSM in conceptual modeling provides structure and transparency
to the process of knowledge acquisition and abstraction and paves the way
for stakeholder participation and, ultimately, acceptance of the simulation study findings and implementation of the recommendations.
[Figure: the SSM tools (rich pictures; analyses one, two, and three; CATWOE and root definition; the core PAM with its strategic level and operational level activities; the 3 Es of efficacy, effectiveness, and efficiency; and the performance measurement model) support knowledge elicitation from the real world and abstraction toward the simulation model content, the simulation study objectives, inputs and outputs, and the computer model.]
Figure 10.3
Using SSM in conceptual modeling.
Acknowledgments
This chapter is reproduced, with major editing, from: Kotiadis, K. 2007.
Using soft systems methodology to determine the simulation study objectives. Journal of Simulation 1: 215–222. © 2007 Operational Research Society
Ltd. Reproduced with permission of Palgrave Macmillan.
Some sections of this chapter are based on the following:
Kotiadis, K., and S. Robinson. 2008. Conceptual modeling: Knowledge
acquisition and model abstraction. In Proceedings of the 2008 Winter Simulation
Conference, ed. S.J. Mason, R. Hill, L. Moench, O. Rose, T. Jefferson, and
J.W. Fowler, 951–958. Miami, FL: Institute of Electrical and Electronic
Engineers, Inc.
References
Baldwin, L.P., T. Eldabi, and R.J. Paul. 2004. Simulation in healthcare management:
A soft approach (MAPIU). Simulation Modelling Practice and Theory 12 (7–8):
541–557.
Büyükdamgaci, G. 2003. Process of organizational problem definition: How to
evaluate and how to improve. Omega 31: 327–338.
Contents
11.1 Introduction............................................................................................... 279
11.2 The Systems Modeling Language (SysML)........................................... 282
11.2.1 A Brief History............................................................................ 282
11.2.2 The SysML Diagrams and Concepts........................................ 283
11.2.3 Reported Strengths and Weaknesses of SysML..................... 285
11.2.4 SysML Tools................................................................................. 288
11.3 SysML and Simulation............................................................................. 289
11.3.1 Conceptual Modeling with SysML: An Example................... 291
11.4 Challenges for SysML-Based Conceptual Modeling........................... 301
11.5 Conclusions................................................................................................ 303
References..............................................................................................................304
11.1 Introduction
First published in September 2007, the Systems Modeling Language (SysML)
is a recent language for systems modeling that has a growing community of
users and advocates in the field of systems engineering. This chapter gives
an overview of SysML, identifies why it is of interest to the simulation com-
munity, and evaluates the feasibility of using this standard to support the
conceptual modeling step in the discrete-event simulation (DES) process.
It has been recognized for many years that conceptual modeling is an
extremely important phase of the simulation process. For example, Oren
(1981) noted that conceptual modeling affects all subsequent phases of a
simulation project and comprehensive conceptual models are required for
robust and successful simulation models. Costly development time can be
greatly reduced by clearly defining the goals and content of a model during
the precoding phase of a simulation study. Despite this acknowledged impor-
tance, relatively little research has previously been carried out on the topic of
conceptual modeling, as highlighted earlier in Chapter 1 of this book.
280 Conceptual Modeling for Discrete-Event Simulation
Over the years a number of frameworks for conceptual modeling have been
suggested with the aim of bringing standardization to what is perceived by
many to be more of an art than a science (Kotiadis 2007). One of the earli-
est frameworks was put forward by Shannon (1975), which consists of four
steps. The first step is to specify the model’s purpose, the second is to specify
the model’s components, the third is to specify the parameters and variables
associated with the components, and the fourth is to specify the relation-
ships between the components, parameters and variables. While these steps
are still valid today, alternative frameworks have been presented in the years
since then that refine the steps and/or focus on different aspects of the con-
ceptual model. Examples include Nance (1994), Pace (1999), and van der Zee
and van der Vorst (2005), among others. The most recent modeling frame-
work, presented by Robinson (2008), draws more attention to the goal of the
model by encouraging the modeler to explicitly identify the model outputs
and inputs prior to considering content. The steps of this framework are as
follows: understanding the problem situation; determining the modeling
and general project objectives; identifying the model outputs; identifying
the model inputs; and determining the model content, together with any
assumptions and simplifications.

Formal modeling methods that have been applied to representing simulation
models include:

• Petri Nets (Vojnar 1997, Ou-Yang and Shieh 1999, Balduzzi et al. 2001,
Koriem 2000, Shih and Leung 1997, Evans 1988)
• Activity Cycle Diagrams (ACDs) (Richter and Marz 2000, Shi 1997)
• Discrete Event Specification System (DEVS) (Rosenblit et al. 1990,
Thomasma and Ulgen 1988)
An Evaluation of SysML to Support Simulation Modeling 281
ABCmod (Birta and Arbez 2007) and Simulation Activity Diagrams (Ryan
and Heavey 2007) are model-based approaches that have been developed
specifically for conceptual modeling. These techniques are discussed in
detail in Chapter 6 and Chapter 12, respectively.
A further standard that has been shown to be applicable to conceptual
modeling is Business Process Modeling Notation (BPMN) (Onggo 2009).
This is a graphical modeling approach that is used for specifying business
processes. The notation has been designed to coordinate the sequence of
processes and the messages that flow between the different process partici-
pants in a related set of activities (http://www.bpmn.org/). While this may
address much of the information concerned in a conceptual model, there
are further details that BPMN is not equipped to deal with. For instance
while comparing BPMN with UML, Perry (2006) notes that BPMN is unable
to model the structural view or the requirements of the process. The struc-
tural aspect of a system can be of importance in a conceptual model when
it places a constraint on the system (e.g., logistical implications of relative
location of processing stations). When the requirements and model purpose
information cannot be held within the modeling environment then supple-
mentary documentation is required, thereby diminishing the advantage of a
model-centric approach.
The parallels between simulation modeling and software development are
discussed by Nance and Arthur (2006) and Arthur and Nance (2007) who
investigate the use of software requirements engineering (SRE) techniques
in simulation modeling. A notable contrast in the level of standardization in
the two fields is identified with the authors observing that simulation meth-
odologies differ widely in formality and rigor. One tool that has promoted
standardization in the software development world is the Unified Modeling
Language (UML). This standard is used by software developers to define
and communicate information, particularly in the requirements gathering
and design stages of the software development process. Based on UML, a
new graphical modeling standard called the SysML has been developed to
support systems engineering. This chapter explores how this new standard
could be used to support the conceptual modeling phase of a simulation
project. First, an overview of the SysML standard is given with a brief history
and an introduction to the primary constructs. The use of SysML within the
simulation context is then discussed with reference to other research in the
field and case study work that has been undertaken by the authors. Insights
Figure 11.1
SysML diagram taxonomy. (From the SysML v1.1 Specification, fig. 4.4 or A.1. Object Management Group, Inc. (C) OMG, 2009. Reprinted with permission.) [Diagram not reproduced: it shows the SysML diagram types (activity, sequence, state machine, use case, block definition, internal block, package, and parametric diagrams), each categorized as the same as UML 2, modified from UML 2, or a new diagram type.]
Looking at the field of simulation, each of these stated benefits is relevant
and desirable. For instance, the type of comprehensive standardization
described by Willard (2007) is noted to be lacking in simulation, particularly
in the area of conceptual modeling for DES where, as discussed previously,
standardization is nonexistent. The use of SysML could help overcome inter-
operability issues in relation to simulation modeling tools that are currently
unsupported in this respect. Additionally, it is acknowledged that consider-
able amounts of insightful information are unearthed during the concep-
tual modeling phase of a simulation study. The current difficulty is that this
information is often lost after the project concludes. If the information is held
in reusable sections of a graphical model, it will be available not only for
future simulation projects but also any other type of process improvement
initiative and continual management. This potential use of SysML is dis-
cussed further in Section 11.3.
An identified weakness of SysML is that it gives too much freedom to
the modeler. It is therefore possible for important information to be rep-
resented in an obscure manner in a SysML diagram, which, as Herzog
and Pandikow (2005) point out, could be “easily overlooked by a human
reader.” UML models are true reflections of the systems they represent
since the concepts that are used to develop UML models are also used to
develop the software systems they represent. On the other hand, SysML
models are just abstractions of the systems they represent. Any abstraction
is open to interpretation; the freedom offered to SysML users means that
these abstractions can be developed in various ways thus creating even
further opportunity for confusion and miscommunication, particularly in
larger scale systems.
The TSS Team’s training and engineering results show that a design
team can learn and use SysML in a reasonable amount of time (five stan-
dard workweeks) without significant training or experience. In this way,
Project Quicklook has dispelled the notion that organizations cannot
use model-based systems engineering with SysML because the start-up
resource cost is too high.
While this may be an acceptable training period for people such as systems
engineers who will use the standard regularly, it could represent a significant
commitment for other types of user. For instance, in a simulation modeling context,
it may be feasible for simulation modelers to learn the standard as they
could continually use it. However, if the modeler is eliciting information
from a manufacturing engineer during a project, it may not be feasible
for the engineer to learn the standard in order to communicate with the
simulation modeler by commenting on, adding to or modifying SysML
diagrams. Noting this, some effort has already been made by SysML tool
vendors to support communication with stakeholders who are unfamil-
iar with SysML, through alternative user interfaces. This, however, entails
additional development effort on the part of the modeler; this topic is dis-
cussed further in the next section.
One of SysML’s greatest strengths is the level of interest that it has received.
The number of industrial partners who have contributed to its development
illustrates practitioner recognition of the need for SysML and an eagerness
to make standardized graphical modeling notation freely available. These
partners include significant industry leaders such as IBM, Lockheed Martin
Corporation, BAE Systems, the Boeing Company, Deere & Company, and
NASA. As noted earlier, tool vendors also showed their support for the lan-
guage by contributing to the development process. Herzog and Pandikow
(2005) highlight that the number of tool vendors involved in drafting the
SysML specification shows that “there exists not only a pull from the market,
but also a push from the vendor community.” This type of across-the-board
support significantly strengthens the likelihood of widespread adoption of
SysML.
These are all the tools that fully support SysML at the time of writing, but
the standard is still relatively new and more tools are bound to emerge. For
instance, Visual Paradigm is a UML design tool that has begun to include
some SysML capability but to date has only implemented the Requirements
Diagram. Additionally, there is a SysML template available for Microsoft
Visio; however, this allows SysML diagrams to be drawn rather than
SysML models to be created. The distinction here is that SysML diagrams
in Visio will lack the integration and interactivity that a truly model-centric
approach benefits from.
As can be seen above, the majority of tools for developing SysML models
are computer-aided software engineering (CASE) tools in which the UML
capabilities have been enhanced to accommodate the SysML specification.
This has significant benefits in terms of providing features that have been
tried and tested in a UML context for longer than the SysML standard has
even existed. The drawback of the history of these tools is that they are
primarily designed for the software market rather than for the broader user
base that SysML is intended for and therefore may not meet the expectations
of all users.
The features that these tools offer include integration with SRE
tools to aid population of Requirement Diagrams and repository-based
architectures to support model sharing and multiuser collaboration over
local networks. These tools offer crosscutting functionality, which allows
relationships between elements on different diagrams to be defined thereby
tying the model together. These relationships allow for information that has
been defined in one diagram to be automatically added to another and help
maintain consistency throughout the model. This is the type of integration
that is difficult if not impossible to achieve when using the SysML template
in Microsoft Visio.
In terms of improving the communication of a SysML diagram, many tools
allow users to upload and use images that are more representative of the
model elements. They also allow users to toggle on and off model block com-
partments to hide and show information (e.g., block parameters) as required
and prevent information overload. A number of tools allow for the execu-
tion of model diagrams. This functionality allows the user to step through
the sequence of activities and see how various activities are triggered. This
is useful for initial validation of the model and for subsequently commu-
nicating the details, and demonstrating the functionality, of the system to
others. To further aid communication, particularly to stakeholders who are
unfamiliar with SysML or not from a technical background, a number of
tools allow for alternative graphical and often interactive views to be used.
These non-SysML views are either developed within the tool itself or the tool
is designed to interact with external GUI prototyping tools, as in the cases of
the graphical panels in Rhapsody and the integration of Artisan Studio with
Altia Design, respectively.
Recognizing that document-based reports are still often required, many
SysML tools allow for templates to be automatically populated with infor-
mation about model requirements, content and relationships. Information
about the model can also be exported in XMI format (XMI [XML Metadata
Interchange] is an OMG standard) to allow for communication between dif-
ferent SysML tools. The next section discusses current and potential use of
SysML in the simulation modeling domain.
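As an illustration of consuming such an interchange file programmatically, the sketch below parses a deliberately simplified, XMI-style document with Python's standard library; real tool exports are far richer and their element names vary by vendor, so both the fragment and the element names here are assumptions for illustration only.

```python
import xml.etree.ElementTree as ET

# Deliberately simplified, XMI-style fragment; real exports are much
# richer, so treat this purely as an illustration of the idea.
XMI_FRAGMENT = """<xmi:XMI xmlns:xmi="http://schema.omg.org/spec/XMI/2.1">
  <packagedElement xmi:type="uml:Class" name="Server"/>
  <packagedElement xmi:type="uml:Class" name="Chassis"/>
</xmi:XMI>"""

root = ET.fromstring(XMI_FRAGMENT)
# Collect the names of the model elements in the interchange file.
block_names = [el.attrib["name"] for el in root.iter("packagedElement")]
print(block_names)  # ['Server', 'Chassis']
```

Being able to read the interchange format in this way is what makes the model contents available to downstream tools rather than locked inside one vendor's repository.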
between the domain-specific language and SysML. This approach has been
demonstrated with the Modelica language (Mattsson et al. 1998) and Paredis
and Johnson (2008) report that further work is being conducted to illustrate
the approach with Matlab Simulink and eM-Plant. This approach of course
is only useful if the system is already described in a formal model. The next
section describes the authors’ experience of building a SysML model from
the perspective of gathering information and defining a conceptual model
for a simulation study.
Figure 11.2
Activity diagram of overall production process. [Diagram not reproduced: visible activity labels include "ProduceMF" and "Inspect".]
The overall production process was captured in an Activity Diagram (Figure 11.2). Although largely illegible at the scale shown here,
one can see that the general format is not dissimilar to that generally used
in process mapping. Decision nodes, for instance, are represented by a
familiar diamond-shaped symbol, and the alternative routes are labelled with the
associated decision criteria (see the lower zoomed section of Figure 11.2). A
particularly useful attribute of SysML is the ability to segregate informa-
tion and avoid overly complex diagrams. The activity diagram in Figure 11.2
describes the series of activities that occur from when a customer places a
purchase order until the product is shipped. While this diagram does not
describe in detail every step of what is a complex production process, it does
allow the reader to very quickly get an understanding of what is involved in
fulfilling a sales order. Additional, more detailed information is made avail-
able to the reader in further activity diagrams. Here we give the example
of the “Kit Components” activity, which yields the “Component Kit” (see
the upper zoomed section of Figure 11.2) that is used in the subsequent
assembly activity. The “rake” symbol on the upper right-hand corner of the
“Kit Components” activity indicates that there is a more elaborate diagram
associated with this model element. The diagram associated with the “Kit
Components” activity is shown here in Figure 11.3.
SysML activity diagrams can also show how physical objects in the system
interact with the represented activity. For instance in Figure 11.3, it can be
seen that the “Kit Components” activity takes in components (high value and
Figure 11.3
Activity diagram of the kitting process. [Diagram not reproduced: it shows the "Kit components" activity taking in the <<block>> High-value components and <<block>> Kit container, with the action "Pick kit items from secure storage area and place in container".]
standard) and a kit container and outputs a component kit. Even with this
straightforward kitting activity, the advantages of compartmentalizing data
in order to prevent information overload can be seen. A further advantage of
this modular approach is that once an activity has been described it is avail-
able to be referenced from any diagram or reused in another model. Indeed
this idea of information reuse and cross referencing is an integral aspect of
SysML modeling and helps ensure consistency across model diagrams.
An example of this type of integration is shown here with the
"StartConveyor" step in Figure 11.3, although, since this is essentially a
document-based description of a model-centric approach, the integration is
not apparent. This "StartConveyor" step is in fact an event that had been
previously defined in a State Machine Diagram for the conveyor (see
Figure 11.4) and was added to this diagram. It is the same piece of
information shown in two different diagrams; if it is changed in one, it also
changes in the other. The State Machine Diagram in Figure 11.4 shows how
this event causes the conveyor to go from an “idle” state to an “operating”
state. This diagram also shows that this can only happen when the conveyor
is in an active state (i.e., switched on). The Activity Diagram in Figure 11.3 on
the other hand shows when this event occurs in the Kitting Process.
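The conveyor behavior just described can be written out as a small state machine. The state and event names below come from Figure 11.4; the Python class itself is only an illustrative sketch, not part of the chapter's SysML model.

```python
class Conveyor:
    """Sketch of the conveyor state machine in Figure 11.4: StartConveyor
    moves the conveyor from "idle" to "operating", but only while the
    conveyor is in the Active superstate (i.e., switched on)."""

    def __init__(self):
        self.active = False   # Inactive / Active superstate
        self.state = "idle"   # substate within Active: "idle" or "operating"

    def switch_on(self):      # SwitchOnConveyor event
        self.active = True
        self.state = "idle"

    def switch_off(self):     # SwitchOffConveyor event
        self.active = False

    def start(self):          # StartConveyor event
        if self.active:       # ignored while the conveyor is Inactive
            self.state = "operating"

    def stop(self):           # StopConveyor event
        if self.active:
            self.state = "idle"
```

A StartConveyor event received while the conveyor is Inactive is simply ignored here; whether it should instead be queued or treated as an error is exactly the kind of detail a conceptual model has to settle.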
The level of freedom offered when using SysML does bring certain dif-
ficulties to the modeling exercise, as it is often unclear which is the best way
to represent information. Even in the relatively simple case of describing the
generic types of component required by the manufacturing activity to fulfil
a sales order, it is possible to create initially confusing diagrams such as the
Block Diagram in Figure 11.5.
This block diagram shows two types of relationship: the connector with the
white triangle represents the Generalization relationship, which can be read
as “is a type of” (e.g., a Chassis is a type of bulky component) and the connec-
tor with the black diamond represents the Composition relationship, which
Figure 11.4
State machine diagram for the conveyor. [Diagram not reproduced: it shows the Inactive and Active states, the Idle and Operating substates of Active, and the SwitchOnConveyor, SwitchOffConveyor, StartConveyor, and StopConveyor events.]
Figure 11.5
Block diagram for components. [Diagram not reproduced: it shows <<block>> elements including Bulky components, High-volume components, High-value components, Standard components, Component kit, Chassis, Cover, Labels, Motherboard, Hard drive, Screws, Door, Ram, Cable, Drawer assembly, Server, Auxiliary material, Cardboard, Packaging, Plastic wrap, Shipment, Pallet, User manual, and Warranty information, connected by generalization and composition relationships with multiplicities.]
can be read as “is made up of” (e.g., the Server product is made up of one
or more drawer assemblies, one or more covers and possibly a door). Once
the reader is aware of the meaning of the connectors, the diagram becomes
less confusing; however, considering that this diagram only shows generic
component types, there is potential for it to grow out of control when specific
part numbers are included.
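In programming terms the two connector types map onto familiar constructs: generalization corresponds to subclassing, and composition to a has-a relationship with multiplicities. The sketch below recreates a small fragment of Figure 11.5; the mapping to Python classes is our own illustrative choice, not something SysML prescribes.

```python
from dataclasses import dataclass
from typing import List, Optional

class BulkyComponent:            # generalization target
    pass

class Chassis(BulkyComponent):   # "a Chassis is a type of bulky component"
    pass

class Door(BulkyComponent):
    pass

@dataclass
class Server:
    # Composition ("is made up of"), with the Figure 11.5 multiplicities
    # recorded alongside each part.
    drawer_assemblies: List[object]   # 1..*
    covers: List[object]              # 1..*
    door: Optional[Door] = None       # 0..1

    def multiplicities_ok(self) -> bool:
        # Check the 1..* lower bounds; the optional 0..1 door needs no check.
        return bool(self.drawer_assemblies) and bool(self.covers)
```

Writing the multiplicities down as an executable check is one way the "underlying logic" of a block diagram, and not just its picture, can be reused.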
However, once relationships have been entered into a SysML model, it is
possible to generate alternative views of the same information. For exam-
ple, the majority of the content of the Internal Block Diagram shown in
Figure 11.6 and the Block Diagram in Figure 11.7 was generated automati-
cally based on the composition relationships defined in Figure 11.5 (the
Figure 11.6
Internal block diagram for server product. [Diagram not reproduced: it shows <<part>> elements including S : Screws, C : Cable, CH : Chassis, HD : Hard drive, and L : Labels, with multiplicities.]
Figure 11.7
Block diagram for server product. [Diagram not reproduced: it shows the Cover, Door, Drawer assembly, Labels, Ram, Motherboard, Cable, Chassis, Hard drive, and Screws blocks with multiplicities, together with a <<constraint>> PalletSizeConstraint containing {ProductVolume = ProductHeight * ProductWidth * ProductDepth}.]
physical connections in Figure 11.6 were added manually and the con-
straint information in Figure 11.7 was based on a Parametric Diagram in
the model) and both of these show more clearly what parts a Server product
is made up of. The important point here is that a SysML model is more than
a collection of pictures; it is also the underlying logic that is represented
in the pictures. This logic is valuable knowledge that, once entered, can be
reused and centrally maintained.
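The constraint logic captured in the model can likewise be reused directly. The volume formula below is the parametric constraint shown in Figure 11.7; the pallet-capacity check wrapped around it is a hypothetical illustration of how a PalletSizeConstraint might apply it, with the capacity value assumed rather than taken from the chapter's model.

```python
def product_volume(height: float, width: float, depth: float) -> float:
    # From Figure 11.7:
    # ProductVolume = ProductHeight * ProductWidth * ProductDepth
    return height * width * depth

def fits_on_pallet(height: float, width: float, depth: float,
                   pallet_capacity: float) -> bool:
    # Hypothetical PalletSizeConstraint-style check: the capacity
    # parameter is our own illustrative assumption.
    return product_volume(height, width, depth) <= pallet_capacity
```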
The diagrams presented so far relate primarily to material and material
flows. When examining a system during a simulation project, the flow of
information through the system is also of importance as this often contains
much of the system logic that must be incorporated into the simulation
model. Figure 11.8 shows a sequence diagram that illustrates the connection
between the company receiving a sales order from a customer for a new
product and then placing a purchase order with a vendor for components.
The diagram shows that the information is passed through a number of inte-
grated software systems during this process. While these software systems
may not need to be explicitly represented in a simulation model of the sys-
tem, the logic of determining the production requirement and comparing
this with the available material will need to be captured.
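As a minimal sketch of the ordering logic that would need to be captured, the function below compares a production requirement against available stock and orders only the shortfall. All names and quantities are hypothetical; in reality this logic is distributed across the software systems shown in Figure 11.8.

```python
def shortfall_to_order(bom: dict, stock: dict) -> dict:
    """Compare the production requirement (BOM quantities) with available
    stock and return only the shortfall that needs a purchase order."""
    return {part: qty - stock.get(part, 0)
            for part, qty in bom.items()
            if qty > stock.get(part, 0)}

# Example: 10 chassis needed but only 4 on hand -> order 6; RAM is covered.
order = shortfall_to_order({"chassis": 10, "ram": 20},
                           {"chassis": 4, "ram": 32})
print(order)  # {'chassis': 6}
```

Even a sketch this small makes the later point concrete: it silently assumes stock levels never rise between the check and the order, which is exactly the complexity the chapter notes the model chose not to include.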
A feature of this model-centric approach that is of particular benefit in the
simulation context is the ability to clone a diagram and edit it. This new dia-
gram can be used to describe proposed design alternatives or experimental
settings, and can be readily compared to the original. Furthermore, SysML
modeling is suited to the concepts of simplifications and assumptions as
used in conceptual modeling for simulation. Take for example the case
shown earlier of the Kitting Process. If it is decided that the objectives of
the model require this process to be modeled in detail, then the lower-level
information in Figure 11.3 is used, if not, then it can be represented as a
single activity as in Figure 11.2. The power of having this information in
SysML is that the detailed information is retained and available for future
projects. By scoping diagrams to packages (akin to organizing files into fold-
ers in Explorer) it is possible to signify which information forms part of the
current conceptual model and which falls outside the scope of the current
project. The issue with this approach is that if, for instance, the SysML model
of a production plant is to be maintained as a central information repository
that can be drawn upon for future simulation projects, then there will be
various conceptual models with overlapping information and alternative
perspectives on the system, and organizing packages in this manner
will quickly become complex. The distinguishing difference between using
SysML for conceptual modeling and for systems engineering is that con-
ceptual modeling requires both an understanding of the real system and
an understanding of the simulated system (with the assumptions and sim-
plifications that make it different from the real system) whereas systems
engineering deals only with the real system. Conceptual modeling there-
fore has additional perspectives and interpretations to deal with that make
Figure 11.8
Sequence diagram of information flow in ordering process. [Diagram not reproduced: lifelines include Customer, Vendor, :Web config, :Accounts SW, :Prod spec SW, :Shop floor SW, and :SAP, with messages including Configure product, Place purchase order, Request PO number, Translate to BOM, Pass BOM, and Check current stock levels, and a reference to the place purchase order activity.]
maintaining a SysML model more difficult. For example, the simple process
shown in Figure 11.8 does not indicate what might happen if stock levels
increased (perhaps due to components being reclaimed from scrapped prod-
ucts) after a purchase order is generated. While the real system occasionally
experiences this problem the model did not attempt to include this complex-
ity. This and other issues are discussed in Section 11.4.
The original document that this SysML model was based on was in itself a
useful resource for the company as it provided an end-to-end description of
the process and it uncovered a number of interactions between processing
areas that affected efficiency. One simple example of this was the realization
that when certain data queries were run in one section of the production
facility, it delayed the production order release process at the beginning
of the line as it slowed down the process of identifying orders for which
all components were available. This beneficial effect is widely reported in
simulation studies; for instance, Robinson (2004) suggests that possibly 50% of
the benefit is obtained just from the development of the conceptual model;
“The modeller needs to develop a thorough understanding of the operations
system in order to design an appropriate model. In doing so, he/she asks
questions and seeks for information that often have not previously been con-
sidered. In this case, the requirement to design a simulation model becomes
a framework for system investigation that is extremely useful in its own
right.” Shannon (1975) even suggests that in some cases the development
of a conceptual model may lead to the identification of a suitable solution
and eliminate the need for further simulation analysis. Considering that
some SysML tools allow for the diagrams to be “stepped through” (as dis-
cussed in Section 11.2.4), the use of SysML provides an even greater chance
of resolving issues during the conceptual modeling phase as inconsistent
information will be highlighted and cause and effect relationships can be
explored. This SysML feature could have benefits for the validation of con-
ceptual models and ensuring that a correct understanding of the system has
been achieved.
To be successfully utilized in conceptual modeling, SysML needs to be
compatible with the frameworks for conceptual modeling as discussed in
Section 11.1. Taking the most recent framework (Robinson 2008), the first step
of understanding the problem situation can occur much more quickly if a SysML
model of the system under investigation (SUI) already exists. Even if one
does not exist, the process of developing a model would provide a struc-
tured means of gaining useful insight into the situation. The second step of
determining the modeling and general project objectives is not directly solved by
the use of SysML. This is an important step as it determines the direction
of the simulation study and simply having a SysML model will not ensure
that the correct model objectives have been identified. Techniques like the
soft systems methodology (SSM), as discussed by Kotiadis (2006, 2007),
can be used to elicit the objectives from stakeholders. The purposeful
activity models (PAMs) generated during SSM can be represented within a
SysML model using activity diagrams and the determined objectives can
be recorded in SysML requirements diagrams. This would support central
retention of information and would allow SysML relationships includ-
ing “trace” and “satisfy” to be used to identify how model objectives are
addressed in the simulation. As a simple example, an activity that records the
number of products exiting a particular processing station could be traced
to an objective to determine system throughput thus explaining why this
activity is required in the simulation. It would be advantageous if SysML
tools were able to highlight clearly any model elements that did not trace
back to a requirement or similarly requirements that were not satisfied in the
model. This did not appear to be possible in any of the SysML tools reviewed
to date. The next steps of identifying the model outputs and identifying the
model inputs are readily supported by SysML object parameters and parametric
diagrams. The final step of determining the model content, while identifying
any assumptions and simplifications, can be successfully recorded in a struc-
tured manner in SysML as discussed earlier in regard to the kitting proc-
ess example. By having a formal graphical model of the SUI, it is suggested
that the difficult task of deciding on which assumptions and simplifications
to make will be eased with natural selections and associated implications
becoming more clearly recognisable.
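The tool capability wished for above (flagging model elements with no trace to a requirement, and requirements that nothing satisfies) is simple to express in code. The helper below is hypothetical, with all element and requirement names invented for illustration; as the text notes, the SysML tools reviewed did not offer this check.

```python
def trace_gaps(requirements, elements, trace_links):
    """Given (element, requirement) trace/satisfy links, report model
    elements that trace to no requirement and requirements satisfied
    by no element. Hypothetical helper for illustration."""
    traced = {elem for elem, _req in trace_links}
    satisfied = {req for _elem, req in trace_links}
    untraced_elements = [e for e in elements if e not in traced]
    unsatisfied_requirements = [r for r in requirements if r not in satisfied]
    return untraced_elements, unsatisfied_requirements

# Illustrative names only: the throughput-counting activity traces to the
# throughput objective; the second activity traces to nothing.
gaps = trace_gaps(["determine throughput"],
                  ["count exiting products", "record product colour"],
                  [("count exiting products", "determine throughput")])
print(gaps)  # (['record product colour'], [])
```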
On review of existing research in the area and the experiences gained while
using the language, it is proposed that there is potential for using SysML as a
common thread that could underlie all the activities undertaken in a simula-
tion study from the initial requirements gathering phase through defining
the conceptual model and on to the development of the simulation model
(see Figure 11.9). Once information has been captured during one activity it
would be available in a usable format for the next.
In this section it has been shown that there is merit in using SysML in the
conceptual modeling process. It is capable of representing the types of
information typically handled in this simulation phase, such as information
and material flows, and moreover it brings structure and standardization that can
greatly help knowledge transfer and reuse. There are nonetheless a number
of challenges for the adoption of SysML as the standard conceptual mod-
eling format. These are discussed in the next section.
Figure 11.9
SysML: A common foundation. [Diagram not reproduced: it depicts OMG SysML™ as a common foundation.]
11.5 Conclusions
SysML is an accepted standard with a growing user base. The UML herit-
age and OMG adoption of the standard reflect the level of sophistication in
the language while the development effort invested by practitioners and tool
vendors alike shows the level of interest in the standard from both sides of
the market.
SysML provides a standard that has potential to be used in DES. Such a
standard would provide great benefit as it would provide a common lan-
guage, which has been noted to be lacking in this domain.
All of the advantages that have been put forward supporting the use of
UML for conceptual modeling still stand, and indeed most are strengthened
by the fact that SysML has a much broader scope than the software-specific
UML.
References
Al-Ahmari, A.M.A., and K. Ridgway. 1999. An integrated modelling method to sup-
port manufacturing systems analysis and design. Computers in industry, 38(3),
225–238.
Alexander, D., S. Sadeghian, T. Saltysiak, and S. Sekhavat. 2007. Quicklook Final Report.
Fairfax, Virginia: George Mason University.
Arthur, J. D., and R. E. Nance. 2007. Investigating the use of software require-
ments engineering techniques in simulation modelling. Journal of simulation
1:159–174.
Balduzzi, F., A. Giua, and C. Seatzu. 2001. Modelling and simulation of manufacturing
systems with first order hybrid Petri Nets. International journal of production
research 39(2):255–282.
Barjis, J., and B. Shishkov. 2001. UML based business systems modeling and simula-
tion. In Proceedings of 4th International Eurosim Congress, Delft, The Netherlands,
June 26–29.
Birta, L. G., and G. Arbez. 2007. Modelling and Simulation: Exploring Dynamic System
Behaviour. London: Springer-Verlag.
Booch, G. 1999. UML in action: Introduction. Communications of the ACM
42(10):26–28.
Evans, J. B. 1988. Structures of Discrete Event Simulation: An Introduction to the
Engagement Strategy. Chichester: Ellis Horwood.
Friedenthal, S., A. Moore, and R. Steiner. 2008a. OMG Systems Modeling Language
Tutorial (revision b), INCOSE. http://www.omgsysml.org/ (accessed 4 Feb.
2009).
Friedenthal, S., A. Moore, and R. Steiner. 2008b. A practical guide to SysML: The Systems
Modeling Language. Burlington, MA: Morgan Kaufmann Publishers.
Herzog, E., and A. Pandikow. 2005. SysML: An Assessment. In Proceedings of 15th
INCOSE International Symposium, Rochester, NY, July 10–15.
Holt, J., and S. Perry. 2008. SysML for Systems Engineering. London: The IET.
Hu, Z., and S. M. Shatz. 2004. Mapping UML diagrams to a Petri Net notation for sys-
tem simulation. In Proceedings of International Conference on Software Engineering
and Knowledge Engineering (SEKE), Banff, Alberta, Canada, ed. F. Maurer and
G. Ruhe, June 20–24.
Huang, E., K. S. Kwon, and L. McGinnis. 2008. Toward on-demand wafer fab simu-
lation using formal structure and behavior models. In Proceedings of the 2008
Winter Simulation Conference, Miami, FL, ed. S. J. Mason, R. R. Hill, L. Mönch,
O. Rose, T. Jefferson, and J. W. Fowler, 2341–2349. Piscataway, NJ: IEEE.
Huang, E., R. Ramamurthy, and L. F. McGinnis. 2007. System and simulation mod-
eling using SysML. In Proceedings of the 2007 Winter Simulation Conference,
Washington DC, ed. S. G. Henderson, B. Biller, M.-H. Hsieh, J. Shortle, J. D. Tew,
and R. R. Barton, 796–803. Piscataway, NJ: IEEE.
Jeong, K. Y. 2000. Conceptual frame for development of optimized simulation-based
scheduling systems. Expert systems with applications 18:299–306.
Johnson, T. A., C. J. J. Paredis, and R. M. Burkhart. 2008. Integrating Models and
Simulations of Continuous Dynamics into SysML. In Proceedings of the 6th
International Modelica Conference, Bielefeld, Germany, 135–145. Modelica
Association.
Kobryn, C. 1999. UML 2001: A standardization odyssey. Communications of the ACM
42(10):29–37.
Koriem, S. M. 2000. A fuzzy Petri Net tool for modeling and verification of knowl-
edge-based systems. The computer journal 43(3):206–223.
Kotiadis, K. 2006. Extracting a conceptual model for a complex integrated system in
health care. In Proceedings of the Operational Research Society Simulation Workshop
2006 (SW06), Birmingham, UK, ed. S. B. J. Garnett, S. Robinson and S. Taylor,
235–245. Operational Research Society.
Kotiadis, K. 2007. Using soft systems methodology to determine the simulation study
objectives. Journal of simulation 1(3):215–222.
Robinson, S. 2004. Simulation: The Practice of Model Development and Use. Chichester:
Wiley.
Robinson, S. 2008. Conceptual modelling for simulation part II: A framework for
conceptual modelling. Journal of the operational research society 59:291–304.
Rosenblit, J., J. Hu, T. G. Kim, and B. P. Zeigler. 1990. Knowledge Based Design and
Simulation Environment (KBDSE): Foundation concepts and implementation.
Journal of the operational research society 41(6):475–489.
Ryan, J., and C. Heavey. 2007. Development of a process modelling tool for simula-
tion. Journal of simulation 1(3):203–213.
Schürr, A. 1994. Specification of graph translators with triple graph grammars. In
Proceedings of the 20th International Workshop on Graph-Theoretic Concepts in
Computer Science, 151–163. Springer-Verlag.
Shannon, R. E. 1975. Systems Simulation: The Art and Science. Englewood Cliffs, NJ:
Prentice-Hall.
Shi, J. 1997. A conceptual activity cycle-based simulation modeling method. In
Proceedings of Winter Simulation Conference, Atlanta, GA, ed. A. Andradottir,
K. J. Healy, D. H. Withers, and B. L. Nelson, 1127–1133. Piscataway, NJ: IEEE.
Shih, M. H., and H. K. C. Leung. 1997. Management Petri Net: A modelling tool for man-
agement systems. International journal of production research 35(6):1665–1680.
Thomasma, T., and O. M. Ulgen. 1988. Hierarchical, modular simulation modelling
in icon-based simulation program generators for manufacturing. In Proceedings
of Winter Simulation Conference, San Diego, CA, Dec 12–14, ed. P. L. Haigh,
J. C. Comfort, and M. A. Abrams, 254–262. Piscataway, NJ: IEEE.
van der Zee, D. J., and J. G. A. J. van der Vorst. 2005. A modeling framework for
supply chain simulation: Opportunities for improved decision making. Decision
sciences 36:65–95.
Van Rensburg, A., and N. Zwemstra. 1995. Implementing IDEF techniques as simula-
tion modelling specifications. Computer industrial engineering 29:467–471.
Vojnar, T. 1997. Various kinds of Petri Nets in simulation and modelling. In Proceedings
of Conference on Modelling and Simulation of Systems MOSIS’97, Ostrava, Czech
Republic, 227–232.
Weilkiens, T. 2006. Systems Engineering with SysML/UML. Burlington, MA: The MK/
OMG Press.
Willard, B. 2007. UML for systems engineering. Computer standards & interfaces
29:69–81.
12
Development of a Process
Modeling Tool for Simulation
Contents
12.1 Introduction...............................................................................................309
12.2 Overview of Process Modeling Methods.............................................. 310
12.3 Simulation Activity Diagrams (SAD)..................................................... 315
12.3.1 SAD Action List........................................................................... 315
12.3.2 SAD Modeling Primitives.......................................................... 316
12.3.3 SAD Model Structure................................................................. 319
12.3.4 Elaboration of SAD Models....................................................... 321
12.4 Evaluation of SAD: Case Study............................................................... 321
12.4.1 System Description..................................................................... 322
12.4.2 SAD Model................................................................................... 324
12.4.3 IDEF3 Model................................................................................ 329
12.4.4 Differentiation of the SAD Technique from
Currently Available Techniques............................................... 331
12.4.5 Discussion.................................................................................... 332
12.5 Conclusions................................................................................................334
Acknowledgments...............................................................................................334
References.............................................................................................................. 335
12.1 Introduction
Conceptual modeling, or the precoding phase of any simulation project, is
crucial to the success of such a project (Wang and Brooks 2006). The problem
definition, requirements gathering, and conceptual model formulation pro-
cess is often a time-consuming one, as is the process of collecting detailed
information on the operation of a system (Balci 1986). However, little sub-
stantive research on the subject has been reported in the literature (Brooks
2006, Robinson 2004).
Hollocks (2001) recognized that such premodeling and postexperimentation
phases of a simulation project together represent as much effort as, or more
than, the modeling section of such projects and that software support for
1995, Scheer 1998, and INCOME Process Designer 2003), and a number have
been used to support simulation (i.e., Van Rensburg and Zwemstra 1995 and
Al-Ahmari and Ridgway 1999). To ascertain the level of support given by
current process modeling tools a selective review of a number of methods/
tools was carried out (Ryan and Heavey 2006). The criteria used to conduct
this review were as follows:
The review focused on methods/tools that have been used to support simu-
lation and/or exhibit characteristics desirable in a dedicated process mod-
eling tool for simulation. The methods/tools were categorized as follows:
Formal Methods: These are methods that have a formal basis and there
are numerous software implementations of these methods. Methods
reviewed under this category were: (i) Petri Nets (Ratzer et al.
2003); (ii) Discrete Event System Specification (DEVS) (Zeigler 1984);
(iii) Activity Cycle Diagrams (ACD) (Tocher 1963), and (iv) Event
Driven Process Chains (EDPC) (Tardieu et al. 1983).
Descriptive Methods: These methods have little formal basis and are
primarily software implementations. Methods reviewed here were:
(i) IDEF (NIST 1993); (ii) Integrated Enterprise Modeling (IEM)
(Mertins et al. 1997); (iii) Role Activity Diagrams (RAD) (Ould 1995);
(iv) GRAI Method (Doumeingts 1985), and (v) UML State Charts and
Activity Diagrams (Muller 1997).
In summary, this review concluded that Petri Nets are to a certain extent
capable of visually representing and communicating discrete-event-system
logic. However, such Petri Net models cannot visually account for complex
branching logic or hierarchically decompose complex models into submodels,
and as a result they become very cumbersome as system complexity increases.
The technique also does not account for a user's viewpoint,
resources, information flows, or a means of elaborating the graphical model
in a textual manner. However the technique is capable of accurately repre-
senting state transitions and the activities associated with the execution of
such flows.
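The firing rule behind Petri Nets — tokens marking states, transitions consuming tokens from input places and producing them in output places — can be sketched in a few lines. This is an illustrative "token game" only, not taken from the chapter; the place and transition names are invented:

```python
# Minimal Petri net token game: a transition is enabled when every input
# place holds at least one token; firing moves tokens from inputs to outputs.
# Illustrative sketch -- place/transition names are invented.

class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# A machine takes a waiting part and a free-machine token, and produces a
# finished part while releasing the machine.
net = PetriNet({"part_waiting": 2, "machine_idle": 1, "done": 0})
net.add_transition("process", ["part_waiting", "machine_idle"],
                   ["done", "machine_idle"])
net.fire("process")
print(net.marking)   # {'part_waiting': 1, 'machine_idle': 1, 'done': 1}
```

Even this tiny example hints at the review's criticism: the state transitions are captured precisely, but branching logic, hierarchy, and the user's viewpoint have no natural place in the formalism.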
The DEVS formalism is capable of accurately representing the various
changes in state of a discrete-event system along with being somewhat capa-
ble of representing resources, activities, and branching within its mathemati-
cal representation. However, the formalism is not visual in nature and does
not account for the user’s interactions with the system, information flows, or
a user-friendly elaboration language.
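The DEVS notions mentioned here — state, a time-advance function, and internal/external transition functions — can be illustrated with a minimal, hypothetical atomic model of a single server. This is a sketch under stated assumptions; a real DEVS implementation also needs a simulator/coordinator, ports, and an output function:

```python
# Atomic-DEVS-style sketch of a server. Illustrative only: the class and
# attribute names are invented, and the coordinator machinery is omitted.

class Server:
    def __init__(self, service_time):
        self.phase = "idle"
        self.queue = 0
        self.service_time = service_time

    def time_advance(self):
        # How long the model remains in the current phase before an
        # internal transition; "idle" waits indefinitely for an input.
        return self.service_time if self.phase == "busy" else float("inf")

    def external(self, event):
        # External transition: a job arrives; serve it or queue it.
        if self.phase == "idle":
            self.phase = "busy"
        else:
            self.queue += 1

    def internal(self):
        # Internal transition: service completes; pull the next job if any.
        if self.queue > 0:
            self.queue -= 1          # stay busy with the next job
        else:
            self.phase = "idle"

srv = Server(service_time=5.0)
srv.external("job-1")
srv.external("job-2")
print(srv.phase, srv.queue, srv.time_advance())   # busy 1 5.0
srv.internal()
srv.internal()
print(srv.phase, srv.queue)                       # idle 0
```

The sketch also illustrates the review's point: the state changes are mathematically precise, but nothing in the representation is visual or user-facing.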
ACDs are again somewhat capable of visually representing and communi-
cating certain discrete-event-system logic. They achieve this by means of
modeling state transitions and the activities that cause such state transitions
to be executed. However, the technique fails to account for a user's perspective,
Figure 12.1
SAD actions (Activity 1, spanning Event 1 and Event 2).
are executed in a time-ordered sequence from top to bottom and from left to
right, ensuring that each criterion is satisfied. Only when each action has been
executed can the full activity be executed and the system transition success-
fully to state 2. Taking this approach, a SAD becomes a graphical representation
of the various events in a simulation model. Each event is represented
in a SAD by an activity. This activity is then further graphically represented
by an action list. This will be further developed in the following section by
the introduction of a series of modeling primitives that may be used in the
detailing of such an activity.
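The action-list idea — an activity fires only once every action in its ordered list has executed — can be sketched as follows. The activity and action names below are invented for illustration and are not part of the SAD definition:

```python
# Sketch of an action list: an activity (one simulation event) can execute,
# and the system can transition state, only after every action in the list
# has run in its listed (time-ordered) sequence. Names are illustrative.

class Activity:
    def __init__(self, name, actions):
        self.name = name
        self.actions = list(actions)     # ordered action list
        self.done = []

    def execute_next(self):
        # Actions execute strictly in their listed order.
        self.done.append(self.actions[len(self.done)])

    def can_transition(self):
        # The full activity executes only once every action has run.
        return len(self.done) == len(self.actions)

act = Activity("Load jig", ["collect jig", "position jig", "close furnace"])
while not act.can_transition():
    act.execute_next()
print(act.done)   # ['collect jig', 'position jig', 'close furnace']
```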
Therefore a fan out, "AND," branch in a model means that when the execution
of the model reaches the point in the process represented by such a branch,
all the elements that are immediate successors of the branch will be executed.
If a synchronous, "AND(S)," branch is used, then the execution of that branch
means that all of the immediate successor elements must begin execution
simultaneously.

Figure 12.2
AND branches (left: A fans out to B and C; right: A and B fan in to C).
Similarly, in a model where a fan in, "AND," branch is executed, all elements
that immediately precede that branch will have been executed. If a
synchronous, "AND(S)," branch is used, then, for that part of the model to
execute, all the preceding elements must end simultaneously. Thus, an
execution of the left-hand model in Figure 12.2 will consist of the execution
of element A, followed by elements B and C. Similarly, the execution of the
right-hand model in Figure 12.2 will result in the execution of element C,
preceded by the execution of elements A and B; if a synchronous, "AND(S),"
branch is used, then for there to be an execution of element C, both elements
A and B must end simultaneously. For example, the left-hand model of
Figure 12.2 could represent a disassembly operation in which element A is
broken down into two constituent parts, B and C. Similarly, the right-hand
model could represent an assembly where elements A and B are combined
to create element C.
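These "AND" branch rules can be read as simple execution functions. The element names follow Figure 12.2, but the functions themselves are only an illustrative reading of the fan-out/fan-in semantics, not part of the SAD definition:

```python
# Sketch of fan-out/fan-in "AND" branch semantics: after an AND fan-out,
# every immediate successor executes; an AND fan-in requires that every
# predecessor has executed before the successor runs.

def run_fan_out_and(predecessor, successors, trace):
    trace.append(predecessor)          # e.g., disassemble A ...
    for element in successors:         # ... then execute both B and C
        trace.append(element)

def run_fan_in_and(predecessors, successor, trace):
    for element in predecessors:       # A and B must both have executed ...
        trace.append(element)
    trace.append(successor)            # ... before C (the assembly) runs

trace = []
run_fan_out_and("A", ["B", "C"], trace)    # left-hand model of Figure 12.2
print(trace)                               # ['A', 'B', 'C']

trace2 = []
run_fan_in_and(["A", "B"], "C", trace2)    # right-hand model of Figure 12.2
print(trace2)                              # ['A', 'B', 'C']
```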
A fan out inclusive, "OR," branch in a model indicates that, in an execution
of that branch, there will be an execution of at least one of the elements
connected to the branch to the right. Similarly, a fan out exclusive, "XOR,"
branch in a model indicates that, in an execution of that branch, there will
be an instance of exactly one of the elements connected to the branch to the
right; for example, an element will either pass or fail inspection, it cannot do
both. If a synchronous inclusive, "OR(S)," branch is used, then all elements
that are executed must start simultaneously. This does not apply to exclusive,
"XOR," branches, since there can only be one element executed in an
"XOR" execution. Similarly, with a fan in inclusive "OR" branch, there will be
at least one element executed to the left of the branch. If a synchronous inclusive
"OR(S)" branch is used, then those elements that are executed, if there
is more than one, must all end simultaneously. Hence, an execution of the
model to the left in Figure 12.3 consists of an instance of the element A
followed by an instance of either B or C, or both. If the models in Figure 12.3
used "XOR" branches, then an execution of the first model could not include
an instance in which the execution of both B and C occur while an execution
Figure 12.3
OR branches (left: A fans out to B and C; right: A and B fan in to C).
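The inclusive and exclusive fan-out rules can likewise be sketched. The subset-selection logic below is an illustrative assumption (a random draw), not part of the SAD definition; only the invariants — "OR" executes at least one successor, "XOR" exactly one — reflect the text:

```python
# Sketch of inclusive ("OR") vs. exclusive ("XOR") fan-out branches.
# The random choice of which successors execute is illustrative only.

import random

def fan_out(kind, successors, rng):
    if kind == "XOR":                    # e.g., pass OR fail inspection
        return [rng.choice(successors)]
    if kind == "OR":                     # any nonempty subset may execute
        chosen = [s for s in successors if rng.random() < 0.5]
        return chosen or [rng.choice(successors)]
    raise ValueError(kind)

rng = random.Random(1)
xor_result = fan_out("XOR", ["pass", "fail"], rng)
or_result = fan_out("OR", ["B", "C"], rng)
print(len(xor_result) == 1)              # True: exactly one element executes
print(1 <= len(or_result) <= 2)          # True: at least one element executes
```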
Figure 12.4
SAD link types (entity link, activity link, information link).
Link Types: Links are the glue that connects the various elements of a
SAD model together to form complete processes. Within the SAD
technique there are three link types introduced known as entity
links, information links, and activity links. Arrows on each link
denote the direction of the flow of each representative link. The sym-
bols that represent each type are shown in Figure 12.4.
SAD Frame Element: The SAD frame element provides a mechanism for
the hierarchical structuring of detailed interactions within a dis-
crete-event system into their component elements, while also show-
ing how such elements interact within the overall discrete-event
system.
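One possible data representation of the three link types and the frame element is sketched below; the class and field names are invented for illustration, and the example elements are drawn loosely from the case study later in the chapter:

```python
# Sketch of SAD structural elements as data: links carry a type and a
# direction (source -> target); frames nest sub-elements, giving the
# hierarchical decomposition described in the text. Names are illustrative.

from dataclasses import dataclass, field

@dataclass
class Link:
    kind: str          # "entity", "information", or "activity"
    source: str
    target: str        # the arrow points from source to target

@dataclass
class Frame:
    name: str
    elements: list = field(default_factory=list)   # nested frames/activities
    links: list = field(default_factory=list)

furnace = Frame("Furnace area")
furnace.links.append(Link("entity", "Jig holding area", "Furnace"))
carburise = Frame("Carburise", elements=["Load jig", "Unload jig"])
furnace.elements.append(carburise)
print(furnace.elements[0].name)    # Carburise
```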
Fig ur e 12.5
A simple SAD example.
Table 12.1
Structured Language

USES: The supporter resource may at times make use of auxiliary resources to execute an action or actions; in other words, a supporter USES auxiliary resources.
TO: Details the action or actions that are executed by use of an auxiliary resource by a supporter resource.
AT: Specifies the locations where the action or actions are executed.
TRANSITIONS TO: Specifies the change of state of entity or information from one state to another.
A series of these actions and the associated interactions with other SAD
modeling elements make up an action list. A series of these activities in turn
makes up a sequence of transitions for a physical or information entity.
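A minimal, hypothetical reading of such a structured-language elaboration as data might group lines under the Table 12.1 keywords. The parsing approach below is illustrative only, and the elaboration content (a crane loading a jig) is invented:

```python
# Sketch: group the lines of a structured-language elaboration under the
# keyword (USES, TO, AT, TRANSITIONS TO) that precedes them. Illustrative
# only -- this parser is not part of the SAD technique as published.

KEYWORDS = ("USES", "TO", "AT", "TRANSITIONS TO")

def parse_elaboration(lines):
    """Return a dict mapping each keyword to the lines listed under it."""
    result, current = {}, None
    for line in lines:
        stripped = line.strip()
        if stripped in KEYWORDS:
            current = stripped
            result.setdefault(current, [])
        elif current is not None:
            result[current].append(stripped)
    return result

elaboration = parse_elaboration([
    "USES", "Crane",
    "TO", "Load jig",
    "AT", "Furnace",
    "TRANSITIONS TO", "Loaded jig",
])
print(elaboration["USES"])   # ['Crane']
print(elaboration["AT"])     # ['Furnace']
```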
To facilitate comparison, the SAD description is first given and then is followed
by an IDEF3 model. The case study ends with a discussion comparing SAD
with IDEF3 and other modeling approaches currently available.
Figure 12.6
Schematic of a carburising jig (honeycomb, trays, tiers).
There are two furnace operators who are required to carry out the following
prioritized operations:
furnace next becomes empty. Finally, after the jig containing the rods is car-
burised, it must be transferred immediately to the cooling tower to be cooled
under controlled conditions to ensure the required hardness is achieved by
the carburising process. After the cooling tower the operators allow the jig to
air cool until the rods are cool enough to be unloaded. The unloading opera-
tion is a manual operation, where the parts are unloaded and passed to the
next work region.
Table 12.2
Elaboration of the SAD Model
Elaboration of the Activity
Operator 1
OR
Operator 2
EITHER
The operations are outlined here in the sequence of execution to produce a part; however,
priority rules apply to the sequence of operations within the area and these priority rules are
contained in an attached document (Furnace-operation-priorities.doc)
TO
EITHER
Build a tray
There are four types of tray the details of which are contained in the attached document
(tray-types.xls)
OR
Build a tier
A tier consists of 6 trays
OR
Build a jig
A jig is made up of a maximum of four tiers and each tier is made up of a number of trays. The
number of tiers and trays used and the number of parts is dependent on the size and weight
of parts, with maximum limits on each. The details for this are contained within the following
attached documents.
(Max-Furnace-utilisation.xls)
(Round-rod-weights.xls)
(Hex-Rod-weights.xls)
While fully built jigs are preferred, parts in the holding section for longer than 8 hours may be
used on partially built jigs.
OR
Unload jig
AT
Furnace
AND
Load jig
AND
EITHER
Cool
OR
Temper
AT
Cooling tower
OR
Unload jig
AT
Cooling tower
AND
Move jig to holding area
AT
Jig holding area
12.4.3 IDEF3 Model
The IDEF3 Process Description Method provides a mechanism for collect-
ing and documenting processes. IDEF3 captures precedence and causality
relations between situations and events in a form natural to domain experts,
by providing a structured method for expressing knowledge about how a
system, process, or organization works. The resulting IDEF3 descriptions
provide a structured knowledge base for constructing analytical and design
models. These descriptions capture information about what a system actu-
ally does or will do, and also provide for the organization and expression of
different user views of the system.
There are two IDEF3 description modes, process flow and object state tran-
sition network. A process flow description captures knowledge of how things
work in an organization, e.g., the description of what happens to a part as it
flows through a sequence of manufacturing processes. The object state tran-
sition network description summarizes the allowable transitions an object
may undergo throughout a particular process. Both the process flow descrip-
tion and object state transition description contain units of information that
make up the system description. In Figure 12.9 the IDEF3 model for the car-
burising area is shown. At the highest level in this model the carburise area
is represented by a unit of behavior (UOB) named “CARB1 Carburise.” UOBs
can be used to represent a system, subsystem, or individual tasks within a
model depending on the context and level at which they are used.
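The UOB hierarchy described here can be sketched as nested records. The decomposition below mirrors the numbering of the "CARB1 Carburise" example from Figure 12.9; the class and method names are invented for illustration:

```python
# Sketch of an IDEF3 process description as nested units of behavior (UOBs):
# a UOB can represent a system, subsystem, or task depending on the level
# at which it appears. Illustrative only.

from dataclasses import dataclass, field

@dataclass
class UOB:
    label: str
    number: str
    children: list = field(default_factory=list)

    def decompose(self, label, number):
        # Decompose this UOB into a lower-level UOB and return it.
        child = UOB(label, number)
        self.children.append(child)
        return child

carburise = UOB("CARB1 Carburise", "2.3")
carburise.decompose("Carburising setting 2", "2.3.3.1")
carburise.decompose("Temper", "2.3.4.1")
print([c.label for c in carburise.children])
# ['Carburising setting 2', 'Temper']
```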
Figure 12.9
IDEF3 model of Work Region 2 (UOB "Carb 1 carburise," 2.3, decomposed into carburising settings 2–4, 2.3.3.1–2.3.3.4, and temper UOBs, 2.3.4.1–2.3.4.2).
While such similarities exist within the SAD technique, the overall model-
ing approach is radically different. The SAD technique endeavors to model
complex interactions such as those that take place within an actual detailed
simulation model of a real system. Again the SAD technique is designed to
fulfill the design objectives outlined in Section 1 of this chapter. Each of these
requirements is represented within the SAD technique. Both the physical
and informational flows within a discrete-event system are modeled at either
extremity of a SAD model as shown in Figure 12.5. Also modeled are the
resources used in the execution of the various activities associated with the
transitioning of both the physical and informational models through their
various discrete states, again represented in Figure 12.5. In achieving these
goals, the technique uses the various SAD modeling primitives to represent
the various events that are listed in a simulation event list. To also represent
more complex interactions, the SAD technique introduces the concept of an
action list, which is used to represent detailed actions that collectively can
make up any event within a simulation event list. Such a modeling approach
allows for the modeling of a modern discrete-event system and in turn a
simulation model of the same. Finally the use of a structured text-based
elaboration within the SAD technique allows for the removal of any ambi-
guities that may arise within a complex model. Such an approach increases
the user’s access to the information and knowledge that would otherwise be
lost in detailed simulation code. As a result of these modeling approaches
the SAD technique uses a set of high-level modeling primitives that are capa-
ble of representing complex discrete-event systems. The modeling technique
places a low modeling burden on the model developer while also promoting
the capture, representation and communication of detailed information in a
user-friendly manner for model users.
12.4.5 Discussion
While the SAD technique does not yet supply a full and definitive support tool
for the requirements gathering phases of a simulation project, it is felt that it
goes some way toward acting as an initial solution space.
In its current guise the SAD technique endeavors to model complex inter-
actions such as those that take place within an actual detailed simulation
model of a real system. To achieve this the modeling method uses the vari-
ous SAD modeling primitives to represent the events in a simulation model.
To also represent more complex interactions the SAD method introduces the
concept of an action list, which is used to represent detailed actions that
collectively can make up any event within a simulation model. The SAD
technique also allows for the modeling of both a physical and informational
system that may make up a discrete-event system along with interactions
between both (Ryan and Heavey 2006).
Each SAD diagram starts from the actor/supporter resources section; in
this way each SAD is developed and executed from the perspective of those
using the system or interacting with it. In this case, one or both of the
operators (the "OR" branch) can either (the "XOR" branch) carry out the
"Rope & stamp parts" action, or else, with or without the "Crane" supporter
auxiliary resource (the next "OR" branch), they can carry out any of the
following individual actions, individually (the "XOR" branch): "Build a
tray," "Build a tier," "Build a jig," "Move to jig waiting area," "Collect jig,"
"Load jig," "Carburise," "Unload jig," "Load jig," "Cool," "Temper," "Unload
jig," "Move jig to holding area," and "Dismantle jig," denoted by the yellow
actions. Some of these actions can be done either separately or in conjunction
with each other; for example, either build a tray or a tier or an entire jig, or
possibly all of these or any combination, denoted by the "OR" branch above
the "Build a tray," "Build a tier," and "Build a jig" actions. Also, a number of
tasks are always carried out in sequence with each other; for example, "Build
a jig" and "Move jig to waiting area" are both carried out in sequence, denoted
by the "AND" branch above these tasks. A number of "AND" branches are
also located between the actions and the various queues and primary
resources; these branches are used to indicate where each of the individual
actions is executed. For example, the actions "Rope & stamp parts," "Build a
tray," "Build a tier," and "Build a jig" are executed at the "Jig holding area"
queue element in this SAD model. The physical system (located at the lower
region of the SAD model) shows the change of state of the parts (entities)
within the system: having passed fully through the furnace area, the parts,
denoted by the green entity state objects, change from a "Pre anneal part" to
an "Annealed part."
The SAD technique is not a definitive solution and currently needs further
refinement, validation, and development. A number of issues still need to be
addressed. These include the incorporation of multiple modeling views,
which would allow a model developer to initially model the system
requirements as an "as is" model and from this develop a second system view,
or conceptual model. Facilitating a process whereby both models could be
developed in the same format and viewed simultaneously would, it is felt,
further enhance communication and understanding. The implementation of
a step-through facility would also, it is felt, be advantageous. It is also felt that
there is a need for the development of further techniques to support a
simulation model developer in these precoding phases of a simulation project.
It is hoped that further research will be carried out in this area with a view to
the development of such techniques. The advantages that such techniques
may offer, while difficult to predict accurately, may include a number of the
following. The development of detailed, valid, and visual process models of
complex discrete-event systems prior to the coding of simulation models may
12.5 Conclusions
The requirements gathering phase of a simulation project is important in
relation to the overall success of a simulation project. This chapter high-
lights the fact that there is inadequate support currently available for this
task. While numerous process modeling techniques are available and sev-
eral have been used to support the requirements gathering of a simulation
project, the chapter argues that the techniques available do not provide
adequate support. The chapter presents an overview of a process model-
ing technique, SAD, developed to endeavor to overcome some of the cur-
rent shortfalls highlighted. The SAD technique endeavors to model complex
interactions such as those that take place within an actual detailed simula-
tion model of a real system. To achieve this the modeling method uses the
various SAD modeling primitives to represent the events in a simulation
model. The SAD method has been evaluated on five case studies. The partial
results of one case study (a batch flow line) were presented, and using
this case study a comparison with IDEF3 was made. It is important to note
that SAD is not being presented as a "final" solution but as the results of
work-in-progress research.
Acknowledgments
The authors wish to thank the following for permission to reproduce
copyright material:
Ryan, J., and C. Heavey. 2007. Development of a process modeling tool for
simulation. Journal of Simulation 1(3): 203–213. Reproduced with permission
of Palgrave Macmillan.
References
Al-Ahmari, A. M. A., and K. Ridgway. 1999. An integrated modelling method to
support manufacturing systems analysis and design. Computers in industry 38:
225–238.
Balci, O. 1986. Credibility assessment of simulation results. In J. Henriksen, S. Roberts,
and J. Wilson (eds), Proceedings of the 1986 Winter Simulation Conference, New York:
Association for Computing Machinery (ACM), 38–43.
Brooks, R. 2006. Some thoughts on conceptual modelling, performance, complexity
and simplification. In J. Garnett, S. Brailsford, S. Robinson and S. Taylor (eds),
Proceedings of the 2006 OR Society Simulation Workshop (SW06), Leamington Spa,
UK. Birmingham: The Operational Research Society, 221–226.
Brooks, R., and W. Wang. 2006. Improving the understanding of conceptual modelling.
In J. Garnett, S. Brailsford, S. Robinson and S. Taylor (eds), Proceedings of the 2006
OR Society Simulation Workshop (SW06), Leamington Spa, UK. Birmingham:
The Operational Research Society, 227–234.
Conwell, C. L., R. Enright, and M. A. Stutzman. 2000. Capability maturity models
support of modelling and simulation verification, validation, and accreditation.
In R. R. Barton, P. A. Fishwick, J. A. Joines and K. Kang (eds), Proceedings of
the 2000 Winter Simulation Conference, Orlando, FL: IEEE Computer Society,
819–828.
Doumeingts, G. 1985. How to decentralize decisions through GRAI model in produc-
tion management, Computers in industry, 6(6): 501–514.
Hollocks, B. W. 2001. Discrete event simulation: An inquiry into user practice.
Simulation practice and theory 8(6–7): 451–471.
INCOME Process Designer. 2003. http://www.get-process.com/ (last accessed
6/12/2003).
Jeong, K.-Y. 2000. Conceptual frame for development of optimized simulation-based
scheduling systems. Expert systems with applications 18(4): 299–306.
Kettinger, W. J., J. T. C. Teng, and S. Guha. 1997. Business process change: A study
of methodologies, techniques, and tools. MIS quarterly 21(1): 55–80.
Mayer, R. J., C. P. Menzel, P. S. deWitte, T. Blinn, and B. Perakath. 1995. Information
Integration for Concurrent Engineering (IICE) IDEF3 Process Description Capture
Method Report. Technical report, Knowledge Based Systems Incorporated
(KBSI).
Mertins, K., R. Jochem, and F. W. Jakel. 1997. A tool for object-oriented modelling and
analysis of business processes. Computers in industry 33: 345–356.
Muller, P. A. 1997. Instant UML. Birmingham, UK: Wrox Press.
Nethe, A., and H. D. Stahlmann. 1999. Survey of a general theory of process modelling.
In B. Scholz-Reiter, H. D. Stahlmann, and A. Nethe (eds), Process Modelling, Berlin:
Springer-Verlag, 2–17.
NIST. 1993. Integration Definition for Function Modelling (IDEF0). Technical Report FIPS
183, National Institute of Standards and Technology.
Ould, M. A. 1995. Business Processes: Modeling and Analysis for Reengineering and
Improvement. Chichester, UK: Wiley.
Perera, T., and K. Liyanage. 2000. Methodology for rapid identification and collection
of input data in the simulation of manufacturing systems. Simulation practice
and theory 7: 645–656.
Ratzer, A. V., L. Wells, H. M. Lassen, M. Laursen, J. F. Qvortrup, M. S. Stissing, M.
Westergaard, S. Christensen, and K. Jensen. 2003. CPN Tools for editing, simulating,
and analysing coloured Petri Nets. In W. van der Aalst and E. Best (eds),
Applications and Theory of Petri Nets 2003: 24th International Conference, ICATPN
2003, Eindhoven, The Netherlands. Heidelberg: Springer-Verlag, 450–462.
Robinson, S. 2004. Simulation: The Practice of Model Development and Use. Chichester,
UK: John Wiley and Sons.
Ryan, J., and C. Heavey. 2006. Process modelling for simulation. Computers in industry
57: 437–450.
Sargent, R. G. 1999. Validation and verification of simulation models. In G. W. Evans,
P. A. Farrington, H. B. Nembhard, and D. T. Sturrock (eds), Proceedings of the 1999
Winter Simulation Conference, Phoenix, AZ: IEEE Computer Society, 39–48.
Scheer, A. W. 1998. ARIS. In P. Bernus, K. Mertins, and G. Schmidt (eds), Handbook on
Architectures of Information Systems, Berlin: Springer-Verlag, 541–565.
Tardieu, H., A. Rochfeld, and R. Colletti. 1983. La methode MERISE, Principes et outils. /
The MERISE method, Principles and tools. Paris: Les editions d’organisation.
Tocher, K. D. 1963. The Art of Simulation. London: English Universities Press.
Van Rensburg, A., and N. Zwemstra. 1995. Implementing IDEF techniques as
simulation modelling specifications. Computers & industrial engineering 29(1–4):
467–471.
Zeigler, B. P. 1984. Multifaceted Modelling and Discrete Event Simulation. London:
Academic Press.
13
Methods for Conceptual
Model Representation
Stephan Onggo
Contents
13.1 Introduction............................................................................................... 337
13.2 Textual Representation............................................................................340
13.3 Pictorial Representation........................................................................... 341
13.3.1 Activity Cycle Diagram (ACD)................................................. 341
13.3.2 Process Flow Diagram...............................................................343
13.3.3 Event Relationship Graphs (ERG).............................................344
13.4 Multifaceted Representation...................................................................345
13.4.1 UML and SysML.........................................................................346
13.4.2 Unified Conceptual Model........................................................ 347
13.4.2.1. Objectives Component.............................................. 347
13.4.2.2. Inputs and Outputs Component.............................. 349
13.4.2.3. Contents Component................................................. 350
13.4.2.4. Data Requirement Component................................ 351
13.4.2.5. Model-Dependent Component................................ 352
13.5 Summary.................................................................................................... 352
Acknowledgments............................................................................................... 353
References.............................................................................................................. 353
13.1 Introduction
Simulation conceptual model (or conceptual model, for brevity) represen-
tation is important in a simulation project because it is used as a tool for
communication about conceptual models between stakeholders (simulation
analysts, clients, and domain experts). There is a point in the simulation
project when the conceptual modeling process happens inside the individual
stakeholder’s mind. This “thinking” process includes reflection on how to
structure the problem and how the simulation model should be designed to
help decision makers solve the problem at hand, subject to certain constraints.
At some point in the simulation project, the conceptual model needs to be
communicated to other stakeholders. Hence, the role of conceptual model
* Simulation analysts often deal with clients and domain experts who have little knowledge about simulation.
* This may not be true in a simulation project where the requirement dictates the use of a specific implementation-dependent model representation (for reasons such as familiarity with the simulation software). See Chapter 1 for the discussion on the importance of software independency.
[Figure: activity cycle diagram omitted; recoverable labels include Somewhere, Arrival, In Triage, Wait for resuscitation room, In resuscitation, Wait for tests, Tests, Wait for cubicle, In cubicle, with resource markers D, N, L.]
Figure 13.1
A&E: Activity cycle diagram. (From Onggo, B.S.S., Journal of Simulation, 3 (1), 46, 2009. With permission.)
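The alternating structure that defines an ACD, in which entities cycle between dead states (queues) and active states (activities), can be sketched as data plus a well-formedness check. The cycle below is a simplified, illustrative rendering of the patient cycle, not Figure 13.1 itself:

```python
# Sketch: an activity cycle as alternating dead states (queues) and active
# states (activities), following the ACD convention. The state names are
# illustrative, not taken verbatim from the diagram.

PATIENT_CYCLE = [
    ("queue",    "somewhere"),        # dead state: outside the system
    ("activity", "arrival"),
    ("queue",    "wait_for_triage"),
    ("activity", "in_triage"),
    ("queue",    "wait_for_cubicle"),
    ("activity", "in_cubicle"),
]

def check_alternation(cycle):
    """An ACD is well-formed only if dead and active states alternate."""
    kinds = [kind for kind, _ in cycle]
    return all(a != b for a, b in zip(kinds, kinds[1:]))

print(check_alternation(PATIENT_CYCLE))  # True
```

A checker like this is one reason ACDs translate so directly into simulation programs: the alternation rule can be enforced mechanically before any code is generated.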
[Figure: business process diagram omitted; ambulance-carrying-patient and walk-in-patient arrivals feed A&E, referrals from GP feed Outpatients, and Admitted? decisions route patients to In-patients (Yes) or Discharged (No).]
Figure 13.2
Hospital: Business process diagram. (From Onggo, B.S.S., Journal of Simulation, 3 (1), 45, 2009. With permission.)
of each element and other elements that are not mentioned here, such as pool
and lane.
Figure 13.2 shows the BPD of a typical hospital operation, which includes
three activities: Accident and Emergency (A&E), Outpatients, and In-patients.
Patients arrive in the system through A&E and Outpatients. The arrivals of
patients are events that start the processes. Depending on the condition, a
patient can be admitted to hospital (In-patient) or discharged. Discharge is
an event that terminates a process. If we want to add to the level of detail in
the model, we can move to a lower layer in the system hierarchy and treat
any of the activities as a process which can be decomposed further into a
number of activities.
[Figure: event relationship graph omitted; events include GP referral (scheduling the next arrival at ta), Start 1st app. (t1), Finish 1st app., Start follow-up (tf), Finish follow-up, and Leave outpatients; state changes include {Q1++}, {Q1--; if F then Qf++}, and {Qf--}, with scheduling edges guarded by conditions written as (Q1 C) and (F and Qf C) in the original.]
Legend: app: appointment; F: needs follow-up; Q1: queue for first app.; Qf: queue for follow-up; ta: time of next arrival; t1: time for first app.; tf: time for follow-up.
Figure 13.3
Outpatients: Event relationship graphs.
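The ERG semantics, in which events change state and conditionally schedule further events after a delay, can be illustrated with a minimal event-graph engine. The model below is a deliberately simplified single-clinician version of the outpatients graph, without the follow-up loop, and the delays and arrival cutoff are illustrative assumptions:

```python
import heapq

# Minimal event-graph engine: each event changes state and conditionally
# schedules further events after a delay, in the style of the ERG above.
# One clinician, no follow-up loop; all numbers are illustrative.

state = {"Q1": 0, "busy": 0}   # Q1: queue for first appointment
future = []                    # future event list: (time, seq, name)
seq = 0

def schedule(t, event):
    global seq
    heapq.heappush(future, (t, seq, event))
    seq += 1

def arrive(now):               # GP referral arrives
    state["Q1"] += 1           # {Q1++}
    if now < 20:               # keep the arrival stream finite
        schedule(now + 5, "arrive")   # ta: time of next arrival
    if state["busy"] == 0:     # condition: clinician is free
        schedule(now, "start")

def start(now):                # start first appointment
    state["Q1"] -= 1           # {Q1--}
    state["busy"] = 1
    schedule(now + 3, "finish")       # t1: time for first appointment

def finish(now):               # finish appointment, patient leaves
    state["busy"] = 0
    if state["Q1"] > 0:
        schedule(now, "start")

EVENTS = {"arrive": arrive, "start": start, "finish": finish}
schedule(0, "arrive")
while future:
    now, _, name = heapq.heappop(future)
    EVENTS[name](now)
print(state)   # queue empty and clinician idle once all events are done
```

The engine makes the ERG reading explicit: the graph's nodes become event functions, its edges become `schedule` calls, and the edge conditions become the `if` guards.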
[Figure: sequence diagram omitted; the messages are 1: Arrive, 2: Treat, 3: Discharged, 4: Discharged (otherwise).]
Figure 13.4
A&E: Sequence diagram.
Table 13.1
Diagrams Used in the Unified Conceptual Model

Domain    Component         Representation
Problem   Objectives        Objective Diagram, Purposeful Activity Model
          Inputs, Outputs   Influence Diagram
          Contents          Business Process Diagram with textual representation
          Data requirement  Textual representation, Data dictionary
Model     Discrete-Event    Activity Cycle Diagram, Event Relationship Graph
          System Dynamics   Stock and Flow Diagram, Causal Loop Diagram
          Agent-based       Flowchart, Business Process Diagram, UML Activity Diagram

Source: Adapted from Onggo, B.S.S., Journal of Simulation, 3 (1), 42, 2009. With permission.
[Figure: objective diagram omitted; "Hospital performance" decomposes into Target on patient total time, Staff utilization, Waiting time for emergency admission, and Waiting time for elective admission.]
Figure 13.5
A&E: Objective diagram. (From Onggo, B.S.S., Journal of Simulation, 3 (1), 43, 2009. With permission.)
[Figure: influence diagram omitted; only the label "Performance" is recoverable.]
Figure 13.6
A&E: Influence diagram. (From Onggo, B.S.S., Journal of Simulation, 3 (1), 44, 2009. With permission.)
purposeful activity model (PAM). This work implies that PAM can be used to complement the objective diagram by showing the conditions under which the objective diagram is drawn.
The decision variables are the numbers of doctors, nurses and clerks. The
uncontrollable variables (shown as ovals) are the arrival rate and severity of
condition of the patients.
[Figure: business process diagram omitted; ambulance-carrying-patient and walk-in-patient arrivals pass through Registration, Triage, and the decisions Severe?, Need test?, and Admitted?, via the Resuscitation room, Cubicles, Tests, and Reassessment, ending in Admitted to Hospital or Discharged.]
Figure 13.7
A&E: Business process diagram. (From Onggo, B.S.S., Journal of Simulation, 3 (1), 45, 2009. With permission.)
are two types of patient arrival: voluntary and by ambulance. A patient who
arrives voluntarily at the A&E department will need to register before being
evaluated by a nurse (triage) to determine the severity of the patient’s condi-
tion. One who arrives by ambulance may, however, bypass registration (the
triage is done on the way to the A&E department). Next, the patient will be
seen and treated by a doctor and/or nurse (either in the resuscitation room
or a cubicle). After treatment, patients will either be discharged or admitted
to the hospital. Some patients may need tests and X-rays, and these patients
then need a second session with a doctor and/or nurse before discharge or
admission.
BPD provides three artifacts that can be used to attach additional information to an activity that is not directly related to the structure of the process flow. One of them is text annotation, which is suitable for representing the assumptions and simplifications used in the conceptual model.
For example, in Figure 13.7, we can attach a text annotation to the activity
“triage” that provides a list of assumptions, such as “the severity of condi-
tion of patients is modeled as a simple random sampling.” Similarly, we can
attach a text annotation to the activity “test” that provides a list of simplifica-
tions such as “the service time for tests does not differentiate between the
types of test (X-ray, blood test, etc.).”
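Such annotations can also be carried in machine-readable form alongside the process model. The sketch below mirrors the triage/tests examples above; the dictionary layout is an illustrative assumption, not BPMN syntax:

```python
# Sketch: BPD text annotations kept as machine-readable notes attached to
# activities. The layout is invented for illustration.

annotations = {
    "triage": {"assumptions": [
        "severity of condition is modeled as simple random sampling"]},
    "tests": {"simplifications": [
        "service time does not differentiate between types of test"]},
}

def notes_for(activity):
    """Collect every assumption/simplification note for one activity."""
    entry = annotations.get(activity, {})
    return [note for notes in entry.values() for note in notes]

print(notes_for("triage"))  # the triage assumption list
```

Keeping the notes structured rather than as free text makes it possible to list all assumptions and simplifications of a model in one pass, which supports the documentation role discussed in the summary.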
Table 13.2
Data Requirement for Entity Patient

Field            Type                    Note
Patient details  Name, address, patient  This can be useful to identify patients and,
                 identifiers, etc.       if the analysis requires it, to profile patients.
Admission time   Date/Time               This is needed to determine the distribution
                                         of admissions.
Severity level   Minor or major          This is needed to find the proportion of patients
                                         needing minor treatment and major treatment.
Time in A&E      Minutes                 This is needed to validate the output of the model.
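As a sketch of how the Table 13.2 data requirement could be carried as a data dictionary entry, the fields can be rendered as a typed record; the field names and helper below are illustrative, not a schema prescribed by the chapter:

```python
from dataclasses import dataclass
from datetime import datetime

# Sketch of the Table 13.2 data requirement as a data dictionary entry
# for the entity "patient". Names are illustrative assumptions.

@dataclass
class PatientRecord:
    patient_id: str           # patient details: identify/profile patients
    admission_time: datetime  # used to fit the distribution of admissions
    severity: str             # "minor" or "major"
    minutes_in_ae: float      # time in A&E, used to validate model output

def severity_proportions(records):
    """Proportion of patients needing minor vs. major treatment."""
    n = len(records)
    minor = sum(1 for r in records if r.severity == "minor")
    return {"minor": minor / n, "major": (n - minor) / n}

sample = [
    PatientRecord("p1", datetime(2009, 1, 1, 9, 0), "minor", 45.0),
    PatientRecord("p2", datetime(2009, 1, 1, 9, 20), "major", 180.0),
    PatientRecord("p3", datetime(2009, 1, 1, 10, 5), "minor", 60.0),
]
print(severity_proportions(sample))  # minor ~ 0.67, major ~ 0.33
```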
13.5 Summary
We have discussed three categories of methods for conceptual model repre-
sentation: textual representation, diagrams, and multifaceted representation.
Textual representation can be used to give a brief description of a model. This
is particularly useful when we have a repository of simulation models. The
description allows others to decide quickly whether a model is suitable, or
to search for the most suitable model to be used. The diagrams are effective
during conceptual model development. A multifaceted representation is the
best representation for the complete documentation of a conceptual model.
Multifaceted representation has another advantage. It allows us to verify the
Acknowledgments
Some sections of this chapter are based on: Onggo, B. S. S. 2009. Toward
a Unified Conceptual Model Representation: A Case Study in Health Care.
Journal of Simulation 3 (1): 40–49. © 2009 Operational Research Society Ltd.
With permission of Palgrave Macmillan.
References
Araujo, W.L.F., and C.M. Hirata. 2004. Translating activity cycle diagrams to Java
simulation programs. In Proceedings of the 37th Annual Simulation Symposium,
157–164. Piscataway, NJ: IEEE Computer Society Press.
Forrester, J. 1961. Industrial Dynamics. Cambridge, MA: MIT Press.
Gunal, M.M., and M. Pidd. (2009). Understanding target-driven action in A&E perfor-
mance using simulation. Emergency Medicine Journal 26: 724–727.
Hills, P.R. 1971. HOCUS. Egham, Surrey, UK: P-E Group.
Howard, R.A., and J.E. Matheson. 1984. Influence diagram. In The Principles and
Applications of Decision Analysis, vol. II, ed. R.A. Howard and J.E. Matheson,
719–762. Palo Alto, CA: Strategic Decisions Group.
Huang, E., R. Ramamurthy, and L.F. McGinnis. 2007. System and simulation mod-
eling using SysML. In Proceedings of the 2007 Winter Simulation Conference, ed.
S.G. Henderson, B. Biller, M.-H. Hsieh, et al., 796–803. Piscataway, NJ: IEEE
Computer Society Press.
Keeney, R.L. 1992. Value-Focused Thinking. Cambridge, MA: Harvard University
Press.
Kotiadis, K. 2007. Using soft systems methodology to determine the simulation study
objectives. Journal of simulation 1 (3): 215–222.
Larkin, J.H., and H.A. Simon. 1987. Why a diagram is sometimes worth ten-thousand
words. Cognitive Science 11: 65–99.
Nance, R.E. 1994. The conical methodology and the evolution of simulation model
development. Annals of operations research 53: 1–45.
Onggo, B.S.S. 2009. Towards a unified conceptual model representation: A case study
in health care. Journal of simulation 3 (1): 40–49.
Pidd, M., and A. Carvalho. 2006. Simulation software: Not the same yesterday, today
or forever. Journal of simulation 1 (1):7–20.
Pooley, R.J. 1991. Towards a standard for hierarchical process oriented discrete
event Simulation diagrams. Transactions of the society for computer simulation 8
(1):1–20.
Richter, H., and L. Marz. 2000. Towards a standard process: The use of UML for design-
ing simulation models. In Proceedings of the 2000 Winter Simulation Conference, ed.
J.A. Joines, R.R. Barton, K. Kang, et al., 394–398. Piscataway, NJ: IEEE Computer
Society Press.
Robinson, S. 2002. Modes of simulation practice: Approaches to business and military
simulation. Simulation modelling practice and theory 10 (8): 513–523.
Robinson, S. 2004. Simulation: The Practice of Model Development and Use. Chichester,
UK: Wiley.
Robinson, S. 2008. Conceptual modelling for simulation part I: Definition and require-
ments. Journal of the operational research society 59 (3): 278–290.
Robinson, S., and M. Pidd. 1998. Provider and customer expectations of successful
simulation projects. Journal of the operational research society 49 (3): 200–209.
Schruben, L. 1983. Simulation modeling with event graphs. Communications of the
ACM 26 (11): 957–963.
Sterman, J.D. 2004. Business Dynamics: Systems Thinking and Modeling for a Complex
World. New York: McGraw-Hill.
Vasilakis, C., D. Lecznarowicz, and C. Lee. 2009. Developing model requirements for
patient flow simulation studies using the Unified Modelling Language (UML).
Journal of simulation 3 (3): 141–149.
Wang, W., and R. Brooks. 2007. Empirical investigations of conceptual modeling and
the modeling process. In Proceedings of the 2007 Winter Simulation Conference,
ed. S.G. Henderson, B. Biller, M.-H. Hsieh, et al., 762–770. Piscataway, NJ: IEEE
Computer Society Press.
14
Conceptual Modeling for Composition of Model-Based Complex Systems
Contents
14.1 Introduction............................................................................................... 355
14.2 Interoperability and Interoperation Challenges of Model-Based
Complex Systems...................................................................................... 358
14.2.1 Interoperability and Composability........................................ 358
14.2.2 Relevant Models Regarding Conceptual Modeling
for Compositions......................................................................... 360
14.2.2.1. The Semiotic Triangle................................................ 360
14.2.2.2. Machine-Based Understanding............................... 361
14.2.2.3. Levels of Conceptual Interoperability Model........ 362
14.3 Engineering Methods...............................................................................364
14.3.1 Data Engineering and Model-Based Data Engineering........364
14.3.1.1. Data Administration ................................................ 367
14.3.1.2. Data Management...................................................... 367
14.3.1.3. Data Alignment.......................................................... 369
14.3.1.4. Data Transformation.................................................. 370
14.3.2 Process Engineering .................................................................. 371
14.3.3 Constraint Engineering ............................................................ 372
14.4 Technical and Management Aspects of Ontological Means.............. 376
14.5 Conclusion................................................................................................. 378
References.............................................................................................................. 379
14.1 Introduction
Conceptual modeling is often understood as an effort that happens before systems are built or software code is written, and conceptual models are assumed to be no longer needed once the implementation has been accomplished. Conceptual models are primarily described as mental models that are used at an early stage of abstraction or simplification in the modeling phase. This early stage of abstraction makes conceptual models difficult to verbalize and formalize, making them "more art than science," as Robinson (Section 1.1) mentions, given the challenge of defining applicable methods and procedures. This view is not sufficient for model-based applications. The goal of conceptual modeling in Modeling and Simulation (M&S) is not to describe an abstract view of the implementation, but to capture a model of the referent (the thing that is modeled) that represents a sufficient simplification for the purpose of a given study and serves as a common conceptualization of the referent and its context within the study.
In this sense, conceptual modeling in M&S is slightly different from its
traditional conception in which the focus is on capturing the requirements of
a system in order to replicate its behavior. The M&S view has the additional
requirement that the execution of the model will provide some additional
insight into some problem while the traditional view mainly focuses on sat-
isfying the identified requirements. In either view, the main challenge is to
identify what should be captured in the conceptual model in order to enable
users of the system to understand how the referent is captured.
In traditional conceptual modeling, this is less of a challenge because it is
somewhat easier to look at the behavior of a system and identify its coun-
terpart in the real world. The desired function can be captured in use cases
and serve for validation and verification. For example, it is obvious that an
Automatic Teller Machine (ATM) is representative of a real teller as it can per-
form many similar interactions including necessary inputs and outputs. In
M&S systems, interactions in terms of inputs and outputs are not sufficient
to identify a referent because many referents have similar inputs and outputs
when abstracted, which makes it impossible to identify which one the concep-
tualization is referring to. A conceptual model of a teller designed to study
the average processing time of a customer is different from an ATM. In this
case, the customers and the teller may be abstracted into probability density
functions and a queuing system, as this may be sufficient for the study. The
validity of answers is therefore highly dependent on the context of the model.
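As a sketch of the kind of abstraction just described, the teller and its customers can be reduced to sampled interarrival and service times plus a single queue, which is sufficient to estimate the average time a customer spends in the system. The rates below are illustrative assumptions, not data from the chapter:

```python
import random

# Sketch: customers and the teller reduced to exponential interarrival and
# service distributions plus a single queue, enough to estimate average
# time in system. Rates and sample size are illustrative assumptions.

def average_time_in_system(n_customers=10_000, arrival_rate=0.8,
                           service_rate=1.0, seed=42):
    rng = random.Random(seed)
    clock = 0.0           # arrival time of the current customer
    server_free_at = 0.0  # when the teller next becomes available
    total = 0.0
    for _ in range(n_customers):
        clock += rng.expovariate(arrival_rate)
        start = max(clock, server_free_at)      # wait if the teller is busy
        server_free_at = start + rng.expovariate(service_rate)
        total += server_free_at - clock         # waiting plus service time
    return total / n_customers

# For these rates an M/M/1 queue predicts 1/(mu - lambda) = 5.0 on average.
print(round(average_time_in_system(), 2))
```

Note how little of the "teller" survives in the code: exactly as the paragraph argues, the inputs and outputs alone cannot tell a bank teller apart from a fast-food counter, which is why the conceptual model must also record the assumptions behind the abstraction.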
In other words, the conceptual model in M&S needs to capture data in the form of inputs and outputs, the processes that consume the data, and a way to distinguish conceptualizations of referents from one another, by capturing the assumptions and constraints inherent to the model. Conceptual models
must capture this information. The reason for this requirement is that this
information is needed to be able to decide if a given model can be used to solve
a problem, e.g., it is possible to reuse the model of the teller introduced above
to calculate the average processing time in systems similar to banks (like fast
food restaurants, or supermarkets). It could also be tailored to calculate the
average waiting time or average time in system for a customer. In addition,
given a modeling question, several models can be put together or composed to
provide an answer. However, information is needed that was captured by the
conceptual model in order to be able to decide if a model is applicable or not.
This observation becomes practically relevant when considering the use of models as services that answer specific questions, similar to any real-world service (travel agency, bank), and orchestrating their execution to answer the modeling question. To be able to do this, models that can communicate with one another need to be composed and orchestrated, and they must provide the information needed to decide whether a model is applicable in the current application.
In general, orchestration, reuse, and composition are highly sought after
capabilities in M&S, but they are currently perceived to be costly, error-prone,
and difficult for many reasons, technical and nontechnical. The challenges
increase in service-oriented environments, where applicable services need to
be identified, where the best available solution in the context of the problem to
be solved needs to be selected, all required services need to be composed, and
their execution needs to be orchestrated. In traditional solutions, these tasks
are conducted by system engineers. The ultimate goal in service-oriented
environments is to have intelligent agents performing these tasks. In order to
attain this goal, agents need to have access to the same information and knowl-
edge used by system engineers. One of the first steps therefore should be to
provide conceptual models that are machine understandable or computable.
Yilmaz (2004) motivates this view in his research on defense simulation.
A formalization of conceptual modeling has direct implications within a
system of systems perspective as well. Within a pluralist context (Jackson
and Keys 1984), different viewpoints imply that questions about the referent posed by different people carry their own perceptions of the system, with no unifying consensus. These individual perceptions ultimately influ-
ence whether two systems will be composed or not. The resulting compo-
sition based on individual perceptions may be comprised of conceptually
misaligned models and produce inappropriate results. Informal mental
models allowing individual perceptions must therefore be replaced by for-
mal representations of the conceptualization.
In order to support the composition of model-based solutions in service-
oriented environments, assumptions, constraints, and simplifications need
to be explicitly presented. This needs to be done using metadata formally
representing the conceptualization. This metadata can be read by intelligent
agents and used to identify, select, compose, and orchestrate model-based
solutions as described before.
This chapter describes three engineering methods designed to capture
data, processes, and the assumptions and constraints of a conceptual model for
model-based solutions, and shows how a computable conceptual model can
be used particularly in support of composition, reuse, and orchestration. The
focus of this chapter is on composability. The chapter is organized as follows:
Petty and Weisel (2003) discuss the differences and commonalities between
interoperability and composability and show that the definitions are primar-
ily driven by the challenges of technical integration and the interoperation
of implemented solutions versus the ability to combine models in a way that
is meaningful. Model-based complex systems further add a new category of
challenges to the already difficult problem of composability and interoper-
ability. A working definition for a model-based complex system can be derived
from the definition of the combined terms: a system is made up of several
components that interact with each other via interfaces; a complex system has
many components that interact via many interfaces that represent typically
nonlinear relations between the components; model-based systems use an
explicit formal specification of a conceptualization of an observed or assumed
reality. While complexity already plays a major role in the traditional view of
interoperability, the model-based aspect is not often considered. The work-
ing definition of interoperation used in this chapter is simply: two systems can
interoperate if they are able to work together to support a common objective.
To explicitly deal with challenges resulting from differences in con-
ceptualization, the term composability is used. As with interoperability,
the definitions used for the term composability are manifold. Petty et al.
(2003) compiled various definitions and used them to recommend a com-
mon definition embedded in a formal approach. Fishwick (2007) proposed, in his recent analysis, restricting the scope of composability to the model level and, following the recommendations of Page et al. (2004), distinguished between three interrelated but individual concepts contributing to interoperation:
The semantic rules of the component simulation tools and the seman-
tic intentions of the component designers are not advertised or in any
way accessible to other components in the federation. This makes it
difficult, even impossible, for a given simulation tool to determine the
semantic content of the other tools and databases in the federation,
termed the problem of semantic inaccessibility. This problem manifests
itself superficially in the forms of unresolved ambiguity and unidenti-
fied redundancy. But, these are just symptoms; the real problem is how
to determine the presence of ambiguity, redundancy, and their type in
the first place. That is, more generally, how is it possible to access the
semantics of simulation data across different contexts? How is it possible
to fix their semantics objectively in a way that permits the accurate inter-
pretation by agents outside the immediate context of this data? Without
this ability—semantic information flow and interoperability—an inte-
grated simulation is impossible.
In the remainder of this section, several relevant models that can be applied
to support the fulfillment of related requirements enabling semantic trans-
parency will be discussed.
[Figure: semiotic triangle omitted; Concept (realized as the Model) sits at the apex, with Symbol (the Simulation System) and Referent at the base; the Conceptualization edge links Referent to Concept, and the Implementation edge links Concept to Symbol.]
Figure 14.1
The semiotic triangle for M&S.
to explain why communication often fails. Referents are objects in the real
(or an assumed or virtual) world. When communicating about the referents,
perceptions or interpretations of these referents are used and captured in
concepts that reflect the user’s viewpoint of the world as object, etc., and then
symbols are used to talk about the user’s concepts.
Figure 14.1 shows the relation of this semiotic triangle to the M&S domain.
This model is similar to the one presented by Sargent (2001), where real-world
domain, conceptual model, and implemented model are distinguished, and
to the framework for M&S as recommended in Zeigler et al. (2000), where
the experimental frame with the source model, the model, and the simulator
are distinguished.
It should be pointed out that the implementation does not reveal why
the conceptualization was chosen, only which one was chosen. A common
conceptualization can result in different implementations. In order to ensure
composability, conceptualization decisions need to be captured in addition
to the implementation decision.
Furthermore, model-based solutions can only use their models and sym-
bols, and no longer use the referent. The formal specification of the concep-
tualization is their view of the world. In order to decide if two model-based
solutions are composable a decision needs to be made whether a lossless
mediation between the conceptualization is possible (in the context of the
task to be supported by the composition).
Realizing the need to explicitly address the conceptual layer, Muguira and
Tolk (2003) published the first version of the LCIM, which was very datacentric. The discussions initiated by the LCIM work, in particular the work of
Page et al. (2004) and Hofmann (2004), resulted in the currently used version,
which was first published by Turnitsa (2005). Figure 14.2 shows the evolution
of layered models of interoperation resulting in the LCIM.
The LCIM exposes six levels of interoperation, namely the following:
[Figure: layered interoperation models omitted. The LCIM column runs, from less to more interoperation: None, Technical, Syntactic, Semantic, Pragmatic, Dynamic, Conceptual; earlier layered models contribute levels such as hardware, communication, protocol, model, and system-specific, documented, aligned static, aligned dynamic, and harmonized data.]
Figure 14.2
Evolution of levels of interoperation. (From Tolk, A., What comes after the semantic web: PADS implications for the dynamic web, Proceedings of the 20th Workshop on Principles of Advanced Distributed Simulation, 57, IEEE Computer Society, Washington, DC, 2006.)
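Because the LCIM levels form an ordered scale, they can be compared mechanically. A minimal sketch, with the level ordering taken from Figure 14.2, short glosses paraphrased from the LCIM literature, and an illustrative helper function:

```python
from enum import IntEnum

# Sketch: the LCIM as an ordered scale so interoperation requirements can
# be compared mechanically. The glosses and helper are illustrative.

class LCIM(IntEnum):
    NONE = 0
    TECHNICAL = 1    # connectivity is established
    SYNTACTIC = 2    # a common data format/protocol is used
    SEMANTIC = 3     # the meaning of exchanged data is shared
    PRAGMATIC = 4    # the context of use of the data is shared
    DYNAMIC = 5      # effects of state changes over time are shared
    CONCEPTUAL = 6   # assumptions and constraints are aligned

def meets(achieved: LCIM, required: LCIM) -> bool:
    """A composition is safe at a level only if that level is reached."""
    return achieved >= required

print(meets(LCIM.SEMANTIC, LCIM.CONCEPTUAL))  # False: shared meaning alone
                                              # does not align concepts
```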
The LCIM has been applied in different communities. Wang et al. (2009)
show the descriptive and prescriptive potential of the LCIM and evaluate
a first set of specifications, in particular those defined by the simulation interoperability standard IEEE 1516, the High Level Architecture, and the Base Object Models (BOM) standard recently developed by the Simulation Interoperability Standards Organization (SISO). It is used in the following
section to support the recommended engineering methods for conceptual
modeling.
This section reviewed composability as it is currently understood and
showed how it is related to yet different from integratability and interoper-
ability. The next section discusses how to capture data, processes and con-
straints using engineering methods.
Out of the four steps, data management is the most important step in data engineering and the most studied in the literature, given that it is the one that
demands the most effort. To manage data, metadata registries have been
defined to support the consistent use of data within organizations or even
across multiple organizations. In addition, they need to be machine-under-
standable to maximize their use. Logically, the recommended structures for
metadata registries are strong candidates for capturing the results of concep-
tual modeling for information exchange.
This work is supported by standards, such as Part III of the ISO/IEC 11179
“Metadata Registry (MDR)” standard that is used to conduct data manage-
ment. Using this standard, four metadata domains are recommended to cap-
ture information on data representation and data implementation, which are
summarized in Figure 14.3:
[Figure: diagram omitted; it relates a conceptual domain (specifying), a property domain (having; the value of a property defining a concept), a property value domain (assignable values), and property instances (represented by the value domain), with 0..* to 1..1 multiplicities.]
Figure 14.3
Domains of information exchange modeling.
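A loose sketch of how such registered domains make property instances checkable: the concepts, properties, and values below are invented for illustration and do not follow the ISO/IEC 11179 schema literally:

```python
# Sketch: registered metadata domains making property instances checkable.
# All registry contents are invented for illustration.

conceptual_domain = {"patient"}                    # concepts being described
property_domain = {"severity"}                     # properties defining them
value_domain = {"severity": {"minor", "major"}}    # assignable values

def valid_instance(concept, prop, value):
    """A property instance is valid only if every part is registered."""
    return (concept in conceptual_domain
            and prop in property_domain
            and value in value_domain.get(prop, set()))

print(valid_instance("patient", "severity", "major"))     # True
print(valid_instance("patient", "severity", "critical"))  # False
```

This is the sense in which the registry is machine-understandable: an agent can reject an exchanged value that falls outside the registered value domain without any human in the loop.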
Traditionally, only property instances are captured. From what has been
specified in this chapter so far it becomes obvious that information needs to
be specified in the context of the results of the underlying conceptualization
to ensure the required semantic transparency. As the referent itself cannot
be captured because it is replaced by its conceptualization, the information
exchange model must at least capture the conceptualization of the model
used and cannot be limited to the symbols used for its implementation. Tolk
[Figure: diagram omitted; recoverable labels: systems A and B, each with concepts (C) and properties (P).]
Figure 14.4
MBDE data administration.
[Figure: diagram omitted; recoverable labels: systems A and B with concepts (C) and properties (P) mediated by a CRM, annotated with PVD and IEE.]
Figure 14.5
MBDE data management.
of the information exchange capability models can be mapped and that can
be mapped to the elements of the information exchange need models. In the
case of MBDE, these propertied concepts build the CRM, which is the logi-
cal model of the information exchange, or conceptualization, that can take
place between the model-based solutions. It is worth stating that this shows
that such a logical CRM always exists, whether it is made explicit or not, as
whenever a property from system A is mapped to a property of system B.
This constitutes a property that makes sense in the information exchange
between the two systems, which is expressed by the CRM. The concepts of
the CRM serve two purposes:
• They build the propertied concepts of the properties of the CRM and
as such help the human to interpret the information exchange cat-
egories better. In particular when a CRM is derived from the infor-
mation exchange requirements, this is very helpful.
• They conserve the context of information exchange for the receiving
systems, which is their information exchange need.
Figure 14.5 shows the result of the data management process for two systems
(or services) A and B.
[Figure: diagram omitted; recoverable labels: systems A and B with concepts (C) and properties (P) mediated by the CRM, with the IEE and PVD of each system aligned against the CRM.]
Figure 14.6
MBDE data alignment.
[Figure: diagram omitted; as in the data alignment diagram, systems A and B are mediated by the CRM, with the transformation expressed as IEE_B = f_CRM(IEE_A).]
Figure 14.7
MBDE data transformation.
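The transformation step IEE_B = f_CRM(IEE_A) can be sketched as a two-stage mapping: system A's information exchange elements are first lifted into the CRM vocabulary and then projected into system B's, converting representations where the conceptualizations differ. All field names and the minutes-to-hours conversion below are illustrative assumptions:

```python
# Sketch of IEE_B = f_CRM(IEE_A): lift A's elements into the CRM, then
# project into B's representation. Names and units are invented.

A_TO_CRM = {"pat_sev": "severity", "t_ae_min": "time_in_dept_minutes"}
CRM_TO_B = {"severity": "acuity", "time_in_dept_minutes": "losHours"}

def f_crm(iee_a):
    crm = {A_TO_CRM[key]: value for key, value in iee_a.items()}  # A -> CRM
    iee_b = {}
    for key, value in crm.items():                                # CRM -> B
        if key == "time_in_dept_minutes":
            value = value / 60.0   # B conceptualizes length of stay in hours
        iee_b[CRM_TO_B[key]] = value
    return iee_b

print(f_crm({"pat_sev": "major", "t_ae_min": 180}))
# {'acuity': 'major', 'losHours': 3.0}
```

Routing every mapping through the CRM rather than writing A-to-B mappings directly keeps the number of transformations linear in the number of systems instead of quadratic.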
[Figure: diagram omitted; a model characterized by assertions f(i) and a system characterized by assertions g(i), assumed to be aligned.]
Figure 14.8
Evaluating compatibility of assertion lists. (Adapted from King, R.D., On the Role of Assertions for Conceptual Modeling as Enablers of Composable Simulation Solutions, PhD thesis, Old Dominion University, Norfolk, VA, 2009.)
In order to apply these ideas, three steps need to be conducted, which are captured in the following list. The viewpoint is slightly different from the
data and process engineering areas, but the results are comparable.
King (2009) shows with these steps that it was possible to capture concep-
tual misalignments of services that did not show up on the implementation
level. In other words: without capturing the results of conceptual modeling
appropriately, these services would not have been identified as not compos-
able, and a composition of them would have been technically valid but conceptually flawed, leading to potentially wrong results. Depending on the application domain in which the simulation composition is used, such conceptual misalignment can lead to significant harm and even the loss of human lives or
economic disasters, in particular when decision makers base their decisions
on flawed analysis.
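King's comparison of assertion lists can be caricatured as follows. This is a toy sketch, not the formalism of the dissertation: each solution is assumed to publish assertions as property/interval pairs, and two solutions are flagged as conceptually misaligned when they constrain a shared property to non-overlapping ranges, even though their interfaces match technically.

```python
# Toy sketch of assertion-list compatibility checking in the spirit
# of King (2009); the assertion format and values are invented.
# An assertion constrains a shared property to a closed interval.
model_assertions  = {"altitude_m": (0, 20000), "timestep_s": (1, 1)}
system_assertions = {"altitude_m": (0, 12000), "timestep_s": (60, 60)}

def misalignments(a, b):
    """Shared properties whose asserted intervals do not overlap."""
    bad = []
    for prop in a.keys() & b.keys():
        (lo1, hi1), (lo2, hi2) = a[prop], b[prop]
        if max(lo1, lo2) > min(hi1, hi2):   # empty intersection
            bad.append(prop)
    return sorted(bad)

# Both interfaces expose 'timestep_s', but the conceptualizations
# disagree (1 s vs. 60 s steps): technically valid, conceptually flawed.
print(misalignments(model_assertions, system_assertions))  # ['timestep_s']
```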
It is worth mentioning that the rigorous application of mathematics ensures the consistency of conceptualizations and their implementations, not that the conceptualizations are correct. In other words, it is possible to evaluate if two different conceptualizations can be aligned to each other and if
Figure 14.9
Ontological spectrum and methods. (The spectrum ranges from weak semantics (taxonomies, topic maps, object models) through RDF and RDF/S, UML, description logic, and first-order logic to logical theory, modal logic, and axiology at the strong-semantics end.)
conceptualization, and as they are formal, machines can read them and reason about them. West (2009) gives an example of practical applications of ontologies in connection with the modeling of data for a significant business. A more theoretical introduction to the use of ontological means has recently been published by Guizzardi and Halpin (2008) in their special issue on conceptual modeling of the journal Applied Ontology. The discussion of ontological means in support of systems engineering is ongoing.
Recker and Niehaves (2008) broadened the scope, limits, and boundaries of ontology-based theories for conceptual modeling beyond the purely technical discussions by focusing on three additional questions.
14.5 Conclusion
To be able to decide if two model-based solutions are composable to support
a common objective, implementation details of those solutions alone are not
sufficient. In order to identify applicable solutions, select the best ones in the context of the task to be supported, compose the identified set of solutions, and orchestrate their execution, conceptual models in the form of formal specifications of conceptualizations of data, processes, and constraints
are mandatory. This formal specification must be captured as metadata.
The methods in the ontological spectrum can be used to capture the meta-
data in machine-readable form. Without such annotations for model-based
solutions that capture the results of the conceptual modeling phase as
machine-understandable metadata, the concepts of system of systems and
service-oriented architectures will remain incomplete.
Conceptual modeling for composition of model-based complex systems
supported by the methods of data engineering, process engineering, and
constraint engineering produces the metadata necessary to enable the loss-
less mediation between viewpoints as captured in the conceptualization of
different models. We documented the necessary step of data engineering in
detail and motivated the feasibility of similarly detailed methods for pro-
cesses and constraints.
The rigorous use of mathematics to produce metadata for the annotation
of model-based solutions is a necessary requirement to enable consistent sys-
tem of systems solutions or service-oriented architectures. The solutions pro-
vided by current standards as discussed in this chapter are not sufficient.
References
Allen, J. 1983. Maintaining knowledge about temporal intervals. CACM 26 (11):
832–843.
Benjamin, P., K. Akella, and A. Verma. 2007. Using ontologies for simulation integra-
tion. In Proceedings of the Winter Simulation Conference, 1081–1089. Washington,
DC: IEEE Computer Society.
Burstein, M. H., and D. V. McDermott. 2005. Ontology translation for interoperability among Semantic Web services. AI Magazine 26(1): 71–82.
Dahmann, J. S. 1999. High Level Architecture Interoperability Challenges. Presentation
at the NATO Modeling & Simulation Conference, Norfolk, VA, October 1999,
NATO RTA Publications.
Davis, P. K., and R. K. Huber. 1992. Variable-resolution combat modeling: Motivations, issues, and principles. RAND Notes, Santa Monica, CA.
Davis, P. K., and J. H. Bigelow. 1998. Experiments in MRM. RAND Report MR-100-DARPA, Santa Monica, CA.
Davis, P. K., and R. Hillestad. 1993. Families of models that cross levels of resolu-
tion: Issues for design, calibration and management. In Proceedings of the Winter
Simulation Conference, 1003–1012. Washington, DC: IEEE Computer Society.
Dori, D. 2002. Object Process Methodology: A Holistic Systems Paradigm. Berlin,
Heidelberg, New York: Springer Verlag.
Fishwick, P. A. 2007. Handbook of Dynamic System Modeling. Chapman & Hall/CRC
Press LLC.
Gruber, T. R. 1993. A translation approach to portable ontology specifications. Knowledge Acquisition 5:199–220.
Guizzardi, G., and T. Halpin. 2008. Ontological foundations for conceptual modeling. Applied Ontology 3(1–2):1–12.
Hofmann, M. 2004. Challenges of model interoperation in military simulations.
Simulation 80:659–667.
Institute of Electrical and Electronics Engineers. 1990. A Compilation of IEEE Standard
Computer Glossaries. New York: IEEE Press.
Institute of Electrical and Electronics Engineers. IEEE 1278, Standard for Distributed Interactive Simulation.
Institute of Electrical and Electronics Engineers. IEEE 1516, Standard for Modeling and Simulation High Level Architecture.
International Organization for Standardization (ISO)/International Electrotechnical
Commission (IEC). 2003. Information technology metadata registries part 3:
registry metamodel and basic attributes. ISO/IEC 11179–3:2003.
Jackson, M. C., and P. Keys. 1984. Towards a system of systems methodology. Journal of the Operational Research Society 35(6): 473–486.
King, R. D. 2007. Towards conceptual linkage of models and simulations. In
Proceedings of the Spring Simulation Interoperability Workshop. Washington, DC:
IEEE Computer Society.
King, R. D. 2009. On the role of assertions for conceptual modeling as enablers of
composable simulation solutions. PhD thesis, Old Dominion University,
Norfolk, VA.
Tolk, A. 2006. What comes after the Semantic Web: PADS implications for the dynamic
Web. In Proceedings of the 20th Workshop on Principles of Advanced and Distributed
Simulation, 55–62. Washington, DC: IEEE Computer Society.
Tolk, A., and R. D. Aaron. 2010. Model-based data engineering for data-rich integration projects: Case studies addressing the challenges of knowledge transfer. Engineering Management Journal 22(2), July (in production).
Tolk, A., and S. Y. Diallo. 2005. Model-based data engineering for Web services. IEEE Internet Computing 9(4): 65–70.
Tolk, A., S. Y. Diallo, and C. D. Turnitsa. 2008. Mathematical models towards self-
organizing formal federation languages based on conceptual models of infor-
mation exchange capabilities. In Proceedings of the Winter Simulation Conference,
966–974. Washington, DC: IEEE Computer Society.
Tolk, A., S. Y. Diallo, and C. D. Turnitsa. 2009. Data engineering and process engi-
neering for management of M&S interoperation. In Proceedings of the Third
International Conference on Modeling, Simulation and Applied Optimization, Sharjah,
U.A.E, January 20–22, 2009.
Turnitsa, C. D. 2005. Extending the levels of conceptual interoperability model. In Proceedings of the Summer Computer Simulation Conference. Washington, DC: IEEE Computer Society.
Wang, W., A. Tolk, and W. Wang. 2009. The levels of conceptual interoperability
model: Applying systems engineering principles to M&S. In Proceedings of the
Spring Simulation Multiconference. Washington, DC: IEEE Computer Society.
West, M. 2009. Ontology meets business. In Complex Systems in Knowledge-Based
Environments: Theory, Models and Applications, SCI 168, ed. A. Tolk and L. C. Jain,
229–260. Heidelberg: Springer.
Yilmaz, L. 2004. On the need for contextualized introspective simulation models to improve reuse and composability of defense simulations. Journal of Defense Modeling and Simulation 1(3):135–145.
Yilmaz, L., and S. Paspuleti. 2005. Toward a meta-level framework for agent-supported interoperation of defense simulations. Journal of Defense Modeling and Simulation 2(3): 61–175.
Zeigler, B. P. 1986. Toward a simulation methodology for variable structure modeling.
In Modeling and Simulation Methodology in the Artificial Intelligence Era, ed. M. S.
Elzas, T. I. Oren, and B. P. Zeigler, 195–210. Amsterdam: Elsevier Scientific Pub.
Zeigler, B. P., H. Praehofer, and T. G. Kim. 2000. Theory of Modeling and Simulation:
Integrating Discrete Event and Continuous Complex Dynamic Systems. Amsterdam:
Academic Press, Elsevier.
15
UML-Based Conceptual Models and V&V
Contents
15.1 Introduction
15.2 Verification and Validation of Conceptual Models for Simulations
15.2.1 V&V
15.2.2 V&V Process
15.2.3 V&V Techniques
15.3 Verification of UML-Based Conceptual Models
15.3.1 Desirable Properties for UML-Based Models
15.3.2 Formal Techniques for UML CM Verification
15.3.2.1 Approaches with Structural Emphasis
15.3.2.2 Approaches with Behavioral Emphasis
15.3.3 Tool Support for UML-Based Conceptual Model Verification
15.3.4 Inspections and Reviews for UML Model Verification
15.4 An Inspection Approach for Conceptual Models in a Domain-Specific Notation
15.4.1 Need for a Systematic Inspection Method
15.4.2 Desirable Properties for UML-Based KAMA Notation
15.4.3 An Inspection Process
15.4.3.1 Intradiagram Inspection
15.4.3.2 Interdiagram Inspection
15.5 Case Studies
15.5.1 Case Study 1
15.5.1.1 The Setting
15.5.1.2 Conduct of Case Study 1
15.5.1.3 Discussion and Findings of Case Study 1
15.5.2 Case Study 2
15.5.2.1 The Setting
15.5.2.2 Conduct of Case Study 2
15.5.2.3 Findings of Case Study 2
15.6 Conclusions and Further Research
References
15.1 Introduction
Although there is no consensus on a precise definition of a conceptual model
(CM), it is generally accepted that a CM is an abstract representation of a
real-world problem situation independent of the solution. This representa-
tion may include entities, their actions and interactions, algorithms, assump-
tions, and constraints.
Recently, there has been a growing tendency to adapt UML for different modeling needs and domains. With its varied representation capabilities, its multipurpose nature, and its extension mechanisms, UML seems promising for conceptual modeling as well. However, in the absence of an agreed definition of a conceptual model, it is difficult to define a best set of UML views for representing one.
Nevertheless, in the military simulation domain, which constitutes one of
the major areas of use of conceptual modeling, three approaches that support
simulation conceptual modeling based on UML have emerged: The first one, the Synthetic Environment Development and Exploitation Process (SEDEP 2007), is HLA (High Level Architecture) oriented. Two UML profiles have already been developed toward tool support (Lemmers and Jokipii 2003).
The second one, BOM (Base Object Model) (BOM 2006) has been developed
by SISO (Simulation Interoperability Standards Organization). BOMs are
defined to “provide an end-state of a simulation conceptual model and can be
used as a foundation for the design of executable software code and integra-
tion of interoperable simulations” (BOM 2006). Hence, BOMs are closer to the
solution domain and the developer rather than the problem domain and the
domain expert. The third one is the KAMA notation (Karagöz and Demirörs
2007), which is more CMMS (Conceptual Model of the Mission Space) ori-
ented and platform independent. CMMS is defined as “simulation-imple-
mentation-independent functional descriptions of the real world processes,
entities, and environment associated with a particular set of missions” by
DMSO (U.S. Defense Modeling and Simulation Office) (2000b, Karagöz and
Demirörs 2008). KAMA has been revised through experimental processes
and empirically shown to be fit for CMMS purposes (Karagöz and Demirörs
2008). A CMMS serves as a bridge between subject matter experts (SME) and
developers. In CMMS development, SMEs act as authoritative knowledge
sources when validating mission space models.
The roles of various parties involved in the scope of CM verification as
discussed in this chapter will be defined as follows:
In this chapter, we will mostly deal with the verification of CMMS models
developed with the UML-based KAMA notation. The general problem with utilizing a UML-based notation is that, in addition to the defects and omissions that may be introduced while translating the problem domain into a conceptual model, the semiformality of UML (Kim and Carrington 2000, Ober 2004), its support of multiple views, and its extension mechanisms increase the risk of inconsistency, incorrectness, and redundancy. Furthermore, the UML specification (UML Superstructure 2005) does not provide a systematic treatment of correctness, consistency, completeness, and redundancy issues in models.
The rest of the chapter is organized as follows: First, we describe verifica-
tion and validation (V&V) in the context of CM. Then, desirable properties to
be used in verification of UML models are presented along with a summary
of research in two main streams: formal and informal approaches. Then, as a
candidate for informal V&V, an inspection process that addresses semantic properties is presented. Next come two case studies illustrating how that inspection approach helps to identify logical defects, as well as important semantic issues on which an SME should be consulted. The last section concludes the chapter
with an evaluation of the described techniques.
Figure 15.1
Conceptual model validation and verification. (Diagram relating the real-world problem domain, the conceptual model, and the executable model: conceptual modeling leads from the problem domain to the conceptual model, implementation from the conceptual model to the executable model; validation links the problem domain to both models, verification links the conceptual model to the executable model, and consistency and property verification apply to the conceptual model itself.)
hand, NATO (2007), for instance, focuses specifically on V&V for federations
that are developed according to the FEDEP (2000) (Federation Development
and Execution Process). It considers verification, validation, and accredita-
tion (VV&A) activities as an “overlay” process to FEDEP, whereas the ITOP
(2004) approach aims at supporting the capture, documentation, reuse, and
exchange of V&V information. Finally, the REVVA 2 (2005) methodology is
intended to provide a generic VV&A framework. In spite of having different
focuses, these resources share some common concepts.
They agree that, in the early stages of M&S development, a vague intended purpose must be formulated, which is refined into a set of subpurposes, which in turn are decomposed into a set of Acceptability Criteria (AC). Hence, passing the AC implies fitness for the intended purpose (for UML-based models, these may be stated as properties). From the AC, V&V objectives can be derived. V&V objectives are usually decomposed into more concrete V&V tasks. The execution of these tasks produces items of evidence to substantiate the AC.
When a modeling notation such as UML or KAMA is used, AC and associ-
ated V&V objective formulation for the conceptual models should also take
into account the set of representational and abstraction capabilities of the
modeling notation. For example, if the purpose of conceptual modeling is just to provide a generic repository for reuse, then the set of criteria will not focus on implementation requirements but rather on understandability, ease of adaptation for reuse, and so on. On the other hand, if the conceptual model is to be used directly in FEDEP, run-time criteria should also be defined.
The following set of general principles for simulation V&V, originally
suggested by Balci (1998), can be followed during CM V&V:
One striking observation is that, in the methodologies and guidelines accepted by the simulation V&V community (DMSO 2000a, FEDEP 2000, ITOP 2004), validation and accreditation of simulations are addressed extensively and in detail. However, internal verification of conceptual models is not explicitly addressed. This may be partly justified by the fact that, during simulation software development projects, SQA is already performed
Informal techniques are easy to use and understand with checklists, man-
uals, and guidelines. They may be effective if applied consistently and are
relatively less costly. Furthermore, informal V&V techniques may be used at
any phase of the simulation development process including conceptual mod-
eling. Static techniques can reveal a variety of information about the struc-
tural inconsistencies, data and control flow defects and syntactical errors. On
the other hand, dynamic techniques find defects in behavior and results of
the model execution. Finally, formal techniques rely on a formal process of
symbol manipulation and inference according to well-defined proof rules of
the utilized formal language. They are very effective, but costly due to their
complexity and sometimes due to the size of the model under consideration.
Many formal techniques are either unusable except in trivial examples or
require an understanding of complex mathematics (Garth et al. 2002).
Validation of conceptual models is usually informal and consists of SME
reviews or audits, self inspection, face validation, etc. In addition, Sargent
(2001) mentions the use of traces, again an informal technique. The use of
traces is the tracking of entities through each submodel and the overall
model to determine if the logic is correct and if the necessary accuracy is
maintained. If errors are found in the conceptual model, it must be revised
and conceptual model validation should be performed again. Furthermore, CM validation can also be performed using Simulation Graph Models (Topçu 2004). These are mathematical representations of the state variables, the events that change the state, and the logical and temporal relationships between events.
On the other hand, UML-based CMs are also prone to errors; however,
well-known resources (DMSO 2000a, NATO 2007, ITOP 2004, REVVA 2005)
do not provide any guidance specific to UML-based CM verification. In the
following section we review a wider category of research related to UML
model verification.
Some semantic properties are applicable only to specific kinds of diagrams. Csertan et al. (2002), for example, verify general properties defined for state diagrams. In their study, properties defined by Leveson (1995) are
used. Among the defined properties are (1) “All variables must be initialized,”
and (2) “All states must be reachable.”
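Properties of this kind become mechanically checkable once the state diagram is available as a graph. A minimal sketch of the reachability check (the diagram encoding and state names are invented):

```python
from collections import deque

# Hypothetical state diagram encoded as an adjacency list.
transitions = {
    "Idle":   ["Armed"],
    "Armed":  ["Firing", "Idle"],
    "Firing": ["Idle"],
    "Stuck":  [],          # no transition leads here: a defect
}

def unreachable_states(transitions, initial):
    """States not reachable from the initial state (breadth-first search)."""
    seen, queue = {initial}, deque([initial])
    while queue:
        for nxt in transitions.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(set(transitions) - seen)

print(unreachable_states(transitions, "Idle"))  # ['Stuck']
```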
Engels et al. (2001) mention horizontal and vertical UML consistency properties. They acknowledge that horizontal consistency properties are desirable and may be a means to reduce contradictions that can arise from overlapping information residing in different views of the same model. An example of a property related to horizontal consistency is: (1) "Each class with object states must be represented with a statechart diagram." They also mention vertical consistency properties, used to reduce inconsistencies that may exist among different abstraction levels. An example of this type of property is: (2) "The set of states of an object defined by a parent class must be a subset of the set of states of an object of the child class." Some research studies (Kurzniarz et al. 2002, Kurzniarz et al. 2003, Van der Straten et al. 2003) have formally defined these kinds of properties.
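Once the state sets have been extracted per class, the vertical consistency property quoted above reduces to a set comparison. A minimal sketch with invented class and state names:

```python
# Hypothetical state sets extracted per class from the model.
states = {
    "Vehicle":        {"Stopped", "Moving"},
    "TrackedVehicle": {"Stopped", "Moving", "Turning"},  # consistent
    "BrokenVehicle":  {"Stopped"},                       # drops 'Moving'
}

def vertical_violations(states, parent, children):
    """Children whose state set does not contain all parent states."""
    return sorted(c for c in children
                  if not states[parent] <= states[c])

print(vertical_violations(states, "Vehicle",
                          ["TrackedVehicle", "BrokenVehicle"]))
# ['BrokenVehicle']
```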
Ambler (2005) lists a collection of conventions and guidelines for creating effective UML diagrams and defines a set of rules for developing high-quality UML diagrams. In total, 308 guidelines are given, with descriptions and the reasoning behind each of them. It is argued that applying these guidelines will result in increased model quality. However, inter-view properties are not considered at all. Some examples of properties are (1) "Model a dependency when the relationship is transitory in a structural diagram," (2) "Role names should be indicated on recursive relationships," and (3) "Each edge leaving a decision node should have a guard." A similar approach is used by the SDMetrics tool (2007), which checks adherence to some UML design rules. The rules extend from the well-formedness rules of UML to object-oriented heuristics collected from the literature. Most of the rules are simple syntactic rules. Some examples of errors detected are (1) a class is not used anywhere, (2) multiple inheritance is used (a class has more than one parent), and (3) a control flow lacks a source node, a target node, or both.
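Rules of this kind are simple queries over a flattened representation of the model. A minimal sketch of the "class is not used anywhere" check, with an invented model encoding:

```python
# Hypothetical flattened model: the set of classes plus the
# relationships (associations, generalizations) referencing them.
classes = {"Missile", "Launcher", "Radar", "Orphan"}
relationships = [("Missile", "Launcher"), ("Radar", "Launcher")]

def unused_classes(classes, relationships):
    """Classes that appear in no relationship ('class not used' rule)."""
    used = {c for rel in relationships for c in rel}
    return sorted(classes - used)

print(unused_classes(classes, relationships))  # ['Orphan']
```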
On the other hand, a perspective-based reading method for UML design inspection, the so-called object-oriented reading techniques, has been presented by Travassos et al. (2002). Examples of the properties provided are (1) there must be an association on the class diagram between the two classes between which the message is sent, and (2) for the classes used in the sequence diagram, the behaviors and attributes specified for them in the class diagram should make sense (sic, Travassos et al. 2002). Another informal approach is suggested by Unhelkar (2005). Quality properties within and among each diagram type are described, along with checklists for UML quality assurance. Although conceptual modeling (CIM, Computation Independent Model) is considered separately and verification and validation checklists in different categories such as aesthetics, syntax, and semantics are provided, most of the checklist items are related to validation and completeness. Examples of the properties are
(1) the notation for fork and join nodes should be correctly used to rep-
resent multithreads, and (2) the aggregations should represent a genuine
“has a” relationship.
This section has briefly described different types of properties, giving examples from various studies in the literature. It is clear that each of these studies considers only certain types of properties, and there is a lack of agreement on a set of desirable properties for quality UML CMs.
Figure 15.2
Semantically incorrect UML class diagram example. (A Missile class and a Launcher class connected by an "Origin" association with the roles +Launched and +From and the multiplicities 0..* and 1.)
launcher" classes are disjoint, i.e., they cannot have common instances, as imposed by the generalization relation. But formally, because of assertions 2 and 4, the knowledge base in Figure 15.3 becomes inconsistent.
It is in general possible to translate FOL statements into the input language of an inference engine such as Prolog to check the consistency conditions incrementally. However, the general decision problem of validity in FOL is undecidable. To overcome this, a fragment of FOL called Description Logics can be used for representing CMs. As an example, Berardi, Calvanese, and De Giacomo (2005) and Van der Straten et al. (2003) rely on the transformation of UML models into description logics. As opposed to FOL, subsets of description logics, which can be used for the semantic consistency of only a restricted subset of class diagrams, have decidable inference mechanisms. By exploiting the services of description logic inference engines, for example ICOM (2000), various kinds of checks for properties can be performed, among them the consistency of a class diagram.
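The flavor of inconsistency discussed here, disjoint sibling classes forced to share an instance, can be detected even without a full description logic reasoner once the assertions are made explicit. A toy sketch (the assertion encoding and names are invented, loosely following the Missile/Launcher example):

```python
# Toy sketch: detect a disjointness violation of the kind discussed
# around Figure 15.2 (encoding invented for illustration).
# Siblings under a common generalization are assumed to be disjoint.
disjoint_pairs = [("Missile", "Launcher")]
instance_of = {          # instance -> classes asserted for it
    "obj1": {"Missile", "Launcher"},   # forced by conflicting assertions
    "obj2": {"Missile"},
}

def disjointness_violations(instance_of, disjoint_pairs):
    """Instances asserted to belong to two disjoint classes at once."""
    return sorted(i for i, cls in instance_of.items()
                  for a, b in disjoint_pairs
                  if a in cls and b in cls)

print(disjointness_violations(instance_of, disjoint_pairs))  # ['obj1']
```

A description logic reasoner does the same job for the general case, including violations that only follow indirectly through subsumption.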
In another approach (Caplat 2006), the inference engine Sherlock, linked to a UML CASE tool, is used. In this work, models are built using a UML modeling tool with tags and constraints. As a lighter alternative, instead of first describing the MOF (Meta Object Facility), the authors chose to describe the UML metamodel directly in the inference engine's language. Models are then expressed in terms of this metamodel, the UML metamodel and generic rules are added, and finally the UML model is loaded and checked.
Dupuy (2000) proposed generating formal specifications in the Z language, with proof obligations, from UML diagrams. This is done automatically with the RoZ tool. UML notations and formal annotations reside together: the class diagram provides the structure of the Z formal skeleton, while details are expressed in forms attached to the diagram. Either OCL or Z-Eves constraints are used. The Z-Eves theorem prover is then used to validate a given diagram.
Similarly, Marcano and Levy (2002) describe an approach for analysis and
verification of UML/OCL models using B formal specifications. In this work,
a systematic translation of UML class diagrams and OCL constraints of a
system into a B formal specification is given. They propose to manipulate
a UML/OCL model and its associated B formal specification in parallel. At
first a B specification is derived from UML class diagrams. Then, OCL con-
straints of the model are automatically translated into B expressions. Two
types of constraints are taken into account: invariants specifying the static
properties, and pre-/post-conditions of operations specifying the dynamic
properties. The objective is to enable the use of automated proof tools avail-
able for B specifications in order to analyze and verify the UML/OCL model
of a system.
Andre, Romanczuk, and Vasconcelos (2000) presented a translation of UML class diagrams into algebraic specifications in order to check consistency. They aim to discover inconsistent multiplicities in a class diagram and deal with the important concepts of UML class diagrams: class, attribute, association, generalization, association constraints, and inheritance. The theorem prover used discovers some of the inconsistencies automatically; others require the intervention of the user.
tool that assists in automating the validation process and consistency check
algorithm.
Recently, Gagnon et al. (2008) presented a framework supporting for-
mal verification of concurrent UML models using the Maude language. In
spite of its relatively limited scope of applicability, the interesting aspect of
this research is that both static and dynamic features of concurrent object-
oriented systems depicted in UML class, state and communication diagrams
are considered, unlike the majority of similar studies that adopt a single
perspective.
Lilius and Paltor (1999), for example, developed vUML, a tool for automatically verifying UML models. UML models are translated into the Process Meta Language (PROMELA) and model-checked with the SPIN model checker. The behavior of the objects is described using UML statechart diagrams. The user of the vUML tool needs to know neither SPIN nor PROMELA. If the verification of the model fails, a counterexample described in UML sequence diagrams is generated. The vUML tool can check that a UML model is free of deadlocks and livelocks, and that all the invariants are preserved. In general, the translation employed is not trivial.
Other verification tools exist, each implementing a particular kind of semantic property checking (Statemate-Magnum 2007, Tabu 2004, Eshuis and Wieringa 2004, Schinz et al. 2004) and adopting a particular formalism. Hence, complexity and semantic correspondence problems remain to be tackled.
to offer more procedural support in which scenarios are derived from defect
types. A scenario describes how to find the required information to reach a
possible defect.
Few studies have been published in the area of the inspection of UML models. Travassos et al. (2002) describe a family of software reading techniques for the purpose of defect detection in high-level object-oriented designs represented using UML diagrams. This method is a type of perspective-based reading for UML design inspection and can be considered as following the line of techniques discussed by Basili et al. (1996). Object-Oriented Reading Techniques consist of seven different techniques that support the reading of different design diagrams. The method is composed of two basic phases. In the horizontal reading phase, UML design artifacts such as class, sequence, and statechart diagrams are verified mainly for interdiagram consistency. In the vertical reading phase, design artifacts are compared with requirements artifacts, such as use case descriptions, for design validation. Hence most of the properties checked in these studies are related to validation, and the main artifact considered is a software design rather than a conceptual model.
An important book on UML quality assurance (Unhelkar 2005) describes quality properties within and among each diagram type, along with checklists for UML quality assurance. The foundation for the quality properties is set by a discussion of the nature and creation of UML models. This is followed by a demonstration of how to apply verification and validation checks to these models from three perspectives: syntactical correctness, semantic meaningfulness, and aesthetic symmetry. Quality assurance is carried out within three distinct but related modeling spaces: (1) the model of the problem space (CIM in MDA terms), (2) the model of the solution space (platform-independent model), and (3) the model of the background space (platform-specific model). This makes it easier for inspectors to focus on the appropriate diagrams and quality checks corresponding to their modeling space. Although the CIM (Computation Independent Model) is considered separately and verification and validation checklists in different categories such as aesthetics, syntax, and semantics are provided, most of the checklist items are related to completeness. Items related to verification are mostly syntax, static semantic, or simple cross-diagram dependency checks.
model under consideration (Garth et al. 2002). Moreover, many of the studies based on transformation to formal languages are restricted to one or two types of diagrams; hence only certain aspects, for example certain dynamic aspects via Petri Nets, are analyzed. The formalism also restricts the types of properties that can be checked.
On the other hand, when a UML-based notation is used for conceptual mod-
eling, mapping the UML diagrams into a formal notation raises the issue of
semantic correspondence. Besides, most formal techniques assume at least
a predefined degree of completeness in models. However, conceptual models,
unlike design models, are developed in a sketchy manner at the initial phase
of requirements elicitation and may be incomplete in various ways that are
difficult to determine in advance.
Furthermore, as conceptual models are in general not executable, it is
not easy to use dynamic techniques either. Conceptual models are used
primarily as a means of communication, and the term conceptual inher-
ently implies tractable abstraction levels and size. Consequently, tech-
niques such as walkthroughs and inspections can be used rigorously for
assuring conceptual model quality. It may also be cost effective to inte-
grate the verification tasks with the validation tasks that require human
interpretation.
Figure 15.4 summarizes the advantages and disadvantages of both the for-
mal approaches and informal approaches for CM verification. An inspection
approach may be preferred to a formal approach due to various advantages:
First, informal techniques are easy to use and understand, and their
application is straightforward. Because checklists and guidelines are the
main sources, they can be performed without any training in mathematical
software engineering. Inspections may be very effective if applied
rigorously and with structure; they are relatively less costly and can be
used at any phase of the development process.

Figure 15.4
Comparison of inspections to formal verification for UML-based CM.

UML-Based Conceptual Models and V&V 399

Hence, a systematic
and holistic approach, rather than using formalisms, may provide significant
practical results. In the following section, an appropriate inspection process
is described to assure the quality of conceptual models in a notation derived
from UML.
The second one is formed by inheriting B from D rather than D from B. In this case,
since D is a type of C and C is a type of A, B becomes a type of A by transitivity.
However, by the disjointness constraint, B cannot have instances that C has; hence this
again forms a contradiction. Such situations should be reviewed.
Figure 15.5
Patterns developed based on strength of relations, generalization, and transitivity.
1.5 Generalization with aggregation: This pattern is based on the observation that utilization
of different types of relations for the same concepts may be a source of redundancy
or contradiction. In this case, if there is a generalization relation between A and B
and an aggregation or a composition from A to B is also defined, this forms a very rare
pattern (e.g., a chicken-and-egg kind of ontology). So, this should be identified as a
redundancy warning to be reviewed.
1.6 Disjoint or overlapping with aggregation: This pattern is based on the disjointness and
overlapping constraints, which can be defined on a set of generalization relations. There
are two main cases. The first case occurs when classes B and C are disjoint, but there is a
composition or aggregation relation between them. This forms a possible contradiction
because it is equivalent to saying that B and C have no common instances but B is com-
posed of C. Note that if B is composed of C and only C, this pattern will result in a con-
tradiction. If there were other classes that B is composed of and which are not inherited
from A, this pattern would not cause a contradiction. The second case occurs when classes
B and C are overlapping, that is, they have common instances, and there is a composition
relation between B and C. Here, the overlapping constraint becomes a redundancy. The first
case should be reviewed.
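The first case of this pattern (a disjoint pair of classes related by composition or aggregation) reduces to a simple set lookup. A minimal sketch in Python; the class names and the pair-based model representation are illustrative assumptions, not the chapter's tooling:

```python
def find_disjoint_composition_issues(disjoint_pairs, compositions):
    """Flag class pairs that are declared disjoint yet related by composition
    or aggregation: a possible contradiction to be reviewed (pattern 1.6)."""
    issues = []
    for (whole, part) in compositions:
        if (whole, part) in disjoint_pairs or (part, whole) in disjoint_pairs:
            issues.append((whole, part))
    return issues

# Illustrative model: B and C are disjoint, yet B is composed of C.
disjoint = {("B", "C")}
compositions = [("B", "C"), ("B", "D")]
print(find_disjoint_composition_issues(disjoint, compositions))  # [('B', 'C')]
```

As the pattern description notes, a match is only a *possible* contradiction; the flagged pair still has to be validated with the subject-matter expert.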
2.2 Contradiction by asymmetry: This pattern is based on the asymmetry property of
directed relations, such as composition, aggregation, and generalization. For classes
A and B, all the classes of the inheritance hierarchies of A and B should be checked for
identification of this pattern. In this case, the utilization of a directed relation
between classes in opposite directions may be signaled as a possible contradiction
and should be reviewed.
2.4 Association constraint (XOR) with association: This pattern is developed considering
the possible constraints that can be used on a set of association relations. These
are XOR, NAND, or similar constraints. If any association relation between B and C is
defined, then an immediate contradiction arises, because B and C can never exist at the
same time due to the XOR constraint. Hence, these situations should be identified
as contradictions.
Figure 15.6
Patterns developed based on asymmetry and deep inheritance.
2.5 Inherited constraint: This pattern is developed based on the concept of deep
inheritance. In this case, it is assumed that the multiplicities should be inherited
from the base class. Hence, the multiplicity range of the base class should be wider
than that of the inherited class. If this kind of structure is encountered, the lowest (l)
and highest (h) multiplicities of B and C must be checked. If "l(B_C) ≤ l(C_B) and h(B_C) ≥
h(C_B) and l(A_B) ≤ l(A_C) and h(A_B) ≥ h(A_C)" ≠ TRUE, then the multiplicities of B and C form
a contradiction.
2.6 Inherited constraint II: This pattern is similar to "inherited constraint I"
(example: A = ground vehicle, B = tyres). If this kind of structure is encountered,
the lowest and highest multiplicities of A and C and of B and D must be checked. If
"l(B_A) ≤ l(D_C) and h(B_A) ≥ h(D_C) and l(A_B) ≤ l(C_D) and h(A_B) ≥ h(C_D)" ≠ TRUE,
the multiplicities of B and D or of A and C form a contradiction.
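The multiplicity conditions in patterns 2.5 and 2.6 amount to checking that the inherited association's multiplicity range is contained in the base association's range. A minimal sketch, treating the "*" upper bound as infinity; the example multiplicities (0..1 versus 1..*) are illustrative assumptions:

```python
INF = float("inf")  # models the '*' upper multiplicity bound

def range_within(base, derived):
    """Return True if the derived (low, high) multiplicity range lies inside
    the base range; False signals a possible contradiction (patterns 2.5/2.6)."""
    return base[0] <= derived[0] and base[1] >= derived[1]

# Base association end labeled 0..1, inherited association end labeled 1..*:
print(range_within((0, 1), (1, INF)))   # False: flag multiplicities for review
print(range_within((0, INF), (1, 3)))   # True: consistent with the base range
```

The same containment test is applied once per association end, which is why patterns 2.5 and 2.6 each list four comparisons.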
Table 15.1
Structural Diagram Inspection Phase
1. Check syntactical errors such as omissions, missing attributes, and name clashes, based on
the syntactic rules.
2. Look for deficiency patterns in the class model.
2.1 Look for a match with each pattern for a contradiction or a redundancy. Consider the
transitive closure of the relations for pattern matching.
2.2 Depending on the matched pattern validate the issue with the SME.
3. Identify complex structures (structures with central classes participating in more than one
relation and/or relationship type) not considered in task 2 by using the semantics of the
modeling elements forming the structure.
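Task 2.1 asks for pattern matching over the transitive closure of the relations. A minimal sketch of the closure computation, assuming relations (e.g., generalization edges) are stored as (source, target) pairs; the class names are illustrative:

```python
def transitive_closure(edges):
    """Naive transitive closure of a binary relation given as (a, b) pairs."""
    closure = set(edges)
    changed = True
    while changed:
        changed = False
        for (a, b) in list(closure):
            for (c, d) in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))  # a relates to d by transitivity
                    changed = True
    return closure

gens = {("D", "C"), ("C", "A")}  # D is-a C, C is-a A
print(("D", "A") in transitive_closure(gens))  # True: D is-a A by transitivity
```

With the closure in hand, deficiency patterns such as 1.4 and 1.6 can be matched against implied relations, not just the ones drawn explicitly in the diagram.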
Table 15.2
Mission Space Diagram Inspection
1. Check syntactic errors such as duplicate names, dangling missions without actors.
2. Check for patterns 1.2 and 2.2 to identify contradictions.
3. Check the <inclusion> and <extends> relations for semantically correct usage.
3.1 Trace and check the relation to the refining task flow diagram of the use case to make
sure they are properly used.
Table 15.3
Task Flow Diagram Inspection Phase Tasks
1. Check for syntactic errors such as dangling nodes and initial nodes with more than one
outgoing transition.
2. Identify decision nodes.
2.1 Check if all flows outgoing from the decision nodes have guards.
2.2 Check the constraints on the guards to make sure that they do not overlap
(overlapping such as a constraint of x ≥ 0 on one guard and x ≤ 0 on the other).
2.3 Check if the guards define a complete set (such as x ≥ 0 and x < 0).
2.3.1 Identify overlapping and incomplete conditions.
3. Identify fork nodes.
3.1 Check if the fork node has only one entrance; if not, make sure that a task flow is not
missed before the flow is joined.
3.2 Check if all the flows from the fork node are joined by a (same) join node
(nonstructurally joined nodes or fork nodes may indicate concurrency problems).
3.2.1. If not, run the flows coming out of the fork node with UML’s activity diagram
(Petri Nets–like) control flow semantics.
3.2.2. Identify livelocks and their causes.
4. Identify join nodes.
4.1 Check if join nodes have only one exit transition.
4.2 If not, it is possible that the join node is placed too early; there may still be a
need for a parallel flow.
4.3 Trace incoming transitions of the join nodes to make sure that all may eventually be
activated.
4.4 If not, identify the causes of deadlock.
5. If the task flow is complex (includes more than one fork node or composite decision nodes)
trace each flow from the start to end.
5.1 Make sure that every task may execute.
5.2 Identify dead tasks.
6. Trace the flows reaching the final nodes.
6.1 Make sure that they do not originate from a fork node.
6.2 If they do, there is a possibility that some activities will terminate abruptly, try to
identify such activities.
7. Identify loops by tracing through transitions.
7.1 Run the localized loop with UML’s activity diagram (Petri Nets–like) control flow
semantics.
7.2 Identify possible livelocks and their causes.
8. Identify activities with <input> and <output> entities.
8.1 Make sure that if tasks use outputs of one another, they also follow the implied
sequence in the control flow, because a produced entity may be an input for another
task, causing that task never to start or preventing parallel flow.
8.2 Identify deadlocks or redundancy.
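For guards expressible as numeric intervals over a single variable, the overlap and completeness checks of steps 2.2 and 2.3 can be done mechanically. A minimal sketch, assuming guards are given as half-open [lo, hi) ranges over an assumed finite domain:

```python
def check_guards(intervals, domain):
    """intervals: half-open [lo, hi) guard ranges that together should
    partition the (lo, hi) domain. Returns (overlapping_pairs, gaps)."""
    ivs = sorted(intervals)
    # Pairwise overlap test for half-open intervals (task 2.2):
    overlaps = [(p, q) for i, p in enumerate(ivs)
                for q in ivs[i + 1:] if q[0] < p[1] and p[0] < q[1]]
    # Sweep for uncovered sub-ranges, i.e., an incomplete guard set (task 2.3):
    gaps, cursor = [], domain[0]
    for lo, hi in ivs:
        if lo > cursor:
            gaps.append((cursor, lo))
        cursor = max(cursor, hi)
    if cursor < domain[1]:
        gaps.append((cursor, domain[1]))
    return overlaps, gaps

# Guards [0, 5) and [3, 10) overlap; nothing covers [10, 12):
print(check_guards([(0, 5), (3, 10)], (0, 12)))
# ([((0, 5), (3, 10))], [(10, 12)])
```

Real guards may involve several variables or non-numeric conditions, in which case the inspector still has to reason about them by hand; this sketch only covers the single-variable interval case used in the table's examples.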
Table 15.4
Interdiagram Inspection Tasks
1. Trace missions and check if they are modeled in task flow diagrams and vice versa.
2. Compare ontology diagrams with corresponding subontology diagrams and make
sure that there is only one subontology diagram for an entity in the upper ontology
diagram.
3. Identify further decomposed tasks in task flow diagrams, make sure there is only
one subtask flow diagram for a super task flow node.
4. Identify <inputs> and <outputs> in nonleaf task flow diagrams.
4.1 Trace <inputs>, <outputs> in the next lower task flow diagram.
4.2 Ensure that there is at least one <input> and/or <output> attached to the next
lower task flow and identify missing <inputs> and/or <outputs> for the next
lower task flow diagram.
5. Identify <input> and <output> entities in leaf task flow diagrams.
5.1 Trace <inputs>, <outputs> entities in the task flow in the upper task flow
diagram.
5.2 Check if there is at least one <input> and/or <output> attached to the upper
task flow and identify missing <inputs> and/or <outputs> in the leaf task flow.
6. Identify extended missions.
6.1 Compare task flow diagrams of the mission with task flow diagram of the
extended mission: the extended task flow diagram should be reachable by only
extracting model elements from extending diagram.
7. Check each <input> and <output> entity in task flow diagrams; a corresponding
entity has to exist in the ontology diagrams.
8. Check that all the actors in mission space diagrams are defined in organization
diagrams.
9. Check if variables used in state chart diagrams are defined as attributes of the
corresponding entity.
10. Check if operations used as transitions in entity state diagrams are defined in the
corresponding entity diagram.
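Interdiagram task 7 (every <input>/<output> entity referenced in a task flow diagram must exist in the ontology diagrams) is a straightforward set difference. A sketch with assumed data structures; the diagram and entity names are illustrative:

```python
def missing_ontology_entities(task_flows, ontology_entities):
    """task_flows: mapping of diagram name -> set of referenced entity names.
    Returns, per diagram, the entities missing from the ontology diagrams."""
    onto = set(ontology_entities)
    return {diagram: refs - onto
            for diagram, refs in task_flows.items()
            if refs - onto}

flows = {"Develop Pointer Information": {"Track", "Pointer"},
         "Target Recognition": {"Track"}}
print(missing_ontology_entities(flows, {"Track"}))
# {'Develop Pointer Information': {'Pointer'}}
```

The same difference-based tracing applies to tasks 4 and 5, comparing the <input>/<output> sets of a decomposed task flow against those of its parent diagram.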
Two modelers, both experienced in UML modeling and the KAMA notation, had
developed a conceptual model for a typical mission scenario. The concep-
tual model consisted of one mission space diagram, one command hierar-
chy diagram, five ontology diagrams, and 46 task flow diagrams at varying
levels of structural decomposition and complexity, and included a total of
179 model elements. The model was in an early stage of the CM development
process (at the first iteration of three review stages) and had been developed
in a sketchy manner. For example, the entities did not have operations
defined, and no entity state diagrams were included. Hence, during the
inspection, only a subset of the inspection tasks could be performed. Semantic
checks with cardinalities were not necessary for any of the structure diagrams
because cardinalities were not used. Similarly, the consideration of entity
state diagram–related properties was also left out of the scope of the
inspection.
The conceptual model inspection was conducted in two main phases. A
review of the conceptual model had already been performed informally
during the conceptual model development phases by the two modelers. Our
inspection process was performed after this review. The defect detection
and reporting were conducted by an inspector. This phase took 20 person-
hours. After the defect detection phase, an inspection meeting for validating
the detected defects was held. The inspector, the modeler, and two software
engineering experts participated in this six-hour meeting. The outputs of
this process were the corrected conceptual model and the verification report.
The main sources of evidence and data for the inspection were the defect
detection documentation and the minutes of the inspection meeting.
Figure 15.7
KAMA command hierarchy diagram with redundancy. [Diagram nodes include a
national command center, battalion command centers, and a station command tool.]
Figure 15.8
KAMA task flow diagram examples. [Shown: main task flow "Develop Pointer
Information" with subtask flow "Compose Communication Intelligence," and main
task flow "Watch the Mission Region" with subtask flow "Target Recognition and
Identification."]
The participants agreed that, although some of the 85 issues signaled minor
problems and some of them were not definitive defects, 39 of the identified
issues included behavioral defects and were qualified as subtle and not easily
detectable in ad hoc reviews. Seventeen of these issues were agreed to be
definitive defects, and 22 issues were identified as definitive incompleteness.
They also agreed that these types of redundancies and contradictions are not
easy to detect and that deficiency patterns could help inspectors to detect
requirements-related issues.
Figure 15.9
[KAMA mission space diagram example: missions such as <<Play sea battle>> and
<<Calculate effect of contextual conditions>>, together with several <<Realize
... missions>> variants (operational, intelligence, logistic, and personnel
related), linked to the actors Sensor, Commander, Weapon, and Platform through
<<Responsible>>, <<Includes>>, and <<Extends>> relations.]
Conceptual Modeling for Discrete-Event Simulation
References
Amalio, N., and Polack, F. 2003. Comparison of formalization approaches of UML
class constructs in Z and Object-Z. In International Conference of Z and B Users (ZB
2003), LNCS 2561. Springer-Verlag.
Ambler, W. S. 2005. The Elements of UML 2.0 Style. Cambridge, MA: Cambridge
University Press.
Andre, P., A. Romanczuk, J. C. Royer, and I. Vasconcelos. 2000. Checking the
consistency of UML class diagrams using Larch Prover. In Proceedings of the
Third Rigorous Object-Oriented Methods Workshop. York, UK: BCS.
Argo. 2002. An open source UML case tool. Retrieved January 1996 from http://
argouml.tigris.org/.
Balci, O. 1997. Verification, validation and accreditation of simulation models.
In Proceedings of the 1997 Winter Simulation Conference, ed. S. Andradóttir,
K. J. Healy, D. H. Withers, and B. L. Nelson. Atlanta, GA: IEEE.
Balci, O. 1998. Verification, validation, and accreditation. In Proceedings of 1998 Winter
Simulation Conference, ed. E. F. Watson, J. S. Carson II, and D. J. Medeiros.
Washington, DC: ACM.
Basili, V. R., S. Green, O. Laitenberger, F. Lanubile, F. Shull, S. Sorumgard, and
M. V. Zelkowitz. 1996. The empirical investigation of perspective-based read-
ing. Empirical software engineering journal 2(1):133–164.
Berardi, D., D. Calvanese, and G. De Giacomo. 2005. Reasoning on UML class
diagrams. Artificial intelligence 168:70–118.
Berenbach, B. 2004. Evaluation of large, complex, UML analysis and design mod-
els. In Proceedings of 26th International Conference on Software Engineering, ICSE.
Washington, DC: IEEE.
Boehm, B. W. 1984. Verifying and validating software requirements and design speci-
fications. IEEE Software 1(1):75–88.
BOM. 2006. Base object model. Retrieved October 2007 from http://www.boms.info/.
Brade D. A. 2004. Generalized process for the verification and validation of models
and simulation results. Dissertation, Fakultät für Informatik, Universität der
Bundeswehr München, Neubiberg.
Briand, L., Y. Labiche, and L. O'Sullivan. 2003. Impact analysis and change
management of UML models. Technical Report SCE-03-01, Carleton University.
In Proceedings of International Conference on Software Maintenance (ICSM).
Washington, DC: IEEE.
Caplat, G. 2006. Sherlock environment. Retrieved April 2006 from http://servif5.
insa-lyon.fr/chercheurs/gcaplat/.
Chang, L. P., D. Jong-Li, P. Lin-Yi, and J. Muder. 2005. Management and control of
information flow in CIM systems using UML and Petri Nets. International jour-
nal of computer integrated manufacturing 18:2–3.
Compatangelo, E., and H. Meisel. 2002. Intelligent support to knowledge manage-
ment: Conceptual analysis of EER schemas and ontologies. In Internal Report,
Dept. of Computing Science, University of Aberdeen, Aberdeen, UK. Retrieved
October 2007 from http://www.csd.abdn.ac.uk/research/conceptool/.
Csertan, G., I. Huszerl, Z. P. Majzik, and A. Patar. 2002. VIATRA: Visual Automated
Transformations for formal verification and validation of UML models. In
Hue, A., Y. San, and Z. Wang. 2001. Verifying and validating a simulation model. In
Proceedings of the 2001 Winter Simulation Conference. Arlington, VA: ACM.
Johnson, P., and D. Tjahjono. 1998. Does every inspection really need a meeting?
Journal of empirical software engineering 3(1):9–35.
Karagöz, A., and O. Demirörs. 2007. Developing Conceptual Models of the Mission
Space (CMMS): A meta-model based approach. In Proceedings of Simulation
Interoperability Workshop (SIW). Orlando, FL: SISO.
Karagöz, A., and O. Demirörs. 2008. A Conceptual Modeling Notation. Unpublished
doctoral dissertation, Middle East Technical University, Ankara, Turkey.
Killand, T., and J. Borretzen. 2001. UML consistency checking. Research Report
SIF8094. Institute for Datateknikk OG Informasjonsvitenskap, Trondheim,
Norway.
Kim, S., and D. Carrington. 2000. A formal mapping between UML models and
Object-Z specification and B. Lecture Notes in Computer Science, 1878, Berlin:
Springer.
Laitenberger, O., and J. M. DeBaud. 2000. An encompassing life-cycle centric survey
of software inspection. Journal of systems and software 50(1):5–31.
Law, A. M., and W. D. Kelton. 1999. Simulation Modeling and Analysis, 3rd ed.
New York: McGraw-Hill.
Lemmers, A., and M. Jokipii. 2003. SEST: SE Specifications Tool-set. In Proceedings of
Fall Simulation Interoperability Workshop. SISO.
Lilius, J., and I. P. Paltor. 1999. vUML: A tool for verifying UML models. Technical
report 272, Turku, Finland: Turku Centre for Computer Science (TUCS).
Lindland, O. I., G. Sindre, and A. Sølvberg. 1994. Understanding quality in concep-
tual modeling. IEEE software 11(2): 42–49.
Litvak, B., S. Tyszberowicz, and A. Yehudia. 2003. Behavioral consistency validation
of UML diagrams. In Proceedings of the 1st International Conference on Software
Engineering and Formal Methods. Brisbane, Australia: IEEE.
Marcano, R., and N. Levy. 2002. Using B formal specifications for analysis and ver-
ification of UML/OCL models. In Workshop on Consistency Problems in UML-
Based Software Development, 5th International Conference on the UML. Dresden,
Germany: IEEE.
Meta Edit. 2007. A case tool for domain specific software development. Retrieved
January 2007 from http://www.metacase.com.
Minas, M. 2002. Specifying graph-like diagrams with DiaGen, Electronic notes
in theoretical computer science 72(2), 102–111, Amsterdam, The Netherlands:
Elsevier.
MOF 2.0. 2004. Meta Object Facility core specification. Retrieved December 2005 from
http://www.omg.org.
Mota, E., M. Clarke, A. Groce, W. Oliveira, M. Falcão, and J. Kanda. 2004. Veri Agent:
An approach to integrating UML and formal verification tools. Electronic notes
in theoretical computer science 95:111–129.
Murata, T. 1989. Petri Nets: Properties, analysis and applications. Proceedings of the
IEEE (77).
NATO. 2007. Verification, validation, and accreditation of federations. Retrieved
November 2007 from http://www.rta.nato.int/search.asp#MSG-019.
Ober, I. 2004. Harmonizing design languages with object-oriented extensions and
an executable semantics. Unpublished doctoral dissertation, Institute National
Polytechnique de Toulouse, Toulouse, France.
OCLE. 2005. OCL Environment. Computer Science Research Laboratory, Babes Boyls
University, Romania. Retrieved December 2006 from http://lci.cs.ubbcluj.ro/
ocle/index.htm.
Ohnishi, A. 2002. Management and verification of the consistency among UML
models. In Proceedings of Workshop on Knowledge-Based Object-Oriented Software
Engineering (KBOOSE), LNCS. Malaga, Spain: Springer.
Open Architecture-ware. 2007. A platform for model driven development. Retrieved
October 2007 from http://www.openarchitectureware.org/.
Queralt, A., and E. Teniente. 2006. Reasoning on UML Class Diagrams with OCL
Constraints. Conceptual Modeling: ER, LNCS. Berlin: Springer.
Pace, D. K. 2000. Simulation conceptual model development. In Proceedings of the
Spring Simulation Interoperability Workshop. Retrieved November 2005 from
www.sisostds.org.
Porter, A. A., L. G. Votta, and V. R. Basili. 1995. Comparing detection methods for
software requirements inspections: A replicated experiment. IEEE transactions
on software engineering 21(6), 563–575.
Poseidon. 2006. UML Case Tool. Retrieved October 2006 from http://www.
gentleware.com/.
Rational. 2004. Rational case tool. Retrieved October 2006 from http://www-306.
ibm.com/software/rational/.
REVVA 2. 2005. VV&A process specification (PROSPEC) user’s manual, v1.3.
Retrieved October 2007 from http://www.revva.eu/.
Sargent, R. G. 2001. Some approaches and paradigms for verifying and validating
simulation models. In Proceedings of the 2001 Winter Simulation Conference.
Arlington, VA: ACM.
Schinz, I., T. Toben, C. Mrugalla, and B. Westphal. 2004. The Rhapsody UML verifica-
tion environment. In Proceedings of Second International Conference on Software
Engineering and Formal Methods (SEFM). Beijing, China: IEEE.
SD Metrics. 2007. List of object oriented design rules. Retrieved December 2007 from
http://www.sdmetrics.com/LoR.html#LoR.
SEDEP. 2007. Synthetic Environment Development and Exploitation Process: Euclid
RTP 11.13. Retrieved December 2007 from http://www.euclid1113.com/.
Sourrouille, J. L., and G. Caplat. 2002. Constraint checking in UML modeling.
In Proceedings International Conference Software Engineering and Knowledge
Engineering (SEKE 2002). ACM.
Sourrouille, J. L., and G. Caplat. 2003. A pragmatic view on consistency checking
of UML models. In Workshop on Consistency Problems in UML-Based Software
Development II, Workshop Materials, ed. L. Kuzniarz, Z. Huzar, G. Reggio, J. L.
Sourrouille, and M. Staron. IEEE.
Statemate-Magnum. 2007. A case tool for UML verification. Retrieved April 2007 from
http://www.ilogix.com/products/magnum/index.cfm.
Tabu. 2004. Tool for the active behavior of UML. Retrieved April 2007 from http://
www.cs.iastate.edu/~leavens/SAVCBS/2004/posters/Beato-Solorzano-
Cuesta.pdf.
Taentzer, G. 2003. AGG: A graph transformation environment for modeling and val-
idation of software. In Proceedings of Application of Graph Transformations with
Industrial Relevance (AGTIVE’03). Springer.
Tanrıöver, Ö. 2008. An inspection approach for conceptual models of the mission space
in a domain specific notation. Unpublished PhD Thesis, Middle East Technical
University, Ankara, Turkey.
Tanrıöver, Ö., and S. Bilgen. 2007a. An inspection approach for conceptual models for
the mission space developed in domain specific Notations of UML. In Software
Interoperability Workshop Papers. Orlando, FL: SISO.
Tanrıöver, Ö., and S. Bilgen. 2007b. An inspection approach for conceptual models
in notations derived from UML: A case study. In Proceedings of Symposium on
Computer and Information Sciences. Ankara, Turkey: IEEE.
Travassos, G. H., F. Shull, J. Carver, and V. R. Basili. 2002. Reading techniques for
OO design inspections. University of Maryland Technical Report, April(OORT
V.3). Retrieved December 2007 from http://www.cs.umd.edu/Library/CS-TR-
4353/CS-TR-4353.pdf.
UML Superstructure. 2005. Unified Modeling Language 2.0 superstructure specifica-
tion. Object Management Group, retrieved December 2005 from http://www.
omg.org/uml/.
Unhelkar, B. 2005. Verification and Validation for Quality of UML 2.0 Models. Hoboken,
NJ: Addison Wesley.
Van Der Straeten, R., T. Mens, J. Simmonds, and V. Jonckers. 2003. Using description
logic to maintain consistency between UML models. In Proceedings of UML 2003:
Modeling Languages and Applications. LNCS 2863. Springer.
Zhao, Y., X. Fan, Y. Bai, H. C. Vang, and W. Ding. 2004. Towards formal verification of
UML diagrams based on graph transformation. In Proceedings of the International
Conference on E-Commerce Technology for Dynamic E-Business. Beijing: IEEE.
Part V
Domain-Specific
Conceptual Modeling
16
Conceptual Modeling Evolution within US
Defense Communities: The View from the
Simulation Interoperability Workshop
Dale K. Pace
Contents
16.1 Introduction...............................................................................................423
16.2 Historical Background..............................................................................425
16.3 Conceptual Model Characteristics and Application Context.............430
16.4 Parallel Paths: RPG Simulation Conceptual Model and FEDEP
Federation Conceptual Model (FCM).....................................................433
16.4.1 Unmet Desire for a Prescriptive Approach..............................433
16.4.2 Functions of Federate and Federation Conceptual
Models........................................................................................... 436
16.4.3 Conceptual Model Content........................................................ 437
16.4.4 Conceptual Model Documentation Format............................. 439
16.5 SIW Conceptual Model Study Group.....................................................440
16.6 Persistent Problems...................................................................................442
16.6.1 Failure to Develop Explicit and Distinct Simulation-
Related Conceptual Models.......................................................442
16.6.2 Diversity of Applications............................................................444
16.6.3 Excessive Expectations for Simulation-Related
Conceptual Modeling.................................................................444
16.6.4 Resource Limitations..................................................................445
16.7 Final Comments and Conclusions..........................................................446
Appendix: Glossary.............................................................................................446
Acknowledgments............................................................................................... 447
References.............................................................................................................. 447
16.1 Introduction
Simulation-related conceptual modeling is a challenging and complex topic.
Insights can be gained about factors influencing development of conceptual
modeling ideas by examining the continuing evolution of simulation-related
* A glossary at the end of the chapter lists acronyms used in the chapter.
Conceptual Modeling Evolution within US Defense Communities 425
European SIW (with 25–100 presentations) whose papers and discussions are
fully integrated into SISO.
Three main streams of conceptual modeling stimulated by DMSO/M&S
CO have interacted with one another, both within SIW and elsewhere, some-
times in competitive ways and sometimes broadening and honing ideas and
concepts for all. One stream is the Conceptual Model of the Mission Space
(CMMS), later renamed Functional Description of the Mission Space (FDMS).
A second stream is simulation conceptual modeling (SCM) as expressed in
the DoD Recommended Practices Guide (RPG) for modeling and simulation
(M&S) verification, validation, and accreditation (VV&A). The third stream is
development of the Federation Conceptual Model (FCM), a conceptual model
for a collection of simulation applications working in concert as embodied in
the DMSO HLA Federation Development and Execution Process (FEDEP).*
This chapter focuses (1) on simulation-related conceptual modeling ideas
reflected by these three streams, most of which were discussed extensively
within SIW, and (2) on where simulation-related conceptual modeling ideas
are within the simulation communities of SIW in early 2009. Even though it
causes a bit of repetition in the chapter, development of conceptual model-
ing ideas in each of these three streams is treated individually. Then how
they merge is discussed. Some of the material mentioned in the following
background section presages points that are addressed more fully later in
the chapter. Material is presented in this way for reader convenience. It
allows each of the chapter sections to be coherent without dependence upon
material in the other sections of the chapter.
The next section of this chapter presents historical background about SCM
evolution within US Defense communities. The section after that addresses
conceptual model implications of the application context. Then a section will
examine the parallel approaches to conceptual modeling by the RPG and
the FEDEP. That is followed by a section that considers what has been done
in the SIW Simulation Conceptual Modeling Study Group (SCM SG) and its
evolution into a SIW Standing Study Group (SSG). A number of persistent
problems related to conceptual modeling are identified and discussed before
conclusions and final comments are presented.
business since long before the days of computer simulation was the basis for
development of most simulations within the US military and Defense com-
munity. This approach involved five basic steps:
Space (FDMS), which had the same meaning as CMMS, in order to emphasize the functional
(vice “conceptual”) nature of the simulations (as desired by the operational community). This
also reduced confusion between FDMS and “conceptual model,” whether conceptual model
was applied to a single simulation (federate) or to a collection of simulations functioning
together as a federation in a distributed simulation.
By the late 1990s, there was considerable confusion about conceptual models
as they pertain to simulation. There were four main reasons for the con-
fusion. First was use of the words conceptual model in CMMS, which was
concerned with abstraction of a military mission space from authoritative
sources in EATI terms. Second was the idea of the conceptual model as the
link between simulation requirements and design, as it was being developed
for the Web-based Recommended Practices Guide (RPG) for M&S VV&A by a
team under DMSO direction. Third was the conceptual model idea being
developed for the FEDEP to standardize development processes for HLA
federations because it was realized that although the Federation Object
Model (FOM) could ensure communication compatibility within the federa-
tion, it did not ensure representational compatibility among the federates.
The FCM became the mechanism to ensure representational compatibility
within a federation. All three of these simulation-related conceptual model
ideas were percolating within the DMSO and SIW* communities. Fourth, other ideas about conceptual modeling, such as the database-oriented ones associated with the Journal of Conceptual Modeling, were also being
communicated and discussed. This confusing variety of connotations for the
term conceptual model continues to this day (Druid et al. 2006). Resolution of
the differences in connotations for simulation-related conceptual modeling
becomes easier when application context is brought into the picture, as will
be done later in this chapter.
Timelines can provide perspective on conceptual modeling related to simu-
lation, at least within the US military arena. All work under DMSO direction
was oriented primarily toward simulation by or for US Defense communities.
HLA had been directed by senior DoD leadership to be the architecture for
distributed simulation within DoD (Kaminski 1996). Work on CMMS began
in the mid-1990s, and as noted earlier the name changed to FDMS about
2000, but the concept stayed the same. The CMMS/FDMS idea migrated to the Knowledge Integration Resource Center (KIRC) around 2002, which seems to have been where FDMS went after an attempt to integrate it with the Defense Modeling and Simulation Resource Center. Personnel changes
and funding decreases caused the DMSO-sponsored CMMS/FDMS effort
to atrophy. The most substantial direct continuation of the FDMS idea seems to have been at the Swedish Defence Research Agency (FOI). Their evolution of the FDMS idea is called the Defence Conceptual Modeling
Framework (DCMF) (Kabilan and Mojtahed 2006).
* SIW is the primary venue where various parts of IEEE Standard 1516 relative to HLA imple-
mentations were developed prior to their balloting for acceptance as an IEEE standard.
Conceptual Modeling Evolution within US Defense Communities 429
* The FEDEP extends beyond US military simulation since it is the subject of a NATO standard-
ization agreement, STANAG 4603: Modeling and Simulation Architecture Standards for Technical
Interoperability: High Level Architecture (HLA).
† Simone Youngblood also has chaired the SIW VV&A Forum since SIW began. Previously she
based) for the M&S development or modification may be developed from the
conceptual model. This kind of conceptual model permits a simulation design
that fully captures the M&S requirements so that the simulation will have
the capability to satisfy simulation objectives in its intended applications.
For a legacy simulation that was constructed without an explicit conceptual model, the simulation conceptual model has a different function. It is created from a full description of the M&S implementation, so that the appropriateness (or limitations) of M&S applications may be assessed from it for situations that are not tested directly. For such a legacy simulation, the
simulation conceptual model provides a solid basis for decisions about mod-
ifications to the simulation.
Figure 16.1 from the RPG illustrates these two perspectives on simulation
conceptual models, one for new simulations or simulation modifications, and
the other for legacy simulations developed without an explicit conceptual
model. As shown in Figure 16.1, the simulation conceptual model has three
primary components: the simulation context, the mission space, and the sim-
ulation space. “The simulation context provides authoritative information
about the user and problem domains to be addressed in the simulation based
on the M&S requirements of the intended application” (RPG Build 3 2006).
Thus, most of the CMMS/FDMS information pertinent to a simulation as the
“first abstraction of the real world” becomes part of the simulation context
in the RPG simulation conceptual model. This information establishes constraints and boundary conditions for the simulation concept that guides articulation of the detailed specifications that focus the simulation design.

Figure 16.1: Simulation conceptual model components.
In early discussions of the RPG simulation conceptual model, questions
were raised about the need for both mission space and simulation space parts of
the conceptual model. It was decided that it would be very helpful to sepa-
rate the representational aspects of the conceptual model (i.e., the mission
space) from those aspects of the conceptual model concerned with simulation
implementation (i.e., the simulation space). This separation of representational
aspects of the conceptual model from implementation aspects (such as being
required to run on particular hardware or operating systems, run in real-time
or some multiple of real-time, support particular kinds of display systems,
etc.) has proved very helpful, especially when representational compatibil-
ity of various simulations (federates) to be used in a distributed simulation
(federation) has to be assessed. It permits focus on representational issues:
what level of resolution and accuracy is expected from the representation,
what elements are treated explicitly within the conceptual model, what are
the assumptions and pedigrees of the algorithms selected, etc.
In the spirit of maximizing implementation independence of the concep-
tual model, the simulation space is restricted to implementation implications
from the requirements that the simulation must satisfy. Other implementa-
tion decisions about the simulation are left to simulation design and are not
part of the simulation conceptual model; although in practice simulation con-
ceptual models often have design elements included in them beyond what
is essential to satisfy M&S requirements fully. Obviously the kind of simula-
tion (live, virtual, or constructive)* as well as the amount of implementation-related requirements will have major impacts on the extent of material in the
simulation space portion of the simulation conceptual model.
In diagrams of earlier versions of the FEDEP (HLA versions 1.x prior to
publication in IEEE Std 1516.3) (Lutz 2003), there seemed to be a significant
difference in connotation for the FCM and the simulation conceptual model
of the RPG because the FCM came before federation requirements and drove
them in a sense, whereas the simulation conceptual model of the RPG comes
after simulation requirements and is driven by them. However, the differ-
ence was only apparent and not a real difference in substance. The apparent
difference arose because the FEDEP used the term “federation objectives”
(which preceded the FCM and drove it as input to it) in the way that the RPG
used “M&S requirements” (which drive the simulation conceptual model).
The FEDEP used the term “federation requirements” (which followed the
FCM and were shown in “old FEDEP” diagrams as an output from the FCM)
* In the early-to-mid 1990s, US defense simulation communities began to use the terms live, virtual,
and constructive to indicate aspects of simulation implementation. Live meant actual military
forces and systems such as tanks, aircraft, ships, and personnel were involved in the simula-
tion. Virtual meant simulators, such as those used to train aircraft or tank crews, were involved in the simulation. Constructive meant the simulation was contained completely in
computer code and did not involve either real systems or simulators.
in the way that the RPG used “M&S specifications” (the detailed informa-
tion that enables a simulation design to be developed that fully satisfies M&S
requirements).
The current version of the FEDEP (in IEEE Standard 1516.3-2003) removes
the “apparent” difference by having conceptual analysis that is driven by
federation objectives lead to the FCM, which in turn leads to federation
design. “Federation requirements” come out of conceptual analysis in paral-
lel to the conceptual model and drive assessment of federation results. This
makes the conceptual model connotations for the simulation conceptual
model (oriented toward federates) compatible with the connotation for the
FCM, which addresses a collection of federates used together in a federation.
The conceptual model in the FEDEP is a description of “what the [simulation
or federation] will represent, the assumptions limiting those representations,
and other capabilities needed to satisfy the user’s requirements” (IEEE Std
1516.3-2003).
The basic function of the simulation-related conceptual model has been
clearly identified as (1) the link between objectives/requirements and sim-
ulation specifications/design and (2) a vehicle for effective communication
among the various simulation stakeholders and other interested parties.
This is true both for individual simulations (federates) and for combinations
of simulations (federations).
Conceptual modeling issues identified below, which need consideration
once simulation-related conceptual modeling functions are established, will
be addressed later in this chapter:
Table 16.1
Selected Defense Simulation Varieties and Diversities (each characteristic spans two extremes)
• Time/Progress Method: continuous simulation vs. discrete-event simulation
• Facility/Run-time Constraints: "live" simulation with military systems and people in the loop that must be run in real-time vs. computer-code-only simulation with no run-time constraints
• Level of Detail/Aggregation: first-principles physics code vs. simulation with aggregate representation of large (theater-level) military forces
• Simulation Application Domain: simulation of policy, financial, or personnel matters vs. simulation of product flow on a manufacturing floor
• Mechanisms Represented: simulation of atmospheric transport and diffusion for a chemical or biological agent vs. simulation of disease progression in humans at various levels of resolution
• Capacity for Human Interaction with the Simulation While It Runs: training simulator for helicopter pilots with interactive visualization vs. constructive simulation with no interactive capabilities
Note: The spectrum of simulations shown here illustrates why a prescriptive approach to conceptual model development is beyond current capabilities if the approach is to apply to all simulation varieties. The potential simulation differences indicated would significantly impact a conceptual model for the simulation. These differences affect how the conceptual model should be developed, what it should contain, and how it should be documented. For example, the conceptual model for a missile seeker simulation with a facility, such as an anechoic chamber, has to address control of facility temperature and humidity so that reliable results can be obtained. Such capabilities are not required for constructive simulations that are comprised only of computer software.
users and analysts’ manuals, etc., if the information were documented at all.
Existence of an explicit conceptual model creates a particular item where
anyone involved with simulation development or use can find the informa-
tion needed for assessment of simulation appropriateness for a particular
application, whether used as an individual simulation by itself or in a federa-
tion with other simulations. Determination of compatibility among federates
in a federation is always an important consideration.
Table 16.2
Example List of Information Included in a Simulation Conceptual Model
1) Simulation descriptive information
• model identification (e.g., version and date)
• POCs
• model change history
2) Simulation context (per intended application)
• purpose and intended use statements
• pointer to M&S requirements documentation
• overview of intended application
• pointer to FDMS and/or other sources of application domain information
• constraints, limitations, assumptions
• pointer to referent(s) and referent information
3) Simulation concept (per intended application)
• mission space representation (simulation elements & simulation development description)
• simulation space functionality
4) Simulation elements, including
• entity definitions (entity description, states, behaviors, interactions, events, factors,
assumptions, constraints, etc.)
• process definitions (process description, parameters, algorithms, data needs,
assumptions, constraints, etc.)
5) Validation history, including
• M&S requirements and objectives addressed in V&V effort(s)
• pointer to validation report(s)
• pointer to simulation conceptual model assessment(s)
6) Summary
• existing conceptual model limitations (for intended application)
• list of existing conceptual model capabilities
• conceptual model development plans
Source: Based on Department of Defense Modeling and Simulation Coordination Office, VV&A
Recommended Practices Guide, RPG Build 3.0, September 2006.
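Table 16.2 is a documentation checklist rather than a formal schema, but projects that maintain conceptual model metadata electronically sometimes capture it as a structured record. A minimal sketch follows; the field names are illustrative choices of ours, not part of the RPG:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SimulationConceptualModel:
    """Illustrative record loosely following the Table 16.2 outline."""
    # 1) Simulation descriptive information
    model_id: str                                   # e.g., name, version, date
    pocs: List[str] = field(default_factory=list)   # points of contact
    change_history: List[str] = field(default_factory=list)
    # 2) Simulation context (per intended application)
    purpose: str = ""
    requirements_ref: str = ""                      # pointer to M&S requirements docs
    assumptions: List[str] = field(default_factory=list)
    # 3)-4) Simulation concept and elements
    entities: List[str] = field(default_factory=list)
    processes: List[str] = field(default_factory=list)
    # 5) Validation history
    validation_reports: List[str] = field(default_factory=list)
    # 6) Summary
    limitations: List[str] = field(default_factory=list)
```

Such a record makes the "particular item where anyone involved with simulation development or use can find the information" discussed above directly machine-readable.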
make it both much more difficult to develop an appropriate FCM and more difficult to perform V&V on the federation. This is because information
needed to support assessment of representational capability of federates and
to support V&V of the federation may be difficult or impossible to discover
for federates without well-done conceptual models for those federates.
* A SISO study group typically is established for a limited time and is expected to evolve into a product-oriented group or into a Standing Study Group (SSG). An SSG within SISO is “established to represent a specific community or national group, to mature a potential standard …
SSGs may have an indefinite life span.” (SISO Web site, http://www.sisostds.org).
* For example, the Modeling and Simulation Professional Certification Commission (M&SPCC)
under the auspices of the National Training and Simulation Association (NTSA) is involved in such efforts.
that a simulation conceptual model can provide (especially for those areas
in which data are limited or lacking) will create the financial incentive for
always having an explicit conceptual model as an artifact of simulation
development.
Successful simulation developments and modifications with explicit and
distinct conceptual models demonstrate viability of including such concep-
tual models in simulation development and modification. It is time for simu-
lation developers to move beyond the simulation development equivalent of
spaghetti code, which is an appropriate analogy for simulations developed
without explicit conceptual models.
Appendix: Glossary
AIAA American Institute of Aeronautics and Astronautics
ASME American Society of Mechanical Engineers
CMMS Conceptual Model of the Mission Space
DCMF Defence Conceptual Modeling Framework
DEVS Discrete Event System Specification
DIS Distributed Interactive Simulation
DMSO Defense Modeling and Simulation Office
DoD United States Department of Defense
DSB Defense Science Board
Acknowledgments
The author appreciates helpful and constructive reviews of draft material for
this chapter by Simone Youngblood and Jack (“Jake”) Borah.
References
About SISO. 2009. Simulation Interoperability Standards Organization (SISO) Web
site, http://www.sisostds.org, accessed on March 18, 2010.
AIAA. 1998. American Institute of Aeronautics and Astronautics AIAA G-077, Guide for
Verification and Validation of Computational Fluid Dynamics Simulation.
ASME. 2006. American Society of Mechanical Engineers Guide V&V 10-2006, Guide
for Verification and Validation in Computational Solid Mechanics.
Blair, M., and M. Love. 2002. Scenario based affordability assessment tool. NATO
RTO AVT Symposium on Reduction of Military Vehicle Acquisition Time and Cost
through Advanced Modelling and Virtual Simulation, 22–23 April 2002, Paris, France
(RTO-MP-089).
Borah, J. 2005. Simulation Conceptual Modeling (SCM) final report, submitted to
SISO Standards Activities Committee. SISO Web site, http://www.sisostds.org,
accessed on March 18, 2010.
Borah, J. 2003. Simulation conceptual modeling study group gets rolling. Simulation Technology Magazine 6(3).
Borah, J. 2002. Conceptual modeling—How do we move forward?—The next step.
Paper 054 in Fall 2002 Simulation Interoperability Workshop.
Department of Defense Modeling and Simulation (M&S) Master Plan, DoDD
5000.59-9P, October 1995.
Department of Defense Modeling and Simulation Coordination Office. 2006. VV&A
Recommended Practices Guide, RPG Build 3.0. http://vva.msco.mil/ (accessed
January 26, 2009).
DMSO. 1999. Defense Modeling and Simulation Office, High Level Architecture
Federation Development and Execution Process (FEDEP) Model, Version 1.5,
December 8, 1999.
Distributed Interactive Simulation (DIS). 1995. A glossary of modeling and simulation
terms for distributed interactive simulations. Institute for Simulation and Training,
University of Central Florida.
Druid, L., K. Johansson, P. Stahl, and P. Asplund. 2006. Methods and tools for sim-
ulation conceptual modeling, Paper 06E-SIW-029 in 2006 European Simulation
Interoperability Workshop.
Firat, C. 2001. A knowledge based look at Federation Conceptual Model develop-
ment. Paper 024 in European 2001 Simulation Interoperability Workshop.
Heavey, C., and J. Ryan. 2006. Process modelling support for the conceptual model-
ling phase of a simulation project. In Proceedings of the 2006 Winter Simulation
Conference, 801–808.
Hollenbach, J. W. and W. L. Alexander. 1997. Executing the modeling and simula-
tion strategy, making simulation systems of systems a reality. Winter Simulation
Conference 1997 Proceedings: 948–954.
IEEE Std 1516.3-2003, Recommended Practice for High Level Architecture (HLA)
Federation Development and Execution Process (FEDEP).
IEEE Std 1516.4-2007. 2007. IEEE Recommended Practice for Verification, Validation,
and Accreditation of a Federation: An Overlay to the High Level Architecture
Federation Development and Execution Process.
Kabilan, V., and V. Mojtahed. 2006. Introducing DCMF-O: Ontology suite for defence
conceptual modelling. Paper 028 in European 2006 Simulation Interoperability
Workshop.
Kaminski, P. 1996. DoD USD (AT&L) Memorandum, subject: DoD High-Level
Architecture (HLA) for Simulations, September 10, 1996.
Krut, T., and N. Zahman. 1996. Domain Analysis Workshop Report for the Automated
Prompt Response System Domain, Special Report CMU/SEI-96-SR-001, May 1996.
Lewis, R. O., and G. Q. Coe. 1997. A comparison between the CMMS and the con-
ceptual model of the federation. Paper 97F-SIW-001 in Fall 1997 Simulation
Interoperability Workshop.
Lutz, B. 2003. IEEE 1516.3: The HLA Federation Development and Execution Process
(FEDEP). Paper 03E-SIW-022 in 2003 European Simulation Interoperability
Workshop.
Mayer, R. J. and R. E. Young. 1984. Automation of simulation model generation from
system specifications. In Proceedings of the 1984 Winter Simulation Conference,
570–574.
Metz, M. 2000. Comparing the Joint Warfare System (JWARS) conceptual model to
a conceptual model standard. Paper 129 in Fall 2000 Simulation Interoperability
Workshop.
Mojtahed, V., M. Lozano, P. Svan, et al. 2005. DCMF: Defence Conceptual Modeling
Framework. Swedish Defence Research Agency (FOI) Report FOI-R--1754–SE,
November 2005.
NASA Tech Briefs. 2006. Adaptive Modeling Language and Its Derivatives. June 1, 2006.
http://www.techbriefs.com/content/view/116/34/ (accessed August 2009).
Oren, T. I. 2005. Toward the body of knowledge of modeling and simulation. Paper
2025, Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC)
2005.
Pace, D. 2001. Impact of federate conceptual model quality and documentation
on assessing HLA federation validity. Paper 014 in European 2001 Simulation
Interoperability Workshop.
Pace, D. 2000. Conceptual model development for C4ISR simulations. In 5th
International Command and Control Research and Technology Symposium (ICCRTS),
24–26 October 2000, Canberra, Australia.
Pace, D. 1999. Development and documentation of a simulation conceptual model.
Paper 017 in Fall 1999 Simulation Interoperability Workshop.
Robinson, S. 2006a. Issues in conceptual modelling: Setting a research agenda. In
Proceedings of the Third Operational Research Society Simulation Workshop (SW06),
165–174. Birmingham, UK: Operational Research Society.
Robinson, S. 2006b. Conceptual modeling for simulation: Issues and research require-
ments. In Proceedings of the 2006 Winter Simulation Conference, 792–800.
Schlesinger, S. 1979. Terminology for model credibility. Simulation 32(3): 103–104.
Sheehan, J., T. Prosser, H. Conlay, et al. 1998. Conceptual Models of the Mission Space
(CMMS): Basic concepts, advanced techniques, and pragmatic examples. Paper
127 in Spring 1998 Simulation Interoperability Workshop.
Tanriover, R., and S. Bilgen. 2007. An inspection approach for conceptual models in
notations derived from UML: A case study. In 22nd International Symposium on Computer and Information Sciences (ISCIS) 2007.
Zeigler, B. P. 1999. A theory-based conceptual terminology for M&S VV&A. Paper 064
in Spring 1999 Simulation Interoperability Workshop.
17
On the Simplification of Semiconductor
Wafer Factory Simulation Models
Contents
17.1 Introduction............................................................................................... 452
17.2 Related Work............................................................................................. 452
17.2.1 Basis of the New Approach.......................................................454
17.2.2 Predicting Fab Behavior Over Time......................................... 455
17.2.3 Predicting Cycle Times.............................................................. 455
17.3 New Approaches: An Introduction........................................................ 456
17.3.1 Complex Model........................................................................... 456
17.3.2 Required Characteristics for Calibrating the
Simple Model............................................................................... 457
17.3.3 Computing Distributions.......................................................... 458
17.4 Predicting the Characteristic Curve Using the Delay Approach...... 458
17.4.1 Characteristic Curve................................................................... 459
17.4.2 Cycle Time Distributions........................................................... 460
17.5 Predicting the Cycle Time Distribution with the Interarrival
Time Approach.......................................................................................... 460
17.5.1 Modeling Interarrival Times..................................................... 461
17.5.1.1 Interarrival between All Lots.................................. 461
17.5.1.2 Interarrival between Lots with All
Combinations of Products........................................ 462
17.5.1.3 Interarrival between Lots of the Same Product.... 462
17.5.2 Cycle Time Distribution.............................................................464
17.5.3 Characteristic Curve................................................................... 465
17.5.4 Minimizing Adjustment Time.................................................. 465
17.5.4.1 Nonlinear Regression............................................... 465
17.5.4.2 Regression over Regression...................................... 466
17.5.4.3 Application of the Methods...................................... 466
17.5.5 Modeling Overtaking................................................................. 466
17.5.6 Disadvantages............................................................................. 467
17.6 Conclusions................................................................................................ 469
References.............................................................................................................. 469
17.1 Introduction
In a globalized, competitive world focused on productivity, the main goal in semiconductor wafer manufacturing is to maximize the output of the fabrication facilities (in short, wafer fabs) under due-date constraints. In such an optimized environment, it is essential to be able to predict fab behavior after a breakdown of critical machines and to determine how to meet due dates in such a situation. To that end, simulation is a very important tool.
Simulation can also be used to test various parameters of machines to find
better settings (e.g., as in Rose 2003).
Modeling a manufacturing environment in all of its detail results in complex models, which have the big disadvantage that simulation studies can become very time consuming. Therefore, we developed several approaches to build simple models that lead to shorter simulation runtimes.
It is well known that a high degree of simplification of a complex fab model leads to relatively high deviations in the fab characteristics of interest but also to a very short simulation time (e.g., Hung and Leachman 1999). In contrast, a low degree of simplification leads to low deviations in the characteristics but a higher simulation time.
When a simulation conceptual model is designed, the model should reproduce the behavior of the real world related to the model objectives (Robinson 2008). Objectives are, for example, to maximize the throughput or to minimize the inventory level in a fab. Consequently, the simulation model must
mimic the behavior of the real world concerning these characteristics to a
sufficient degree. In addition, the model should be as simple as possible to
reduce the development and simulation time. This chapter focuses on models with a very high degree of simplification, and we show different methods for reaching a sufficient degree of accuracy with respect to different model objectives.
In the next section, we give an overview of existing approaches, both with a high and a low degree of simplification. In the third section, we describe the complex wafer fab model, which is used to estimate the parameters of the simple models, and the procedure for calibrating the simple models. Then we present two variants of a simple model. The first focuses on a good prediction of the average lot cycle times, whereas the second focuses on predicting lot cycle time distributions. We also provide some results and discuss the limitations of the approaches. In the last section, we draw conclusions.
simplifications of existing models. To that end, the modeler removes scope and detail from a given model or represents its components in a simplified manner.
Most authors work with one or more of the following approaches: removing
unimportant components of the model, using random variables to replace parts
of the model, considering less detail for the range of variables in the model, and
combining components of the model into new and simpler components.
The main idea of our approach to reduce the complexity of the model for a
semiconductor wafer fab is to replace machines in the fab with delay elements,
i.e., with random variables representing large parts of the production plan of
a lot (e.g., Hung and Leachman 1999). This means modeling a few machines in detail and replacing the processing steps a lot would take at the deleted machines with dummy machines. These dummy machines mimic the delay the lots would incur at the replaced steps. The main issue is to delete as many machines as possible, with the objective of reducing the complexity of the fab and minimizing the number of operations during simulation, while the simple model behaves as much like the complex one as possible. Highly utilized machines have a big influence on the fab and the behavior of the lots, and should not be
considered for deletion. In Jain and Lim (1999b), different levels of detail have
been tested. One idea is to model only the bottleneck machine group in detail.
Alternatively, more of the highly utilized machines can be modeled if the agreement between the behavior of the simple and the complex model is inadequate. Our approaches focus on the highly simplified bottleneck-only modeling approach to maximize the savings in simulation time.
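As a toy illustration of the bottleneck-only idea (not the authors' implementation; all parameter values and function names here are invented), each lot can be made to alternate between a single random delay element, standing in for all deleted machines, and one FIFO bottleneck server:

```python
import heapq
import random

def simulate_delay_loop(n_lots=50, n_visits=5, mean_delay=4.0,
                        bottleneck_time=1.0, seed=42):
    """Minimal sketch of a bottleneck-only model: each lot repeatedly
    passes through an exponentially distributed delay loop (all deleted
    machines) and a single FIFO bottleneck server."""
    random.seed(seed)
    events = []  # (arrival_time, seq, lot_id, visits_left) at the bottleneck
    for lot in range(n_lots):
        t = random.expovariate(1.0 / mean_delay)  # first trip through the loop
        heapq.heappush(events, (t, lot, lot, n_visits))
    server_free = 0.0
    completion = {}
    seq = n_lots  # tie-breaker for heap entries
    while events:
        t, _, lot, visits = heapq.heappop(events)
        start = max(t, server_free)        # wait in the bottleneck queue
        server_free = start + bottleneck_time
        if visits > 1:                     # back into the delay loop
            t_next = server_free + random.expovariate(1.0 / mean_delay)
            heapq.heappush(events, (t_next, seq, lot, visits - 1))
            seq += 1
        else:
            completion[lot] = server_free  # lot leaves the fab
    return completion
```

In the actual approaches described below, the delay distribution is not chosen freely but fitted to delays measured in the complex model.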
There are different ways to set the processing times of the dummy machines.
Rose (1998) uses exponential distributions, whereas Hung and Leachman (1999)
try static values and quartile-uniform distributions. The quartile-uniform variant assumes the distribution is uniform between quartile points. Calculating delay values using this method is faster than with Rose's distribution method. Our approaches in this chapter use Rose's variant because the exponential distributions best match the distributions in the complex model.
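Rose's exponential variant amounts to a one-parameter fit per dummy step. A minimal sketch (function names are ours; it assumes per-step delay samples measured in the complex model are available):

```python
import random

def fit_exponential_rate(observed_delays):
    """Method-of-moments fit of an exponential distribution to the delays
    a lot incurred at the replaced steps: rate = 1 / mean delay."""
    mean_delay = sum(observed_delays) / len(observed_delays)
    return 1.0 / mean_delay

def sample_dummy_delay(rate, rng=random):
    """Draw one processing time for a dummy machine."""
    return rng.expovariate(rate)

# Example: delays (in hours) observed at one dummy step.
rate = fit_exponential_rate([2.0, 4.0, 6.0])   # mean 4.0 -> rate 0.25
```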
The next problem is the dynamic adaptation to different utilizations of the
fab. Peikert, Thoma, and Brown (1998) try to adapt the dummy delays by cal-
culating the raw processing time of the dummy steps and multiplying it with
the flow factor (cycle time divided by the raw processing time) of the lots at
a specific utilization. This method may be acceptable, but in scenarios where different parts of the fab have differently utilized machines, this approach may distort the results considerably. Further dynamic adaptation
approaches are not available in the literature so far. In this chapter, we extend
Rose’s approach with dynamic distributions to adapt to different release rates
of lots into the fab, or in a more general sense to different fab workloads.
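The flow-factor adaptation described above can be stated directly in code (a sketch with illustrative numbers, not Peikert, Thoma, and Brown's implementation): the raw processing time of a dummy step is stretched by the flow factor observed at a given utilization.

```python
def flow_factor(cycle_time, raw_processing_time):
    """Flow factor: cycle time divided by raw processing time."""
    return cycle_time / raw_processing_time

def adapted_dummy_delay(raw_step_time, cycle_time, raw_processing_time):
    """Stretch the raw time of a dummy step by the lot flow factor
    observed at the current fab utilization."""
    return raw_step_time * flow_factor(cycle_time, raw_processing_time)

# Example: a lot with 10 days raw processing time and a 15-day cycle time
# has flow factor 1.5, so a 2.0-day dummy step is stretched to 3.0 days.
```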
The intention of the application of simple models is to match some or ide-
ally all important fab performance characteristics of the complex model. In
the related work, different types of simple models are tested with regard to
some of these characteristics. Jain, Lim, and Low (1999a) split a fab into a few
independent parts and use the bottleneck only approach for each part. They
focus on the average cycle time and compare a detailed fab model with their
reduced model. The simple models fail to approximate the average cycle time.
Peikert, Thoma, and Brown (1998) also use a bottleneck-only approach. Their
interest is to assist the design of a wafer fab, with the aim of understanding the fab's behavior and optimizing operator deployment and the usage of dispatch
rules. Hung and Leachman (1999) use different levels of model reduction
and predict the cycle time of lots with an acceptable accuracy.
However, none of these approaches provides a sufficient adaptation to a
change in lot release rates. In this chapter, we extend the previous approaches
of one of the authors to overcome this weakness. This approach is as simple
as possible and provides a high simulation speed-up. The related work can
be found in Rose (1998, 1999, 2000a, 2000b, and 2002) and will be described in
the following sections.
Figure 17.1: Simple model with delay distribution in the loop.
Figure 17.2: WIP evolution over time for the FIFO, SPTF, critical ratio, and slack time dispatch rules.
Figure 17.3: Characteristic curve (flow factor versus workload) for the complex and the simple model.
Figure 17.4: Simple model with number of lots in the loop.
real wafer fab. There are some limitations concerning the operators of the
bottleneck machine group. If this group is not responsible solely for the bottleneck machines, the operators have to be deleted, because no machines other than the bottleneck machines are available in the simple model. This procedure is necessary in our case. One solution to avoid this
problem could be to restrict a particular group of operators in the com-
plex model to the bottleneck machine group. If this is the case, they can
be modeled with all characteristics in the simple model, too. Alternatively,
it may be possible to compute availability times of operators at the bottle-
neck (depending on the utilization of the fab) and use these values to adjust
operator characteristics in the simple model. However, we did not consider
these alternatives. The second limitation is that products that are not pass-
ing through the bottleneck must be deleted. In our fab this applies to three
products so that six products remain in our fab model. Deleting products
in a fab, results in different characteristics of the fab, e.g., in different cycle
times of all lots due to the lower utilization. Therefore, we have deleted
these three products in our complex fab to solve this problem. A possible
alternative could be that additional workcenters have to be included into the
simple model.
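Deciding which products to delete amounts to a simple route check: a product stays only if its route visits the bottleneck workcenter. A sketch of this filter, where the product names, routes, and workcenter names are invented for illustration:

```python
routes = {  # product -> sequence of workcenters visited (illustrative)
    "P1": ["clean", "litho", "etch", "litho", "implant"],
    "P2": ["clean", "etch", "implant"],
    "P3": ["clean", "litho", "etch"],
}
BOTTLENECK = "litho"

# keep only products whose route passes through the bottleneck workcenter;
# the others would be deleted from both the simple and the complex model
kept = {p: r for p, r in routes.items() if BOTTLENECK in r}
```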
[Figure 17.5: Single run to generate the delay distributions. Lots in loop over simulation time (days).]
[Figure 17.6: Characteristic curve. Flow factor over workload for the complex model and the simple model with static and with dynamic distributions.]
17.4.1 Characteristic Curve
The characteristic curve is shown in Figure 17.6. Concerning the mean cycle
time, the new approach matches the complex fab model to a high degree.
Only at high workloads does the deviation increase slightly.
[Figure 17.7: Cycle time distribution. Relative quantity over cycle time (days) for the complex model and the simple model.]
the dummy queue, and the machine after this queue releases the lots depending
on the interarrival time characteristics at the bottleneck queue. Instead of
delay distributions, we now have to determine the appropriate array of
interarrival time distributions of lots arriving from the loop at the queue of
the bottleneck machine group. This approach still uses dynamic distributions.
This is necessary because the cycle time estimates of an interarrival time
model without dynamic adaptation to the workload will not match the
expected ones: the model will run full of lots or low on lots. This is caused by
the indirect modeling of the delay in the loop using interarrival times. The
interarrival times change significantly with a different workload of the fab,
whereas the delay in the loop changes only slightly. We have chosen an
interval width of one lot in the loop for our experiments, because this
approach is much more sensitive to deviations in the distributions than the
delay approach.
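The dynamic selection described above, one interarrival time distribution per number of lots currently in the loop with an interval width of one lot, can be sketched as a lookup table of empirical samples. The class and the sample values below are illustrative assumptions, not the authors' implementation:

```python
import random

class DynamicInterarrival:
    """One empirical interarrival sample set per lots-in-loop count
    (interval width = 1 lot); sampling picks the set matching the
    current loop population."""
    def __init__(self, samples_by_count, seed=42):
        self.samples = samples_by_count      # {lots_in_loop: [times, ...]}
        self.rng = random.Random(seed)

    def draw(self, lots_in_loop):
        # fall back to the nearest recorded count if this one was never seen
        if lots_in_loop not in self.samples:
            lots_in_loop = min(self.samples,
                               key=lambda k: abs(k - lots_in_loop))
        return self.rng.choice(self.samples[lots_in_loop])

# e.g. measured in the complex model: high WIP -> short interarrival times
dists = DynamicInterarrival({80: [0.30, 0.35, 0.32],
                             120: [0.20, 0.22, 0.21]})
t = dists.draw(119)   # uses the samples recorded for ~120 lots in loop
```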
[Figure 17.8: Interarrival between all lots (one distribution for the arrivals at the bottleneck).]
[Figure 17.9: Interarrival between lots with all combinations of products (one distribution per combination of product types).]
[Figure 17.10: Interarrival between lots of the same product.]
[Figure 17.11: Absolute cycle time distribution. Relative quantity over cycle time (days) for the complex model and the simple model versions 1 and 3.]
[Figure 17.12: Relative cycle time distribution. Relative quantity over flow factor for the complex model and the simple model versions 1 and 3.]
Table 17.1
Deviation of the Expectation Value, the Variance and the Relative
Divergence of the Expectation Values
Product Moment Version 1 Version 3
All Products Expectation −0.58 0.78
Variance −0.60 3.73
Relative deviation −2.8% 3.8%
Product 1 Expectation 2.04 1.61
Variance −0.11 6.56
Relative deviation 11.9% 9.4%
Product 2 Expectation −1.07 0.23
Variance −0.08 0.88
Relative deviation −5.7% 1.2%
Product 3 Expectation −0.76 2.21
Variance −0.09 6.45
Relative deviation −4.0% 11.7%
Product 4 Expectation −0.87 0.02
Variance −0.15 1.06
Relative deviation −4.2% 0.1%
Product 5 Expectation 1.24 1.14
Variance 0.16 1.75
Relative deviation 5.3% 4.9%
Product 6 Expectation −1.92 1.51
Variance 0.04 1.80
Relative deviation −7.3% 5.7%
f(x) = −ae^(−bx) + 1,  a, b, x ∈ ℜ+
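A function of this saturating form can be fitted to measured points by ordinary least squares after a log transform, since 1 − f(x) = ae^(−bx) implies ln(1 − f(x)) = ln a − bx. A sketch with synthetic data generated from known parameters (the data values are invented for illustration):

```python
import math

def fit(xs, ys):
    """Fit f(x) = -a*exp(-b*x) + 1 by linear least squares on
    ln(1 - y) = ln(a) - b*x."""
    n = len(xs)
    zs = [math.log(1 - y) for y in ys]
    mx, mz = sum(xs) / n, sum(zs) / n
    slope = (sum((x - mx) * (z - mz) for x, z in zip(xs, zs))
             / sum((x - mx) ** 2 for x in xs))
    b = -slope                      # slope of the log-line is -b
    a = math.exp(mz + b * mx)       # intercept of the log-line is ln(a)
    return a, b

xs = [0.1 * i for i in range(1, 10)]
ys = [1 - 0.9 * math.exp(-2.5 * x) for x in xs]   # exact curve, a=0.9, b=2.5
a, b = fit(xs, ys)                 # recovers a ≈ 0.9 and b ≈ 2.5
```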
[Figure 17.13: Characteristic curve. Flow factor over workload for the complex model, the simple model (delay), and the simple model versions 1 and 3.]
[Figure 17.14: Simple model with modeled overtaking. Relative quantity over flow factor for the complex model, the simple model, and the simple model with overtaking.]
17.5.6 Disadvantages
The main problem of the interarrival time approach is that the delay of the
lots in the loop is modeled only indirectly, with interarrival times. The
average delay of a lot in the loop can be estimated by multiplying the number
of lots in the loop by the average interarrival time. There are scenarios
where the loop runs low on lots (e.g., if a big bottleneck breakdown occurs).
Consequently, the delay in the loop is low. In this case the lots accumulate
at the queue of the bottleneck. In general, this is not a problem, but in
the case of a non-FIFO order of lots at the bottleneck queue some lots may
speed up and leave the fab early. This leads to a number of finished products
that is considerably higher than in the complex model. One example
is shown in Figure 17.15 for the CR dispatch rule with different target flow
factors (the mean cycle time is 1.66). This is a scenario where the interarrival
time approach does not work. For such investigations, the delay-based
approach should be used.
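The delay estimate used in this argument (number of lots in the loop times the average interarrival time) can be illustrated with a quick calculation; the numbers here are invented:

```python
# If 100 lots are circulating in the loop and lots leave the loop for the
# bottleneck every 0.2 days on average, a lot entering the loop now waits
# behind the 100 lots already there:
lots_in_loop = 100
mean_interarrival = 0.2                       # days between loop departures
est_delay = lots_in_loop * mean_interarrival  # ~20 days estimated loop delay
```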
Table 17.2
Deviation of the Expectation Value, the Variance and the Relative
Divergence of the Expectation Values: Overtaking Included in Modeling
Product Moment w/o Overtaking w/Overtaking
All Products Expectation −0.58 0.20
Variance −0.60 −1.34
Relative deviation −2.8% −1.0%
Product 1 Expectation 2.04 0.42
Variance −0.11 −0.14
Relative deviation 11.9% 2.4%
Product 2 Expectation −1.07 −0.26
Variance −0.08 0.12
Relative deviation −5.7% −1.4%
Product 3 Expectation −0.76 0.26
Variance −0.09 −0.18
Relative deviation −4.0% −1.4%
Product 4 Expectation −0.87 0.01
Variance −0.15 −0.02
Relative deviation −4.2% 0.0%
Product 5 Expectation 1.24 0.77
Variance 0.16 −0.02
Relative deviation 5.3% 3.3%
Product 6 Expectation −1.92 −1.62
Variance 0.04 0.32
Relative deviation −7.3% −6.2%
[Figure 17.15: Different dispatch rules during a big breakdown. Simulation time (days) over WIP for FIFO and CR with target flow factors 1.46 and 1.66.]
17.6 Conclusions
In this chapter, we presented a simple model for semiconductor wafer fabs
or similar fab types in which a reentrant lot flow occurs. Such approaches can
help to reduce the complexity of conceptual models. We introduced two new
approaches that solve several problems of previous simple model approaches.
In particular, concerning the characteristic curve (depicting the flow factor
over the utilization of the fab) and the products' cycle time distributions, the
new approaches were considerably better. In addition, we developed a method
to shorten the model calibration time, which facilitates the application of the
large number of distributions that might be necessary for some versions of
the simple models.

The two approaches can also be used to model parts of a fab. If the
behavior of the complete fab is too complex to be captured with only one
simple model, the discussed techniques can perhaps be combined to model
the complete fab. For example, if the delays of the first loops differ to a high
degree from those of later loops, the fab can be modeled with two or more
simple models that are connected in series. Alternatively, each loop could be
modeled separately. The discussed overtaking techniques can also be modified
to mimic the complex fab more closely. For example, the overtaking behavior
between different loops can be investigated. However, this makes the simple
model more complex, and the adjustment and simulation time will increase.

Despite the successful improvements in modeling several fab characteristics,
there are still open issues, especially concerning the behavior of the
interarrival time approach. For example, the delay approach leads to very
good performance predictions in the case of a changed product mix, whereas
the interarrival approach might fail. It also matters which of the interarrival
versions is used. The first version turns out to be very good in most of the
scenarios. The second one might be better, but the long adjustment time is a
big disadvantage. The interarrival approach needs distributions with a high
degree of accuracy, whereas this is not very important for the delay approach.
Furthermore, the application of a non-FIFO dispatch rule is currently not
possible with the interarrival approach. In contrast, the delay approach can
be used with other dispatch rules.
References
Fowler, J., and J. Robinson. 1995. Measurement and Improvement of Manufacturing
Capacity (MIMAC) Final Report. Sematech Inc.
Hung, Y.-F., and R. Leachman. 1999. Reduced simulation models of wafer fabrication
facilities. International journal of production research 37:2685–2701.
Jain, S., C.-C. Lim, and Y.-H. Low. 1999. Bottleneck based modeling of semiconductor
supply chains. International Conference on Modeling and Analysis of Semiconductor
Manufacturing, 340–345.
Jain, S., and C.-C. Lim. 1999. Criticality of detailed modeling in semiconductor supply
chain simulation. In Proceedings of the 1999 Winter Simulation Conference,
888–896.
Peikert, A., J. Thoma, and S. Brown. 1998. A rapid modeling technique for measur-
able improvements in factory performance. In Proceedings of the 1998 Winter
Simulation Conference, 1011–1015.
Robinson, S. 2008. Conceptual modeling for simulation Part I: Definition and require-
ments. Journal of the operational research society 59:278–290.
Rose, O. 1998. WIP evolution of a semiconductor factory after a bottleneck workcenter
breakdown. In Proceedings of the 1998 Winter Simulation Conference, 997–1003.
Rose, O. 1999. Conload: A new lot release rule for semiconductor wafer fabs. In
Proceedings of the 1999 Winter Simulation Conference, 850–855.
Rose, O. 2000a. Estimation of the cycle time distribution of a wafer fab by a simple
simulation model. In Proceedings of the SMOMS '99e, 133–138.
Rose, O. 2000b. Why do simple wafer fab models fail in certain scenarios? In Proceedings
of the 2000 Winter Simulation Conference, 1481–1490.
Rose, O. 2002. Some issues of the critical ratio dispatch rule in semiconductor manu-
facturing. In Proceedings of the 2002 Winter Simulation Conference, 1401–1405.
Rose, O. 2003. Accelerating products under due-date oriented dispatching rules
in semiconductor manufacturing. In Proceedings of the 2003 Winter Simulation
Conference, 1346–1350.
Rose, O. 2007. Improving the accuracy of simple simulation models for complex pro-
duction systems. In Proceedings of the 2007 INFORMS Simulation Society Research
workshop.
Part VI
Conclusion
18
Conceptual Modeling: Past, Present,
and Future
Contents
18.1 Introduction............................................................................................... 473
18.2 Foundations of Conceptual Modeling................................................... 475
18.3 Conceptual Modeling Frameworks........................................................ 476
18.4 Soft Systems Methodology for Conceptual Modeling.........................480
18.5 Software Engineering for Conceptual Modeling................................. 481
18.6 Domain-Specific Conceptual Modeling................................................. 486
18.7 The Current State of Conceptual Modeling and Future Research..... 488
Acknowledgment................................................................................................. 489
References.............................................................................................................. 489
18.1 Introduction
Conceptual modeling is probably the most difficult part of the process
of developing and using simulation models (Law 1991). Despite this fact,
conceptual modeling is largely ignored at conferences and in the litera-
ture. This book addresses this issue by considering the body of research
for the field as it is beginning to develop. In this way it aims to create
a point of reference, highlight current research and identify avenues for
future research.
The objective of this chapter is to provide an overall summary and assess-
ment of the current state of research in conceptual modeling as set out in the
previous chapters, and to highlight the opportunities for future research.
As a starting point for this chapter we will use the research agenda set out
as a list of research themes (shown in Table 18.1) in the editorial of a recent
special issue on conceptual modeling (Robinson 2007). It builds on earlier
literature reviews by Robinson (2006, 2008), and the outcomes of a themed
day on conceptual modeling following SW06 (The Operational Research
Society Simulation Workshop 2006).
Table 18.1
Research Themes for Conceptual Modeling

The Problem/Modeling Objectives Domain (P)
1. Use of "soft OR" as a basis for determining a simulation conceptual model [ch. 4, 9, 10].
2. How best to work with subject matter experts in forming a conceptual model [ch. 4, 5, 7–10].
3. How to organize and structure the knowledge gained during conceptual modeling [ch. 11–15].
4. Alternative sources of contextual data/information for conceptual modeling, including paper, interview and electronic sources.
5. Developing curricula to include conceptual modeling in university and industry courses on simulation [ch. 3, 16].

The Model Domain (M)
1. Identifying dimensions for determining the performance of a conceptual model [ch. 1, 2, 4, 5, 7].
2. Comparing different models in the same problem domain [ch. 2].
3. Studying expert modelers to understand how they form conceptual models [ch. 3].
4. How software engineering techniques might aid simulation conceptual modeling [ch. 8, 11–16].
5. Adopting/developing appropriate model representation methods [ch. 5, 6, 11–16].
6. Exploring methods of model simplification [ch. 1, 2, 17].
7. Identifying, adapting and developing conceptual modeling frameworks [ch. 4–8, 16].
8. Refining models through agreement between the modeler and stakeholders—"convergent design" [ch. 4, 5, 16].
9. Exploring the creative aspects of modeling [ch. 10].
10. Understanding the organizational diffusion and acceptance of models [ch. 4, 5, 8–10, 12, 13].
11. Investigating the impact of other modeling tasks on the conceptual model (iteration in the simulation life cycle).
12. Understanding the effect of throw-away models versus models with longevity—for example, the time spent on conceptual modeling, documentation and organizational diffusion.

Source: Robinson, S., Journal of simulation, 1(3), 2007.
Taking each of Parts I–V of the book in turn (Sections 18.2–18.6), the chapters
in the book are reviewed and cross-referenced to the research themes in Table
18.1. The themes are numbered 1–5 for the problem domain and 1–12 for the
model domain and are referenced by a letter representing the domain (P or
M) and their number in the table. For example, M6 refers to the model domain
theme “Exploring methods of model simplification.” Finally, in Section 18.7,
Table 18.1 is reconsidered in the light of the research reported in this book.
Table 18.2
Conceptual Modeling Frameworks: A Classification

Chapter 4: A framework for simulation conceptual modeling
  Starting points: Little discipline in conceptual modeling; lack of a framework for developing simulation conceptual models of operations systems.
  Domain: Operations systems in business.
  Focus: Identify a sequence of activities; associate guidelines and methods with activities.
  Activities: All.
  Actors: Modelers (novice, experts); clients; domain experts.
  Resources: Ordering and detailing of key activities; guidelines and methods supporting activities; simple means for documenting outcomes of activities.

Chapter 5: Developing participative simulation models—framing decomposition principles for joint understanding
  Starting points: Need for participative engineering & insightful modeling, following from new business configurations and management concepts; implicit modeling of key logistic elements.
  Domain: Manufacturing, supply chains.
  Focus: Highlight foundations of modeling, i.e., decomposition principles relevant for a field of interest; guide and illustrate construction and application of domain related reference architectures for specifying simulation models.
  Activities: Specify model contents.
  Actors: Modelers; stakeholders, especially staff.
  Resources: Decomposition principles for simulation modeling; reference architecture for manufacturing and supply chain simulation; object-oriented notation for model specification.

Chapter 6: The ABCmod conceptual modeling framework
  Starting points: No language for adequately characterizing behavior of Discrete Event Dynamic Systems because of their diversity and complexity; practice of directly leaping into the intricacies of computer software instead of the model.
  Domain: Discrete Event Dynamic Systems.
  Focus: Defining non-software-specific language for specifying Discrete Event Dynamic Systems, clearly separating structural and behavioral aspects; specifying conceptual models at low and high levels of abstraction.
  Activities: Specify model inputs, contents, and outputs.
  Resources: Specification language addressing entities and their behavior, underlying data, model inputs and outputs at both a high and low level of detail.

Chapter 7: Conceptual modeling notations and techniques
  Starting points: No systematic guidance on how to develop a conceptual model.
  Domain: Military, business.
  Focus: Review of approaches, frameworks and methods.
  Activities: All.
  Actors: Modelers; domain experts.
  Resources: Description of approaches; comparison of approaches for the …
modeling tools under the main categories of the ability to describe the dif-
ferent aspects of a discrete-event system, the ease of use and understanding,
the ability to use concepts that would be understood by system personnel,
and the visualization capability. There is a particular focus here on the tool
being easy to use and, with the latter two criteria, on facilitating good com-
munication with and involvement of staff in the organization (M10). From
a review of existing process modeling tools (Ryan and Heavey 2006) they
found that all the tools they looked at had weaknesses in at least some of
these areas. They therefore developed the SAD technique with the aim that
it would perform well under each of the criteria as well as supporting project
teamwork. The technique produces a SAD diagram that includes entities,
resources, states, actions (making up the activities and events), queues, infor-
mation, and the relationships between all these elements. The technique also
has an elaboration language for providing additional information. Ryan and
Heavey have also developed a prototype software application, PMS (Process
Modeling Software), in which SAD models can be built.
Chapter 12 also describes part of one of the five case studies so far carried
out by Ryan and Heavey to evaluate SAD and compares the SAD model with
an IDEF3 model for the same case study. They identify various aspects of
SAD that require further development, including multiple modeling views to
enable alternative conceptual models to be built and compared, and a
step-through facility. It also requires further usage for additional validation
and evaluation.
In Chapter 13, Onggo emphasizes the importance of representing the
conceptual model in a way that can be understood easily by the different
stakeholders (M10). This matches some of the criteria of Ryan and
Heavey in Chapter 12. Starting from the methods for documenting a con-
ceptual model reported in a survey (Wang and Brooks 2007), he categorizes
the methods as textual, pictorial, and multifaceted. The different categories
are discussed and evaluated with examples from a generic hospital simula-
tion project.
Some advantages of textual representation are considered by Onggo to
be speed and flexibility, but with the disadvantages of potential ambiguity,
inability to use mathematical methods for verification, and possible commu-
nication problems depending partly upon how well it is written and tailored
to the stakeholders. Pictorial representation of conceptual models is usu-
ally by diagrams, and activity cycle diagrams, process flow diagrams (using
the business process diagram as an example) and event relationship graphs
are discussed. In general, a picture can be very effective in helping to com-
municate complex information. The multifaceted representation consists of
several elements and these may be a mixture of text and diagrams. Onggo
discusses UML and SysML as examples of this.
He also describes his proposed unified conceptual model (Onggo 2009),
which is a multifaceted approach. For describing the problem he proposes
the use of an objective diagram with the possible addition of a purposeful
activity model for the objectives, an influence diagram for inputs and out-
puts, a business process diagram with text for the system contents to be
included in the model, and text and a data dictionary for data requirements.
The model representation depends on the type of model with an activity
cycle diagram or event relationship graph suggested for discrete-event
simulation, a stock and flow diagram or causal loop diagram for system
dynamics, and a flow chart, business process diagram, or UML activity
diagram for agent-based simulation. However, much more testing of these
methods is required for a more complete evaluation of their usefulness in
different circumstances.
Tolk et al. take a different focus in Chapter 14, concentrating on model
interoperability and composition. They define interoperability for two
systems as meaning "they are able to work together to support a common
objective," so it relates to issues such as software and the exchange of data.
Composability refers to more abstract modeling issues regarding whether
it makes sense to combine models together (e.g., they do not have contra-
dictory assumptions). Based on these definitions they state that “interop-
erability of simulation systems requires composability of conceptual
models.” Therefore conceptual model composability is one of the require-
ments for it to be appropriate and feasible to combine simulation models
or systems together.
Tolk et al. consider that the goal in their area of research is for concep-
tual models to be constructed in a format that enables them to be machine
understandable for automatic reasoning about and combining of these mod-
els. They describe the Levels of Conceptual Interoperability Model (LCIM),
where the addition of each of six successive levels (in this order: technical,
syntactic, semantic, pragmatic, dynamic, conceptual) enables greater inter-
operation. They consider that data engineering addresses the first three of
these levels and part of the pragmatic level. It consists of the administration,
management, alignment, and transformation of data and they discuss each
of these steps. Process engineering deals with the pragmatic and dynamic
layers and, similar to data engineering, Tolk et al. suggest the four steps of
administration, management, alignment, and transformation. Constraint
engineering covers the conceptual level of identifying the assumptions, con-
straints, and simplifications. For each of the three engineering methods (data
engineering, process engineering, constraint engineering) Tolk et al. discuss
what is required for the information to be machine readable. However, they
conclude that “the solutions provided by current standards as described in
this chapter are not sufficient,” indicating scope for plenty of further work
in this area.
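The strict ordering of the six LCIM levels, each presupposing the levels below it, can be captured as a small enum. This sketch is our illustration of the layering, not part of any LCIM tooling:

```python
from enum import IntEnum

class LCIM(IntEnum):
    """Levels of Conceptual Interoperability Model: each level
    presupposes all the levels below it."""
    TECHNICAL = 1
    SYNTACTIC = 2
    SEMANTIC = 3
    PRAGMATIC = 4
    DYNAMIC = 5
    CONCEPTUAL = 6

def supports(achieved: LCIM, required: LCIM) -> bool:
    # two simulation systems can interoperate at `required`
    # only if both have reached at least that level
    return achieved >= required

# per Tolk et al., data engineering addresses the first three
# levels (and part of the pragmatic level)
data_engineering_levels = [lvl for lvl in LCIM if lvl <= LCIM.SEMANTIC]
```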
In Chapter 15, Tanriöver and Bilgen discuss the verification and validation
of conceptual models built using UML. The general relationships are that the
conceptual model can be compared with the real system to assess whether
it is a suitable representation for the purpose of the study (validation), the
built simulation model can be compared with the conceptual model to assess
whether it has been built correctly (verification), and the conceptual model
itself can be examined and tested for certain desirable properties such as
consistency and lack of redundancy.
Tanriöver and Bilgen review the verification and validation literature for
simulation conceptual modeling. They note that the general verification
and validation principles and methods for the different simulation tasks
can be applied, but that there is a lack of literature on the internal verifica-
tion of conceptual models. They also review the literature on verification
and validation of UML models where there are various lists of desirable
properties, and some formal techniques and tools for different aspects of
verification. They advocate a number of advantages of an informal inspec-
tion approach for verification of UML conceptual models rather than for-
mal methods (such as the inspection approach being easier to apply and
understand, and more suitable for the subjective nature of conceptual
models where the assessment may require expert evaluation). They then
develop a systematic inspection process including looking for specific
“deficiency patterns” in the UML diagrams, which might indicate contra-
dictions or redundancy, and carrying out a defined set of inspection tasks.
They apply this to two case studies in the military domain. A considerable
number of issues with the conceptual models are identified by the pro-
cess in both cases, demonstrating the potential usefulness of the method.
Continuing the research in this area could include identifying required or
desirable internal properties for conceptual models in UML or other lan-
guages, further development and testing of the inspection process, adding
a risk perspective to the process, and the development of software tools to
assist the process.
In terms of the research themes of Table 18.1, these five chapters particu-
larly focus on organizing and structuring the information about the concep-
tual model (P3), the use of software engineering techniques (M4), and model
representation methods (M5).
Combining the comments in these chapters, there are various characteris-
tics of software tools that are desirable in some or all circumstances:
• Quick to learn
• Easy to use
• Enable all aspects of the conceptual model to be captured
• Enable the conceptual model to be changed easily
• Enable alternative conceptual models to be compared
• Produce suitable documentation that is easy to understand
• Good visualization features
• Facilitate online collaboration
• Facilitate reuse, interoperability, and composition
• The data are machine readable
Note how an iterative pattern is foreseen for steps 1–3, as there may be no a
priori fit of the simplified model.
In their study, Sprenger and Rose illustrate the need and means for model
simplification in a specific context. The relevance of such approaches is
increasing due to the growing complexity of business configurations. This
suggests room for more elaborate methods of simplification (M6). Typically,
these will start from basic techniques for model simplification (cf. Chapters 1
and 2), which are combined, extended, and/or detailed for a specific domain
of interest. As an essential feature, such methods should include procedures
for testing the fit of the simplified model.
Nevertheless, the discussions in this chapter show that there are many
unanswered questions and opportunities for further research under all the
themes in Table 18.1. Conceptual modeling is a vital step in any simulation
project and research on it has the potential for making great contributions to
the success of simulation studies. We hope that this book will help to inspire
further research on the important topic of conceptual modeling for discrete-
event simulation.
Acknowledgment
Table 18.1 is based on Table 1 in Robinson, S. 2007. The future’s bright the
future’s … Conceptual modelling for simulation!: Editorial. Journal of
Simulation 1(3): 149–152. © 2008 Operational Research Society Ltd. Reproduced
with permission of Palgrave Macmillan.
References
Guru, A., and P. Savory. 2004. A template-based conceptual modeling infrastructure
for simulation of physical security systems. In Proceedings of the 2004 Winter
Simulation Conference, ed. R.G. Ingalls, M.D. Rossetti, J.S. Smith, and B.A. Peters,
866–873. Piscataway, NJ: IEEE.
Kettinger, W.J., J.T.C. Teng, and S. Guha. 1997. Business process change: A study of
methodologies, techniques, and tools. MIS quarterly 21(1): 55–80.
Law, A.M. 1991. Simulation model’s level of detail determines effectiveness. Industrial
engineering 23(10): 16–18.
Mathirajan, M., and A.I. Sivakumar. 2006. Literature review, classification and simple
meta-analysis on scheduling of batch processors in semiconductor. International
journal of advanced manufacturing technology 29: 990–1001.
Nance, R.E. 1994. The conical methodology and the evolution of simulation model
development. Annals of operations research 53: 1–45.
Onggo, B.S.S. 2009. Towards a unified conceptual model representation: A case study
in health care. Journal of simulation 3(1): 40–49.
Pace, D.K. 1999. Development and documentation of a simulation conceptual model.
In Proceedings of the 1999 Fall Simulation Interoperability Workshop. http://www.
sisostds.org, accessed March 19, 2010.
Pace, D.K. 2000. Simulation conceptual model development. In Proceedings of the 2000
Spring Simulation Interoperability Workshop. http://www.sisostds.org, accessed
March 19, 2010.
Robinson, S. 2006. Issues in conceptual modelling for simulation: Setting a research
agenda. In Proceedings of the Third Operational Research Society Simulation
Workshop (SW06), ed. J. Garnett, S. Brailsford, S. Robinson, and S. Taylor, 165–174.
Birmingham, UK: The Operational Research Society.
Robinson, S. 2007. The future’s bright the future’s … Conceptual modelling for simu-
lation!: Editorial. Journal of simulation 1(3): 149–152.
Robinson, S. 2008. Conceptual modelling for simulation part I: Definition and require-
ments. Journal of the operational research society 59(3): 278–290.
Ryan, J., and C. Heavey. 2006. Process modelling for simulation. Computers in industry
57: 437–450.
Shannon, R.E. 1975. Systems Simulation: The Art and Science. Englewood Cliffs, NJ:
Prentice-Hall.
Wang, W., and R.J. Brooks. 2007. Empirical investigations of conceptual modeling
and the modeling process. In Proceedings of the 2007 Winter Simulation Conference,
ed. S.G. Henderson, B. Biller, M.-H. Hsieh, J. Shortle, J.D. Tew, and R.R. Barton,
762–770. Piscataway, NJ: IEEE Computer Society Press.
Willard, B. 2007. UML for systems engineering. Computer standards & interfaces 29:
69–81.
Index
A

ABCmod conceptual modeling framework, 25
  ACD, 137
  activity-oriented modeling approach, 136
  activity-scanning approach, 137
  alternatives
    consumer, 144–145
    group, 144
    queue, 144
    resource, 144
  behavior diagram, 137
  Bigtown Garage, example project, 165
    conceptual model, 168–176
    detailed goals and output, 167–168
    general project goals, 166
    SUI details, 166–167
    SUI overview, 166
  characterization of behavior in
    action constructs, 155–157
    activity constructs, 150–155
  components, 135
  concepts of, 134
  constituents
    activity and action constructs, 138
    behavior constructs, 138
    entities, 138
    structures, 138
  data modules, 162
    derives, 163
    EntityStructureName, 163
    GroupName, 163
    InsertGrp, 163
    InsertQueHead, 163
    leave(Ident), 163
    QueueName, 163
    RemoveGrp, 163
    template for, 163
    terminate, 163
    value, 163
  entity structure
    attributes, 146–149
    identifiers, 145–146
    role and scope, 144–145
    state variables, 149–150
  exploring structural and behavioral requirements
    DEDS description, 139
    discrete-event dynamic system, 138
    features, 140–141
    graphical representation, 140
    mapping process, 139–140
    merchandize area, 138
    structural elements, 140
  features of, 136
  input
    characterizing sequence, 157, 159
    DEDS domain, constituents of, 157
    domain sequence for, 158–159
    features of, 159
    format, 159
    probability distribution functions, 159
    PWC function, 158
    range sequence, 158
    stochastic characterization, 158
    template for, 160
    variables, 158
  life-cycle diagram, 137
  methodology for
    detailed-level formulation, 165
    high-level formulation, 164–165
  model structure
    entity structures and entities, 143–144
  notion of activity, 135
  output, 159
    documentation for, 161
    information, 161
    sample variable, 161
    from simulation experiment, 161
    template for, 162
    trajectory set, 161
    variables, 161
K

KAMA conceptual modeling framework, 183
  command hierarchy diagram with redundancy, 409
KAMA method
  entity state diagrams, 184
  flow diagram for, 185
  knowledge acquisition (KA), 184
  mission space, 184
  mission space diagrams, 184
  task flow diagrams, 184
KAMA notation
  Foundation package, 186
  metamodel diagram of, 186
  metamodel elements, 187
  Mission Space package, 186
  Structure package, 186
KAMA tool, 191
  package hierarchy, 188
  sample mission space, 187–191
  task flow diagram examples, 410
KIRC, see Knowledge Integration Resource Center (KIRC)
Knowledge acquisition (KA), 184
Knowledge Integration Resource Center (KIRC), 428
KnowledgeMetaMetaModel (KM3), 183

L

Levels of conceptual interoperability model (LCIM), 362, 484
levels of interoperation
  conceptual level, 364
  dynamic level, 363
  evolution, 363
  pragmatic level, 363
  semantic level, 363
  syntactic level, 363
  technical level, 363
Lexicon component, 200
Life cycle and model verification, 11
Local intelligence, 116
“Lumped model,” 8–9
  lumping of, 23

M

Mathematical model, 31
MBDE, see Model-Based Data Engineering (MBDE)
MDR standard, see Metadata Registry (MDR) standard
MDSD tool, see Model Driven Software Development (MDSD) tool
Measurement and Improvement of Manufacturing Capacities (MIMAC), 456
Message Handling Centre, software project life cycle and
  cost-effective way, 223–224
  design of, 223
  requirements analysis, 224
  simulation activities disadvantages, 225–226
  simulation design requirements, 226–227
  simulation structure, 224–225
Metadata Registry (MDR) standard
  conceptual domain, 365
  property
    domain, 365–366
    instances, 366
  value domain, 366
Meta Object Facility (MOF), 393
Micro Saint Sharp software, 59
Military simulation modelers, 10
MIMAC, see Measurement and Improvement of Manufacturing Capacities (MIMAC)
Minimizing adjustment time
  application of methods, 466
  nonlinear regression, 465
  regression over regression, 466
Mission space, 182
  sample, 189
    extending missions, 187
    extensionId information, 191
    objectives, 187
    performanceCriteria attribute, 188
    report, 191
    task flow diagrams, 188, 190–191