We characterize a tractable class of feedback decomposition problems to operationalize steady-state, lumped-parameter engineering models of algebraic equations into high-performance numerical simulation software. An algebraic ordering graph captures how each model parameter algebraically depends on the values of other model parameters. Using graph matching techniques, we construct a necessary and sufficient validity test for a set of n equations and n parameters, in the sense that there exists a program to numerically compute a solution to these equations. The sufficiency of this test constitutes a useful and tractable refinement of the conventional condition that there be as many equations as unknown parameters. We operationalize this validity test into a prescriptive matching algorithm for constructing valid algebraic orderings. Our approach accounts for limitations in the symbolic solvability of algebraic equations and for modeler-imposed restrictions. Parameters may be implicitly constrained (by all equations that refer to them), properly constrained (by a unique equation), or over-constrained (by multiple equations). Equations may properly constrain a unique parameter or over-constrain the values of all of their parameters. Interpreting each path through the dependency graph as a sequence of numerical computations and each cycle as a potential feedback loop, we show how to exploit domain-specific properties of physical and algebraic feedback loops to make a hierarchical feedback analysis tractable. Tractability stems from exploiting the sparseness inherent to lumped-parameter modeling and the causality typical of hydro-thermal models.
Decomposability is guaranteed when each feedback loop is identical to a strongly-connected sub-component of the algebraic ordering graph and the feedback loops are mutually disjoint at each level of the hierarchy. Experimental results substantiate that better, faster, and cheaper equation-solving programs result from an inherent reduction in the number of parameters solved simultaneously. We comprehensively model and numerically simulate a two-phase ammonia thermal controller. Further empirical evidence of the robustness of our approach comes from comparing our models to Biswas' model and from a preliminary study of quantitative diagnosis.
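The matching-based validity test described above can be illustrated with a minimal sketch (our own illustration, not the paper's implementation): a set of equations admits a valid algebraic ordering exactly when a perfect matching exists between equations and the parameters each can be solved for. The function and example data below are hypothetical; the augmenting-path search is the textbook bipartite matching technique.

```python
def find_valid_ordering(solvable):
    """solvable: dict mapping each equation name to the set of parameters
    it can be symbolically solved for. Returns a dict equation -> parameter
    (a valid algebraic ordering) or None if no perfect matching exists."""
    match = {}  # parameter -> equation currently assigned to it

    def augment(eq, seen):
        # Try to assign eq a parameter, reassigning other equations if needed.
        for p in solvable[eq]:
            if p in seen:
                continue
            seen.add(p)
            if p not in match or augment(match[p], seen):
                match[p] = eq
                return True
        return False

    for eq in solvable:
        if not augment(eq, set()):
            return None  # over-/under-constrained: no valid ordering exists
    return {eq: p for p, eq in match.items()}

# eq1 can isolate x or y, but eq2 can only isolate y, so eq1 must take x.
ordering = find_valid_ordering({"eq1": {"x", "y"}, "eq2": {"y"}})

# Two equations competing for one parameter: over-constrained, no ordering.
invalid = find_valid_ordering({"e1": {"x"}, "e2": {"x"}})
```

When a perfect matching exists, each path through the resulting dependency graph can be read as a sequence of numerical computations, and each strongly-connected component as a candidate feedback loop for the hierarchical analysis.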
Model-Based Code Generation: Past, Present & Future. Nicolas Rouquette, Gregory Horvath. JPL MathWorks Day, October 29, 2003. Overview: History of Model-Based Code Generation; Past Activities: Deep Space 1's 13th Technology; Present Activities ...
This position paper reflects on experiences and lessons learned from several spacecraft projects at JPL and from JPL's recent collaboration with software architecture researchers at USC and CMU.
Quantitative diagnosis involves numerically estimating the values of unobservable parameters that best explain the observed parameter values. We consider quantitative diagnosis for continuous, lumped-parameter, steady-state physical systems because such models are easy to construct and the diagnosis problem is considerably simpler than that for corresponding dynamic models. To further tackle the difficulties of numerically inverting a simulation model to compute a diagnosis, we propose to decompose a physical system model in terms of feedback loops. This decomposition reduces the dimension of the problem and consequently decreases the diagnosis search space. We illustrate this approach on a model of a thermal control system studied in earlier research.
This chapter reports results from two recent studies of how operational experience with mission-critical product lines can enhance knowledge management for use with their future products. The challenge was how to propagate new requirements knowledge forward in a product line in ways that projects will use. In the first product line, the concern was capture and retention of requirements knowledge exposed by defects that occurred during operations. This led to two mechanisms not traditionally associated with requirements management: feature models extended with assumption specifications (formal), and structured anecdotes of paradigmatic product-line defects (informal). In the second product line, the traditional notion of binding time in a product line did not accurately reflect the timing of project decisions. This led to a definition of product-line binding times that better accommodates the varying requirements of the different missions using the product line. It appears that the practical techniques reported here to build requirements knowledge into software product lines in the spacecraft domain are also useful in other product-line developments.
This paper advocates for a unification of modeling & programming from the perspective of normalized, implementation-neutral database schemas: representing programs and models in terms of irreducible and independent tables. This idea departs from the mainstream of modeling & programming, which typically revolves around Application Programming Interface (API) ecosystems for operational needs and external serialization for interchange needs. Instead, this idea emphasizes an information-centric architecture that separates the structural aspects of language syntax, via normalized schema tables, from the operational aspects of language syntax and semantics, via programs operating on normalized tables or derived table views. Such tables constitute the basis of a functional information architecture unifying modeling and programming, a radical departure from standardizing APIs in a programming fashion or standardizing serialization interchange in a modeling fashion. This paper focuses on the current API-less, serialization-centric modeling paradigm because it is the farthest from a unified functional information architecture; by contrast, functional programming languages, where programs are pure functions and models are pure data, come closest to this kind of unification. This paper first deconstructs the multi-level, reflective architecture for modeling languages defined at the Object Management Group (OMG) based on the Meta-Object Facility (MOF) and the Unified Modeling Language (UML), and subsequently reconstructs several normalized schemas accounting for the information content and organization of the different kinds of resources involved in modeling: libraries of datatypes; metamodels like UML; profiles like the Systems Modeling Language (SysML), which extend metamodels; and models that conform to metamodels, optionally extended with applied profiles.
In this paper we describe the Computer Aided Engineering for Spacecraft System Architectures Tool Suite, or CAESAR for short, a platform for enabling model-based systems engineering (MBSE). CAESAR recognizes that engineers are already likely to use models, but they typically keep these models private, only interpreting model information into documents or presentations that become the project baseline. MBSE needs to enable more automated sharing of information directly between models to ensure model consistency, improve the rigor of the engineering process, and, ultimately, reduce the effort needed to get a clear answer to engineering questions. We explain the features of CAESAR and describe how these features were leveraged in a case study where CAESAR was used to develop a model-based process for spacecraft electrical interface design and harness specification for the Europa Clipper flight project.
Papers by Nicolas Rouquette