Ted Bapty


    Design of Cyber Physical Systems is a complex and challenging endeavor. Cyber Physical Systems by definition span a broad spectrum of design domains -- cyber and multi-physics; design abstractions -- hybrid dynamics, ODEs, PDEs, solid geometry, among others; and multiple design disciplines and tools. The design has to satisfy a large number of often conflicting requirements spanning performance, structural, manufacturability, reliability, maintainability, and cost. Current systems engineering methodologies rely largely on the traditional approach commonly known as the System V method. The V method follows a top-down decomposition of the design into subsystems, partitioning requirements along well-established disciplines and teams while going "down the V". The subsystems are brought together for integration and testing at a later phase while going "up the V". The challenge arises from unanticipated interactions that are "discovered" during integration, requiring costly design iterations. The result has been an untenable trend-line of cost and schedule vs. complexity for military and aerospace systems. Augustine postulated, based on this trend-line, that without a revolutionary design approach, cost and schedule will put new globally competitive military systems out of budgetary reach. We present the OpenMETA toolchain that was created in DARPA's AVM program to address the challenges of complex Cyber Physical Systems design articulated above. OpenMETA implements a design methodology that can be characterized as progressive constraint-based design space refinement, and relies on three core principles, described below: component-based design, design space exploration, and automated requirement-driven analysis. Component-Based Design supports design reuse, leveraging well-designed component models configured in architectures designed to achieve system requirements.
    Design space models allow representation of flexible design decisions, with associated Design Space Exploration tools to find sets of feasible designs. Design spaces allow significant design adaptability, a major goal of AVM, and also minimize the design churn caused by requirement creep, by retaining and carrying forward a large design space through progressive refinement. Instead of changing a single point design as requirements evolve, in the OpenMETA methodology the design space is refined, typically through sub-setting. Executable Requirements support automated evaluation of system metrics across entire feasible design spaces, significantly reducing the cost of system validation. When used with the supported probabilistic methods, functional correctness can be assured within the defined probabilistic bounds that account for the multiple sources of uncertainty in the design process, including but not limited to model uncertainty. This OpenMETA design flow is implemented with a comprehensive set of tools, editors, and frameworks that maximally leverage available best-of-class commercial and open-source engineering tools, while reducing the expertise required in each of these tools via abstraction and design automation. The toolset is built upon Meta Programmable Tools and a Semantic Backplane that support extensibility and semantically precise integration of domains and tools. In OpenMETA, design decisions are made based on models of the components and systems. A significant challenge remains, however, with respect to the validity and fidelity of the models. The Dynamic Data Driven Applications Systems (DDDAS) approach represents an opportunity to address this challenge. The DDDAS methods and infrastructure could allow us to close the loop between the system design process (models and tools) and the operational system.
    Instrumentation and observations from the operational system can be used as feedback to tune the design models to match the observed real-world behavior. The refined models, in turn, can be used to optimize the system. In this workshop, we will present an early exploration of techniques for model adaptation and evolution that are being investigated for integration in the OpenMETA toolchain.
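The feedback loop sketched above can be illustrated with a minimal model-calibration example. The quadratic drag model, the parameter names, and the synthetic "field" data below are all assumptions for illustration and are not part of the OpenMETA toolchain:

```python
import numpy as np

# Toy design model: predicted drive power as a function of speed,
# power = a*v + b*v**3 (rolling resistance + aerodynamic drag).
def predict_power(speed, a, b):
    return a * speed + b * speed**3

# Synthetic "operational" observations standing in for instrumented field data.
speeds = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
observed = predict_power(speeds, 2.0, 0.01)

# The model is linear in (a, b), so a least-squares fit recovers the
# parameters that best match the observed behavior.
A = np.column_stack([speeds, speeds**3])
(a_fit, b_fit), *_ = np.linalg.lstsq(A, observed, rcond=None)
# a_fit ~ 2.0, b_fit ~ 0.01: the model is tuned to match the field data.
```

In the methodology described above, the refined model would feed back into design-space analysis; here the fit simply demonstrates closing the loop between observation and model.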
    Cyber-Physical Systems (CPS) are establishing heterogeneous engineering domains, leading to engineering processes that span multiple design disciplines with separate modeling approaches, design flows, and supporting tool suites. One of the challenges of design automation in CPS is the deep integration of models, tools, and design flows such that design trade-offs across traditionally isolated design disciplines are facilitated. In this paper we give an overview of experience and results gained during the implementation of an experimental design automation tool suite, OpenMETA, created for a complex CPS design challenge in the ground vehicle domain. The focus of the paper is the domain-agnostic methods and tools providing infrastructure for the model- and tool-integration platforms in OpenMETA. We present the arguments that led to the creation of the integration platforms instead of pursuing ad-hoc integration of heterogeneous tools, and provide details on facilitating semantic integration.
    Reliability and safety are important properties in the development of complex cyber-physical systems such as autonomous vehicles. Achieving a reliable autonomous vehicle is a challenging problem, as the unpredictability of the environment demands a reliable design methodology. Additionally, current testing procedures for ADAS features on vehicles are exhausting and time-consuming, so a better testing approach is required. To address these challenges, we propose a co-simulation tool-chain that integrates multiple simulation environments, optimizes the control parameters of the autonomous vehicle software based on metrics, and visualizes the vehicle behavior using a video game engine.
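The metric-based parameter optimization can be sketched as follows. The first-order vehicle model, the single proportional gain, and the tracking-error metric are illustrative assumptions, not the tool-chain's actual models or API:

```python
# Toy closed-loop "co-simulation": a speed controller tracking a target
# speed, scored by integrated squared tracking error.
def simulate(kp, target=20.0, steps=200, dt=0.1):
    v, cost = 0.0, 0.0
    for _ in range(steps):
        v += dt * kp * (target - v)      # first-order closed-loop response
        cost += dt * (target - v) ** 2   # metric: accumulated tracking error
    return cost

# Metric-driven search over candidate control parameters: the optimizer
# picks the gain whose simulated behavior minimizes the metric.
candidates = [0.1, 0.5, 1.0, 2.0, 5.0]
best_kp = min(candidates, key=simulate)
```

A real tool-chain would run each candidate through the integrated simulators and score against several metrics; the structure of the search is the same.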
    Over the past few years, there has been an immense thrust in the field of geolocation, surveillance, tracking, and location-aware systems and services. The advent of compact, low-power DSPs with high processing power has made possible several tasks that were infeasible just a few years ago. TDOA estimation on such energy- and resource-constrained platforms suffers from the lack of a coherent sampling clock. We present two related techniques of Doppler-frequency and time-shift correction. The techniques are formally developed and analyzed, and then compared from an implementation and performance perspective.
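As background, a basic time-difference-of-arrival (TDOA) estimate between two sampled signals can be taken from the peak of their cross-correlation. The Doppler-frequency and time-shift corrections developed in the paper, which compensate for the missing coherent clock, are not reproduced in this sketch; the signal and delay are synthetic:

```python
import numpy as np

def tdoa_samples(x, y):
    """Estimate the delay of y relative to x (in samples) via the
    peak of their full cross-correlation."""
    corr = np.correlate(y, x, mode="full")   # lags -(len(x)-1) .. len(y)-1
    return int(np.argmax(corr)) - (len(x) - 1)

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)                # wideband reference signal
y = np.roll(x, 37)                           # same signal, delayed 37 samples
delay = tdoa_samples(x, y)                   # recovers the 37-sample delay
```

In practice the signals at the two receivers are sampled by independent, drifting clocks, which is precisely why the correction techniques in the paper are needed.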
    Modern high-performance embedded systems, such as Automatic Target Recognition for missiles or dynamic-protocol mobile communications devices, face many challenges. Power and volume constraints limit hardware size. Accurate, high-performance algorithms involve massive computations. Systems must respond to demanding real-time specifications. In the past, custom application-specific architectures have been used to satisfy these demands. This implementation approach, while effective, is expensive and relatively inflexible. As requirements change, hardwired application-specific architectures fail to keep up and become expensive to evolve and maintain. A fixed, application-specific architecture will require significant redesign to assimilate new algorithms and new hardware components. Flexible systems must function in rapidly changing environments, resulting in multiple modes of operation. On the other hand, efficient hardware architectures must match algorithms to maximize performance and minimize r...
    This paper presents three contributions to the challenges of applying the OMG Model Driven Architecture (MDA) to develop and deploy distributed real-time and embedded (DRE) applications. First, we motivate our MDA tool called CoSMIC, which is based on the Model Integrated Computing (MIC) paradigm that provides the intellectual foundation for MDA. Second, we describe how CoSMIC's generative abilities can be used to configure and assemble DRE component middleware required to deploy DRE applications. Third, we ...
    There is a growing interest in the area of Advanced Separation of Concerns (ASOC). This is evident in the numerous workshops on this topic offered recently at past OOPSLA, ICSE, and ECOOP conferences. An example of the work in this area is Aspect-Oriented Programming (AOP) [2]. In AOP, new programming language constructs are provided that permit a better modularization of concerns that crosscut the solution space [Kiczales et al., 01].
    Abstract. Synthesis of source code from models typically proceeds as a direct mapping from each modeling element to the generation of a set of intentionally equivalent source code statements. When a library of components is available, the model interpreter can leverage a larger granularity of reuse by generating configurations of the available components. However, it is difficult to synthesize certain properties described in a model (e.g., those related to Quality of Service) due to the closed nature of the components, as available ...
    3  Scanning the Special Issue on Modeling and Design of Embedded Software, S. Sastry, J. Sztipanovits, R. Bajcsy, and H. Gill
    PAPERS
    11  Hierarchical Modeling and Analysis of Embedded Systems (Invited Paper), R. Alur, T. Dang, J. Esposito, Y. Hur, F. Ivancic, V. Kumar, I. Lee, P. Mishra, G. J. Pappas, and O. Sokolsky
    29  Invisible Formal Methods for Embedded Control Systems (Invited Paper), A. Tiwari, N. Shankar, and J. Rushby
    40  Physics-Based Encapsulation in Embedded Software for Distributed Sensing and Control Applications (Invited Paper), F. Zhao, C. ...
    Implementing image-processing systems can require significant effort and resources due to information volume and algorithm complexity. Autonomic behavior is required for many image processing systems to perform consistently under real-world conditions. Model Integrated Computing ...
    ABSTRACT This report describes Vanderbilt's contribution to the ability to build systems that use decentralized control and fault tolerance techniques to support applications such as large clusters of Micro UAVs or Organic Air Vehicles. The approach of this effort was to analyze the fault management requirements of formation flight for fleets of UAVs, and to develop a layered fault management architecture that demonstrates significant improvement over current technology. The target demonstration was a radio-geolocation system, using 3 to 10 UAV-mounted time-of-arrival measurement nodes and a single base station.
    Dynamically reconfigurable computational devices offer the promise of high speed, low cost, and small form factor by (1) optimizing the architecture for the application, (2) adapting to changing requirements by reallocating hardware, and (3) using low-cost ...
    ... This method of pruning is implemented using a symbolic method known as ordered binary decision diagrams (OBDD). A symbolic representation is built along with a constraint set. System design alternatives that do not satisfy this constraint set may be quickly eliminated. ...
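The pruning step can be illustrated with a toy example. The actual approach encodes the design space symbolically as an OBDD so that infeasible alternatives are eliminated without enumeration; this sketch enumerates a small space explicitly, and the design choices and constraints are invented for illustration:

```python
from itertools import product

# Hypothetical design space: independent alternatives for three choices.
space = {
    "processor": ["dsp", "fpga", "risc"],
    "memory":    ["sram", "dram"],
    "bus":       ["pci", "vme"],
}

def satisfies(design):
    """Toy constraint set: an FPGA requires SRAM; a DSP cannot sit on VME."""
    if design["processor"] == "fpga" and design["memory"] != "sram":
        return False
    if design["processor"] == "dsp" and design["bus"] == "vme":
        return False
    return True

keys = list(space)
candidates = [dict(zip(keys, c)) for c in product(*space.values())]
feasible = [d for d in candidates if satisfies(d)]
# 12 candidates; the constraints eliminate 4, leaving 8 feasible designs.
```

The symbolic OBDD encoding pays off when the space has billions of alternatives: the constraint intersection is computed on the decision diagram rather than on the enumerated designs.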
    This paper presents a model-driven approach for generating quality-of-service (QoS) adaptation in Distributed Real-Time Embedded (DRE) systems. The approach involves the creation of high-level graphical models representing the QoS adaptation policies. The models are constructed using a domain-specific modeling language, the Adaptive Quality Modeling Language (AQML). Multiple generators have been developed using the Model-Integrated Computing (MIC) framework to create low-level artifacts for simulation and ...
    ABSTRACT Aspect-Oriented Domain-Specific Modeling (AODSM) represents the nexus between Aspect-Oriented Programming (AOP) and Model-Integrated Computing (MIC). Recently, research in the area of aspect-oriented design has concentrated on the important issues of notational and diagrammatic representation. However, the research described in this paper has brought the benefits of aspect-orientation to the modeling process itself. This paper describes numerous facets of AODSM, including: domain-specific weavers, the Embedded Constraint Language (ECL), code generation issues within a metaweaver framework, and a comparison between AODSM and AOP. An example of the approach is provided, as well as a description of several future research topics for extending the flexibility within AODSM.
    ABSTRACT This paper presents a contribution to the challenges of manually creating test configurations and deployments for high-performance distributed middleware frameworks. We present our testing tool, based on the Model Integrated Computing (MIC) paradigm, and describe its generative abilities, which can be used to produce many test configurations and deployment scenarios from high-level system specifications through model replication.
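The replication idea can be sketched minimally. The component template and field names below are hypothetical, not the tool's actual modeling language:

```python
def replicate(template, count):
    """Stamp out `count` numbered instances of a component template."""
    return [{**template, "name": f"{template['name']}_{i}"} for i in range(count)]

# High-level specification: one server, N client replicas per scenario.
def deployment(n_clients):
    return {
        "server":  {"name": "echo_server", "port": 9000},
        "clients": replicate({"name": "client", "server_port": 9000}, n_clients),
    }

# Generate test scenarios at several scales from the same specification,
# rather than hand-writing each deployment.
scenarios = [deployment(n) for n in (1, 10, 100)]
```

The generative tool works the same way at the model level: one specification is replicated and varied into many concrete configurations and deployment plans.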
    The report summarizes key accomplishments of the META project, part of the DARPA/TTO Adaptive Vehicle Make (AVM) program. The META project has developed model, tool, and execution integration platforms for the model- and component-based development of ground vehicles.
    Embedded system software development is challenging, owing to a tight integration of the software and its physical environment, which profoundly impacts the software technology that can be applied to constructing embedded systems. Modeling and model-based design are central to capturing all essential aspects of embedded systems. Vanderbilt University's Model Integrated Computing (MIC) tool suite, driven by the recognition of the need for integrated systems and software modeling, provides a reusable infrastructure for model-based design of embedded systems. The suite includes a metaprogrammable model builder (GME), a model-transformation engine (UDM/GReAT), a tool-integration framework (OTIF), and a design-space exploration tool (DESERT). The application of the MIC tool suite in constructing a tool chain for an automotive embedded system (VCP) is presented.
    ABSTRACT Design automation tools evolved to support the principle of "separation of concerns" to manage engineering complexity. Accordingly, we find tool suites that are vertically integrated, with limited support (or even intention) for horizontal integratability (i.e., integration across disciplinary boundaries). CPS challenges these established boundaries and, with them, market conditions. The question is how to facilitate reorganization and create the foundation and technologies for composable CPS design tool chains that enable reuse of existing commercial and open-source tools. In this paper we describe some of the lessons learned in the design and implementation of a design automation tool suite for complex cyber-physical systems (CPS) in the vehicle domain. The tool suite followed a model- and component-based design approach to match the significant increase in design productivity experienced in several narrowly focused homogeneous domains, such as signal processing, control, and aspects of electronic design. The primary challenge in the undertaking was the tremendous heterogeneity of complex CPS, for which comparable productivity gains in domains such as vehicles have not yet been achieved. This paper describes some of the challenges addressed and solution approaches in building a comprehensive design tool suite for complex CPS.
