A transient mathematical model of a packed column used to saturate gas has been developed. Predictions from this model are compared with steady-state and transient experimental data from a pilot-plant-sized saturator. Agreement is excellent. Saturator transient response in exit-gas temperature is found to be extremely fast. 9 figures.
Journal of Systemics, Cybernetics and Informatics, 2004
Intelligent, adaptive interfaces are a prerequisite to elevating computer-based applications to the realm of collaborative decision support in complex, relatively open-ended domains such as logistics and planning. This is because the composition and effective presentation of even the most useful information must be tailored to constantly changing circumstances. Our objective is not only to achieve an adaptive human-machine interface, but also to imbue the software with a significant portion of the responsibility for effectively controlling the adaptation, freeing the user from unnecessary distraction and making the human-machine relationship more collaborative in nature. The foundational concepts of interface adaptation are discussed and a specific logistics application is described as an example.
Adaptive Work-Centered User Interface Technology (ACUITy) is an ontology-based approach to modeling and implementing intelligent user interfaces built on top of Jena. Its potential benefits and underlying technologies are explored in the context of decision support systems.
To fully achieve the benefits of the smart grid, a range of new software applications, components, and improvements to business processes will rely on information emanating from existing and new systems and data sources. These new smart software components will need to interpret business semantics in a common way to ensure that data can be exchanged and shared, and that business intelligence activities can be carried out in an efficient and cost-effective manner. An open, shared information model provides such common semantics. An architecture driven by this information model will allow for reduced integration costs, increased development efficiency, and increased overall system flexibility. It also allows for new application functionality not possible given traditional architectural...
Work domains such as maintenance and logistics planning are characterized by open-ended problem solving and large quantities of heterogeneous and distributed information. Problem-solvers in these domains can benefit from semantic web applications for work-centric decision support. In this paper we describe the need for such technology and the missing links in the current state of the art. We also present Adaptive Work-Centered User Interface Technology (ACUITy), still in its formative stage, as a way to meet this need.
Adaptive Work-Centered User Interface Technology (ACUITy) is an open-source framework and architecture for developing semantically enabled mixed-initiative user interfaces. ACUITy begins to answer some of the challenges presented in the design of a Semantic Web Browser, which we believe should collaborate with the user to determine how to support his or her work by using services available to find, retrieve, visualize and explore resources on the Semantic Web. The browser would also learn from its experience and use that learned knowledge to engage more effectively in future collaborations. To illustrate how such a browser might be developed, we present the ACUITy Editor as both a general example of the type of application that could be developed and the means through which an adaptive Semantic Web Browser could be achieved.
We describe a framework for capturing Data Provenance information to support Information Assurance attributes like Availability, Authentication, Confidentiality, Integrity and Non-Repudiation. Our approach is applicable to Multi-Level Secure systems where it is not always possible to directly provide data source and data transformation information. We achieve this by combining the subjective and objective trust in data as a "Figure of Merit" value that can cross security boundaries. Our architecture captures the Data Provenance information around the 'invariant' part of a message in an XML-based SOA architecture. We also introduce the notion of 'wrappers' so that Data Provenance can be added on while minimizing impact to an existing workflow. We outline a simulation-based framework that allows us to inject faults to model various threats and attacks. We also discuss a dashboard view of a workflow that brings together the intrinsic Information Assurance attributes o...
The ADVISE modeling formalism provides a quantifiable, executable model of adversary behavior for studying the security of complex systems. Each ADVISE model consists of an attack execution graph that defines all attack steps that can be performed by an adversary and how the performance of those attack steps changes the system, as well as an adversary profile that specifies the interests and capabilities of an adversary, which drives the path the attacker chooses to pursue. In order to greatly simplify and improve modeling with the ADVISE formalism, we propose an ADVISE meta model for specifying a system, a set of adversaries, and a set of metrics that can be used to generate an ADVISE model containing a complete set of attack steps for the system. This paper introduces Web Ontology Language libraries, which are used to build ADVISE meta models and automatically generate attack execution graphs.
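The attack execution graph at the heart of an ADVISE model pairs attack steps with preconditions on system state and effects that change that state. A minimal sketch of that idea in Python follows; the step names, state variables, and evaluation loop are hypothetical illustrations of the general concept, not ADVISE syntax or semantics.

```python
# Minimal sketch of attack-step evaluation over system state.
# Step names, preconditions, and effects are hypothetical, not ADVISE syntax.

class AttackStep:
    def __init__(self, name, precondition, effect):
        self.name = name
        self.precondition = precondition  # state -> bool
        self.effect = effect              # state -> new state

def enabled_steps(steps, state):
    """Return the attack steps whose preconditions hold in the current state."""
    return [s for s in steps if s.precondition(state)]

# Hypothetical two-step attack: obtain credentials, then reach the controller.
steps = [
    AttackStep("phish_credentials",
               lambda st: not st["has_credentials"],
               lambda st: {**st, "has_credentials": True}),
    AttackStep("access_controller",
               lambda st: st["has_credentials"],
               lambda st: {**st, "controller_compromised": True}),
]

state = {"has_credentials": False, "controller_compromised": False}
# Initially only the phishing step is enabled; applying it enables the next step.
for step in enabled_steps(steps, state):
    state = step.effect(state)
```

An adversary profile would then select among the enabled steps according to interests and capabilities, rather than executing all of them as this sketch does.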
The condensing steam heaters dynamically modeled in this work are shell-and-tube heat exchangers with the condensing steam on the shell side. Two variations of the model are developed: one for heating a gas stream, where the thermal mass of the gas is very small, and one for heating a liquid stream, where the thermal mass of the heated fluid is appreciable. Predictions from the gas heater model agree closely with data collected from transient experiments on two industrial heat exchangers at the General Electric Coal Gasification Process Evaluation Facility. These two heat exchangers differ widely in their design and operating conditions. Consequently, the good agreement found between the model predictions and experimental data shows the wide applicability of the model.
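When the gas-side thermal mass is negligible, the dominant transient dynamics come from the thermal mass of the tube metal. A lumped-parameter energy balance of that general form can be sketched as below; all coefficients, geometry, and temperatures are invented for illustration and are not taken from the paper's model.

```python
# Lumped-parameter sketch of a condensing steam heater with negligible
# gas-side thermal mass. All parameter values are hypothetical.
# Metal energy balance:
#   M_c * dT_m/dt = UsA*(T_steam - T_m) - UgA*(T_m - T_gas_in)

M_c = 50000.0    # metal mass * specific heat, J/K (hypothetical)
UsA = 2000.0     # steam-side conductance, W/K (hypothetical)
UgA = 800.0      # gas-side conductance, W/K (hypothetical)
T_steam = 180.0  # condensing steam temperature, degC (hypothetical)
T_gas_in = 25.0  # gas inlet temperature, degC (hypothetical)

def simulate(t_end, dt=0.1, T_m=25.0):
    """Explicit-Euler integration of the tube-metal temperature."""
    t = 0.0
    while t < t_end:
        dTm = (UsA * (T_steam - T_m) - UgA * (T_m - T_gas_in)) / M_c
        T_m += dTm * dt
        t += dt
    return T_m

# The metal settles toward the conductance-weighted steady-state temperature.
T_ss = (UsA * T_steam + UgA * T_gas_in) / (UsA + UgA)
```

With these hypothetical values the time constant is M_c / (UsA + UgA), so the response settles within a few minutes; the paper's reported fast response corresponds to this kind of single dominant lag.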
2018 IEEE 12th International Conference on Semantic Computing (ICSC), 2018
Formal ontology and rule-based approaches founded on semantic technologies have been gaining traction in capturing domain knowledge in numerous fields. This typically requires several iterations between a Subject Matter Expert (SME) and a modeling expert to capture domain knowledge, resulting in a time-consuming process. Recent advances in formal Controlled Natural Languages (CNL) improve the process, as the SME can more easily evaluate and verify the captured domain knowledge. Such executable domain rules are usually represented in some encoding/representation language and typically use variables. This use of variables does not correspond to how we naturally think about a domain, and hence the difficulty that SMEs encounter in directly authoring domain rules. Our goal is to enable SMEs to directly author the rules in a formal yet English-like language. For this purpose, we introduce the notion of Concept Rules (Crules), where the domain rules are captured in terms of concepts an...
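The contrast between variable-based and concept-level rules can be illustrated generically: a variable-based encoding makes the author manage bindings like "?x" explicitly, while a concept-level phrasing states the condition once over a class and applies it to every instance. The class name, property, and threshold below are hypothetical, not drawn from the paper.

```python
# Hypothetical illustration of a concept-level (variable-free) rule.
# Instead of "if ?x is a Pump and ?x has pressure > 100 then ?x is overloaded",
# the rule is stated once over the concept and applied to each instance.

pumps = [
    {"id": "pump-1", "pressure": 120},
    {"id": "pump-2", "pressure": 80},
]

def apply_concept_rule(instances, condition, conclusion):
    """For each instance of the concept, assert the conclusion when the
    condition holds; no explicit variable bindings are authored."""
    for inst in instances:
        if condition(inst):
            conclusion(inst)
    return instances

apply_concept_rule(
    pumps,
    condition=lambda p: p["pressure"] > 100,
    conclusion=lambda p: p.update({"overloaded": True}),
)
```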
Smart Grid security is challenging, as experts in both IT Security and ICS (Industrial Control Systems) are few. Expertise in multiple domains is needed, and tools that can be used to analyze smart grid systems during the design phase are non-existent. We used Semantic Web technology to create an ontology that is capable of reasoning about security attributes. We used SADL (Semantic Application Design Language) to describe information about a smart grid system such as network topology, device specifications, and site-specific information. SADL uses English-like statements to build a model that is understandable by domain experts without requiring knowledge of Semantic Web technology. We describe components of the ontology that are capable of describing, measuring, and comparing both physical and network-specific risks. We also developed a GUI that displays the results in a Failure Mode Effects Analysis, where threats are prioritized by Likelihood, Detectability, and Severity.
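The Failure Mode Effects Analysis view described above prioritizes threats by Likelihood, Detectability, and Severity. The conventional FMEA ranking multiplies the three ratings into a risk priority number (RPN); the sketch below assumes 1-10 rating scales, and the threat names and ratings are hypothetical, not results from the paper.

```python
# Illustrative FMEA ranking by risk priority number (RPN).
# RPN = likelihood * severity * detectability is the conventional FMEA
# formula; ratings (assumed 1-10) and threat names here are hypothetical.

def rpn(likelihood, severity, detectability):
    return likelihood * severity * detectability

threats = [
    ("spoofed meter readings",    {"likelihood": 6, "severity": 7, "detectability": 5}),
    ("substation network breach", {"likelihood": 3, "severity": 9, "detectability": 8}),
    ("physical tampering",        {"likelihood": 2, "severity": 6, "detectability": 3}),
]

# Rank threats from highest to lowest RPN for the FMEA display.
ranked = sorted(threats, key=lambda t: rpn(**t[1]), reverse=True)
for name, ratings in ranked:
    print(f"{name}: RPN = {rpn(**ratings)}")
```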
Building secure, complex systems is a daunting task. The ADversary VIew Security Evaluation (ADVISE) formalism was designed to offer a model of an adversary attacking a system. As currently implemented in Möbius, ADVISE provides a rich and flexible system security model that, with the other features of Möbius, offers quantitative security metrics. For large systems, constructing realistic ADVISE models can be tedious and impractical. To remedy this issue, we propose the ADVISE meta modeling formalism. An ADVISE meta model is used, with the Möbius framework, to generate ADVISE models and other Möbius components from a higher level model constructed from components, adversaries, and metrics provided by associated Web Ontology Language libraries. This paper briefly reviews Möbius and ADVISE, then introduces the ADVISE meta modeling formalism.
The proliferation of Web Services is fostering the need for applications to provide more personalisation during the service discovery and composition phases. An application has to cater for different types of users and seamlessly provide suitably understandable and refined replies. In this paper, we describe the motivating details behind PreDiCtS, a framework for personalised service discovery and composition. The underlying concept behind PreDiCtS is that similar service composition problems can be tackled in a similar manner by reusing past composition best practices. These have to be useful and at the same time flexible enough to allow for adaptations to new problems. For this reason we opt to use template-based composition information. PreDiCtS's retrieval and refinement technique is based on conversational case-based reasoning (CCBR) and makes use of a core OWL ontology called CCBROnto for case representations.
ASME 1992 International Computers in Engineering Conference: Volume 1 — Artificial Intelligence; Expert Systems; CAD/CAM/CAE; Computers in Fluid Mechanics/Thermal Systems
Today many excellent software packages are available both within GE and commercially. The challenge of the future is to integrate suites of these diverse packages to provide the best solution for a given application's requirements. This paper describes how the GEN-X expert system shell has been enhanced to execute as a cooperative program using a client-server model in multiprocessing environments that support this concept, e.g., Windows and UNIX. Consistent with the GEN-X philosophy of providing ease of use, the knowledge base developer is shielded from the details of the implementation. The communication facilities allow commands and data to be exchanged with other processes by supporting multiple protocols and transport mechanisms. The initial UNIX implementation used GDE, a protocol similar to Microsoft's DDE under Windows (Microsoft, 1990). This paper describes how the addition of interprocess communication facilities to GEN-X allows powerful tools such as hypermedia manag...
2017 IEEE 25th International Requirements Engineering Conference (RE)
Capturing high-level requirements in a human-readable but formal representation suitable for analysis is an important goal for GE. To that end we have augmented an existing controlled-English modeling language with a new controlled-English requirements capture language to create the Requirements Capture front end of the ASSERT™ tool suite. Requirements captured in ASSERT can be analyzed for a number of possible shortcomings, both individually and collectively. Once a set of requirements has reached a satisfactory level of completeness, consistency, etc., it can then be further used to generate test cases and test procedures. This paper focuses on the requirements capture and analysis functions of ASSERT and illustrates its capabilities with a sample problem previously used as a challenge problem for requirements specification.
2019 IEEE 13th International Conference on Semantic Computing (ICSC)
Our experience at GE Research suggests that the use of a controlled-English grammar and a rich authoring environment can greatly facilitate subject matter experts' ability to understand, create, and collaboratively employ models. A domain ontology is an ideal foundation for many advanced capabilities. An example is extending our controlled-English grammar and authoring environment for OWL model generation to allow the capture of high-level requirements, assumptions, and assertions, enabling requirement engineers to create models of system capability and behavior amenable to formal methods analysis to detect incompleteness, conflict, and a variety of other issues. The same domain models and formal requirements can be used to automatically generate test cases and test procedures. Automated test generation represents a huge reduction in the time and effort required to create and validate critical software. In this paper we illustrate how ontologies enable the ASSERT™ tool suite to support the above capabilities through a small grounding use case.
2020 IEEE International Conference on Big Data (Big Data)
To deliver business value, most data-driven enterprises and applications require data to be extracted and merged from otherwise siloed data storage platforms. FDC Cache has been designed and developed to enable the fusion and caching of data drawn from multiple small and/or Big Data stores. This capability executes a sequence of queries, wherein the results from one query may be used to constrain subsequent queries. The results of each query are linked with results from previous queries, incrementally building a cache of semantically linked data that can be used to support multiple independent data requests. FDC Cache uses Semantic Web technologies, and knowledge graphs in particular, to describe the relevant data and relationships in a computable model. This enables applications to reason over the graph, for example to dynamically retrieve targeted subsets of data composed of previously disparate information. We have successfully applied FDC Cache to two distinct industrial use cases: (i) merging data across multiple sources to assemble information about current parts in a gas turbine, and (ii) dynamically aligning siloed data from electric grid transmission and distribution networks to an industry-standard common model, in which the cache creation time has been shown to scale sub-linearly with the number of data elements. FDC Cache has been open-sourced as part of the GE-developed open-source Semantics Toolkit.
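The chaining behavior described in this abstract, where one query's results constrain the next while each result set is linked into a growing cache, can be sketched generically. The stores, records, and link key below are hypothetical illustrations of the pattern, not the FDC Cache API or its gas-turbine data.

```python
# Generic sketch of chained querying with incremental result linking.
# The two "stores", their records, and the part_id link key are hypothetical.

turbine_parts = [          # first siloed store (hypothetical)
    {"part_id": "P1", "name": "blade"},
    {"part_id": "P2", "name": "nozzle"},
]
maintenance_records = [    # second siloed store (hypothetical)
    {"part_id": "P1", "last_service": "2020-03-01"},
    {"part_id": "P3", "last_service": "2019-07-12"},
]

def query(store, predicate):
    """Stand-in for querying one data store with a filter."""
    return [row for row in store if predicate(row)]

# Query 1: fetch parts; its results constrain query 2.
parts = query(turbine_parts, lambda r: True)
part_ids = {p["part_id"] for p in parts}
records = query(maintenance_records, lambda r: r["part_id"] in part_ids)

# Link both result sets into one cache keyed by the shared identifier,
# incrementally building semantically linked data.
cache = {p["part_id"]: dict(p) for p in parts}
for rec in records:
    cache[rec["part_id"]].update(rec)
```

In the real system the "stores" would be heterogeneous small or Big Data platforms, the queries would be expressed against a knowledge-graph model, and the linked cache would serve multiple independent data requests.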
A transient mathematical model of a packed column used to saturate gas has been developed. Predic... more A transient mathematical model of a packed column used to saturate gas has been developed. Predictions from this model are compared with steadystate and transient experimental data from a pilot-plant-sized saturator. Agreement is excellent. Saturator transient response to changes in exit-gas temperature are found to be extremely fast. 9 figures.
Journal of Systemics, Cybernetics and Informatics, 2004
Intelligent, adaptive interfaces are a pre-requisite to elevating computer-based applications to ... more Intelligent, adaptive interfaces are a pre-requisite to elevating computer-based applications to the realm of collaborative decision support in complex, relatively open-ended domains such as logistics and planning. This is because the composition and effective presentation of even the most useful information must be tailored to constantly changing circumstances. Our objective is to not only achieve an adaptive human-machine interface, but to imbue the software with a significant portion of the responsibility for effectively controlling the adaptation, freeing the user from unnecessary distraction and making the human-machine relationship more collaborative in nature. The foundational concepts of interface adaptation are discussed and a specific logistics application is described as an example.
Adaptive Work-Centered User Interface Technology (ACUITy) is an ontology-based approach to modeli... more Adaptive Work-Centered User Interface Technology (ACUITy) is an ontology-based approach to modeling and implementing intelligent user interfaces built on top of Jena. Its potential benefits and underlying technologies are explored in the context of decision support systems. 1
To fully achieve the benefits of smart grid, a range of new software applications, components, an... more To fully achieve the benefits of smart grid, a range of new software applications, components, and improvements to business processes will rely on information emanating from existing and new systems and data sources. These new smart software components will need to interpret business semantics in a common way in order to ensure that data can be exchanged and shared, and that business intelligent activities can be carried out in an efficient and cost effective manner. An open, and shared information model provides such common semantics. An architecture driven by this information model will allow for reduced integration costs, increased development efficiency, and increased overall system flexibility. It also allows for new application functionality not possible given traditional architectural
Adaptive Work-Centered User Interface Technology (ACUITy) is an ontology-based approach to modeli... more Adaptive Work-Centered User Interface Technology (ACUITy) is an ontology-based approach to modeling and implementing intelligent user interfaces built on top of Jena. Its potential benefits and underlying technologies are explored in the context of decision support systems. 1
an open-source framework and architecture for developing semanticallyenabled mixed initiative use... more an open-source framework and architecture for developing semanticallyenabled mixed initiative user interfaces. ACUITy begins to answer some of the challenges presented in the design of a Semantic Web Browser, which we believe should collaborate with the user to determine how to support his or her work by using services available to find, retrieve, visualize and explore resources on the Semantic Web. The browser would also learn from its experience and use that learned knowledge to engage more effectively in future collaborations. To illustrate how such a browser might be developed, we present the ACUITy Editor as both a general example of the type of application that could be developed as well as the means through which an adaptive Semantic Web Browser could be achieved. 1
Abstract. Work domains such as maintenance and logistics planning are char-acterized by open-ende... more Abstract. Work domains such as maintenance and logistics planning are char-acterized by open-ended problem solving and large quantities of heterogeneous and distributed information. Problem-solvers in these domains can benefit from semantic web applications for work-centric decision support. In this pa-per we describe the need for such technology and the missing links in the cur-rent state of the art. We also present Adaptive Work-Centered User Interface Technology (ACUITy), still in its formative stage, as a way to meet this need. 1
Adaptive Work-Centered User Interface Technology (ACUITy) is an open-source framework and archite... more Adaptive Work-Centered User Interface Technology (ACUITy) is an open-source framework and architecture for developing semanticallyenabled mixed initiative user interfaces. ACUITy begins to answer some of the challenges presented in the design of a Semantic Web Browser, which we believe should collaborate with the user to determine how to support his or her work by using services available to find, retrieve, visualize and explore resources on the Semantic Web. The browser would also learn from its experience and use that learned knowledge to engage more effectively in future collaborations. To illustrate how such a browser might be developed, we present the ACUITy Editor as both a general example of the type of application that could be developed as well as the means through which an adaptive Semantic Web Browser could be achieved.
We describe aframeworkfor capturing Data Provenance information to support Information Assurance ... more We describe aframeworkfor capturing Data Provenance information to support Information Assurance attributes like Availability, Authentication, Confidentiality, Integrity and Non-Repudiation. Our approach is applicable to Multi-Level Secure systems where it is not always possible to directly provide data source and data transformation information. We achieve this by combining the subjective and objective trust in data as a "FigureofMerit" value that can cross security boundaries. Our architecture captures the Data Provenance information around the 'invariant'part ofa message in an XML-based SOA architecture. We also introduce the notion of 'wrappers' so that Data Provenance can be added on while minimizing impact to an existing workflow. We outline a simulation-basedframeworkthat allows us to inject faults to model various threats and attacks. We also discuss a dashboard view of a workflow that brings together the intrinsic Information Assurance attributes o...
The ADVISE modeling formalism provides a quantifiable, executable model of adversary behavior for... more The ADVISE modeling formalism provides a quantifiable, executable model of adversary behavior for studying the security of complex systems. Each ADVISE model consists of an attack execution graph that defines all attack steps that can be performed by an adversary and how the performance of those attack steps changes the system, as well as an adversary profile that specifies the interests and capabilities of an adversary, which drives the path the attacker chooses to pursue. In order to greatly simplify and improve modeling with the ADVISE formalism, we propose an ADVISE meta model for specifying a system, a set of adversaries, and a set of metrics that can be used to generate an ADVISE model containing a complete set of attack steps for the system. This paper introduces Web Ontology Language libraries, which are used to build ADVISE meta models and automatically generate attack execution graphs.
The condensing steam heaters dynamically modeled in this work are shell and tube type heat exchan... more The condensing steam heaters dynamically modeled in this work are shell and tube type heat exchangers with the condensing steam on the shell side. Two variations of the model are developed, one for heating a gas stream where the thermal mass of the gas is very small, and one for heating a liquid stream where the thermal mass of the heated fluid is appreciable. Predictions from the gas heater model compare excellently with data collected from transient experiments on two industrial heat exchangers at the General Electric Coal Gasification Process Evaluation Facility. These two heat exchangers differ widely in their design and operating conditions. Consequently, the good agreement that was found between the model predictions and experimental data show the wide applicability of the model.
2018 IEEE 12th International Conference on Semantic Computing (ICSC), 2018
Formal ontology and rule-based approaches founded on semantic technologies have been gaining trac... more Formal ontology and rule-based approaches founded on semantic technologies have been gaining traction in capturing domain knowledge in numerous fields. This typically requires several iterations between a Subject Matter Expert (SME) and a modeling expert to capture domain knowledge; resulting in a time-consuming process. Recent advances in formal Controlled Natural Languages (CNL) improve the process as the SME can more easily evaluate and verify the captured domain knowledge. Such executable domain rules are usually represented in some encoding / representation language and typically use variables. This use of variables does not correspond to how we naturally think about a domain and hence the difficulty that SMEs encounter in directly authoring domain rules. Our goal is to enable the SMEs to directly author the rules in a formal yet English-like language. For this purpose, we introduce the notion of Concept Rules (Crules) where the domain rules are captured in terms of concepts an...
Smart Grid security is challenging as experts in both IT Security and ICS [Industrial Control Sys... more Smart Grid security is challenging as experts in both IT Security and ICS [Industrial Control Systems] systems are few. Expertise in multiple domains is needed, and tools that can be used to analyze smart grid systems during the design phase are non-existent. We used Semantic Web Technology to create an ontology that is capable of reasoning about security attributes. We used SADL (Semantic Application Design Language) to describe information about a smart grid system such as network topology, device specifications, and site-specific information. SADL uses English-like statements to build a model that is understandable by domain experts without requiring knowledge of Semantic Web technology. We describe components of the ontology that are capable of describing, measuring, and comparing both physical and network-specific risks. We also developed a GUI that displayed the results in a Failure Mode Effects Analysis, where threats are prioritized by Likelihood, Detectability, and Severity.
Building secure, complex systems is a daunting task. The ADversary VIew Security Evaluation (ADVI... more Building secure, complex systems is a daunting task. The ADversary VIew Security Evaluation (ADVISE) formalism was designed to offer a model of an adversary attacking a system. As currently implemented in Möbius, ADVISE provides a rich and flexible system security model that, with the other features of Möbius, offers quantitative security metrics. For large systems, constructing realistic ADVISE models can be tedious and impractical. To remedy this issue, we propose the ADVISE meta modeling formalism. An ADVISE meta model is used, with the Möbius framework, to generate ADVISE models and other Möbius components from a higher level model constructed from components, adversaries, and metrics provided by associated Web Ontology Language libraries. This paper briefly reviews Möbius and ADVISE, then introduces the ADVISE meta modeling formalism.
The proliferation of Web Services is fostering the need for applications to provide more personal... more The proliferation of Web Services is fostering the need for applications to provide more personalisation during the service discovery and composition phases. An application has to cater for different types of users and seamlessly provide suitably understandable and refined replies. In this paper, we describe the motivating details behind PreDiCtS1, a framework for personalised service discovery and composition. The underlying concept behind PreDiCtS is that, similar service composition problems could be tackled in a similar manner by reusing past composition best practices. These have to be useful and at the same time flexible enough to allow for adaptations to new problems. For this reason we are opting to use template-based composition information. PreDiCtS’s retrieval and refinement technique is based on conversational case-based reasoning (CCBR) and makes use of a core OWL ontology called CCBROnto for case representations.
ASME 1992 International Computers in Engineering Conference: Volume 1 — Artificial Intelligence; Expert Systems; CAD/CAM/CAE; Computers in Fluid Mechanics/Thermal Systems
Today many excellent software packages are available both within GE and commercially. The challen... more Today many excellent software packages are available both within GE and commercially. The challenge of the future is to integrate suites of these diverse packages to provide the best solution for a given application’s requirements. This paper will describe how the GEN-X expert system shell has been enhanced to execute as a cooperative program using a client-server model in multiprocessing environments that support this concept, i.e. Windows, UNIX, etc. Consistent with the GEN-X philosophy of providing ease of use, the knowledge base developer is shielded from the details of the implementation. The communication facilities allow commands and data to be exchanged with other processes by supporting multiple protocols and transport mechanisms. The initial Unix implementation used GDE, a protocol similar to Microsoft’s DDE under Windows (Microsoft, 1990). This paper describes how the addition of interprocess communication facilities to GEN-X allows powerful tools such as hypermedia manag...
2017 IEEE 25th International Requirements Engineering Conference (RE)
Capturing high-level requirements in a human readable but formal representation suitable for analysis is an important goal for GE. To that end we have augmented an existing controlled-English modeling language with a new controlled-English requirements capture language to create the Requirements Capture frontend of the ASSERT(TM) tool suite. Requirements captured in ASSERT can be analyzed for a number of possible shortcomings, both individually and collectively. Once a set of requirements has reached a satisfactory level of completeness, consistency, etc., it can then be further used to generate test cases and test procedures. This paper will focus on the requirements capture and analysis functions of ASSERT and will illustrate its capabilities with a sample problem previously used as a challenge problem for requirements specification.
2019 IEEE 13th International Conference on Semantic Computing (ICSC)
Our experience at GE Research suggests that the use of a controlled-English grammar and a rich authoring environment can greatly facilitate subject matter experts' ability to understand, create, and collaboratively employ models. A domain ontology is an ideal foundation for many advanced capabilities. An example is extending our controlled-English grammar and authoring environment for OWL model generation to allow the capture of high-level requirements, assumptions, and assertions, enabling requirement engineers to create models of system capability and behavior amenable to formal methods analysis to detect incompleteness, conflict, and a variety of other issues. The same domain models and formal requirements can be used to automatically generate test cases and test procedures. Automated test generation represents a huge reduction in the time and effort required to create and validate critical software. In this paper we illustrate how ontologies enable the ASSERT™ tool suite to support the above capabilities through a small grounding use case.
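The pipeline sketched in these two abstracts, controlled-English requirements parsed into a formal structure and then used to generate tests, can be illustrated with a deliberately tiny grammar. The grammar, field names, and generated test shape below are invented for this example and are far simpler than the actual SADL/ASSERT languages.

```python
# Illustrative sketch: toy controlled-English requirement -> test case.
import re

PATTERN = re.compile(
    r"when (?P<input>\w+) is (?P<value>\w+), (?P<output>\w+) shall be (?P<expected>\w+)\."
)

def parse_requirement(text):
    """Parse one sentence of the toy controlled grammar into named fields."""
    m = PATTERN.fullmatch(text.strip().lower())
    if m is None:
        raise ValueError("not in the controlled grammar: " + text)
    return m.groupdict()

def generate_test(req):
    """Turn one parsed requirement into a (stimulus, expected-response) pair."""
    return ({req["input"]: req["value"]}, {req["output"]: req["expected"]})

req = parse_requirement("When mode is standby, heater shall be off.")
print(generate_test(req))  # -> ({'mode': 'standby'}, {'heater': 'off'})
```

The value of the formal representation is that the same parsed structures can also be checked collectively, e.g. two requirements with the same stimulus but conflicting expected responses can be flagged as inconsistent before any tests are run.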
2020 IEEE International Conference on Big Data (Big Data)
To deliver business value, most data-driven enterprises and applications require data to be extracted and merged from otherwise siloed data storage platforms. FDC Cache has been designed and developed to enable the fusion and caching of data drawn from multiple small and/or Big Data stores. This capability executes a sequence of queries, wherein the results from one query may be used to constrain subsequent queries. The results of each query are linked with results from previous queries, incrementally building a cache of semantically linked data that can be used to support multiple independent data requests. FDC Cache uses Semantic Web technologies, and knowledge graphs in particular, to describe the relevant data and relationships in a computable model. This enables applications to reason over the graph, for example to dynamically retrieve targeted subsets of data composed of previously disparate information. We have successfully applied FDC Cache to two distinct industrial use cases: (i) merging data across multiple sources to assemble information about current parts in a gas turbine, and (ii) dynamically aligning siloed data from electric grid transmission and distribution networks to an industry-standard common model, in which the cache creation time has been shown to scale sub-linearly with the number of data elements. FDC Cache has been open-sourced as part of the GE-developed open source Semantics Toolkit.
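The chained-query idea above, where each step's results constrain the next query and the joined rows accumulate in a cache, can be sketched with two in-memory "silos". The stores, field names, and the gas-turbine flavor of the example are made up for illustration; the real system operates over knowledge graphs and external data stores.

```python
# Minimal sketch of chained queries building a linked cache.
parts_store = [
    {"part": "P1", "turbine": "T7"},
    {"part": "P2", "turbine": "T7"},
    {"part": "P3", "turbine": "T9"},
]
supplier_store = [
    {"part": "P1", "supplier": "Acme"},
    {"part": "P2", "supplier": "Borg"},
]

def query(store, **constraints):
    """Stand-in for a per-silo query against one data store."""
    return [row for row in store
            if all(row.get(k) == v for k, v in constraints.items())]

def build_cache(turbine):
    """Run the query sequence, linking each step's rows to the previous."""
    cache = []
    for part_row in query(parts_store, turbine=turbine):
        # Results of the first query constrain the second.
        for sup_row in query(supplier_store, part=part_row["part"]):
            cache.append({**part_row, **sup_row})
    return cache

print(build_cache("T7"))
# -> [{'part': 'P1', 'turbine': 'T7', 'supplier': 'Acme'},
#     {'part': 'P2', 'turbine': 'T7', 'supplier': 'Borg'}]
```

Once built, the linked cache answers multiple independent requests (e.g. "suppliers for turbine T7") without re-querying the underlying silos.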
Papers by Andrew Crapo