IJCAI Workshop on Information Integration on the Web, 2003
In this paper we present the monotonicity principle, a sufficient condition to ensure that the exact mapping, i.e., the mapping a human observer would perform, is ranked close to the best mapping, as generated automatically by a matching algorithm. The research is motivated by the introduction of the Semantic Web vision and the shift toward machine-understandable Web resources. We support the importance of the monotonicity principle with an empirical analysis of a matching algorithm, showing that algorithms that obey ...
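The core idea of the abstract can be illustrated with a toy sketch (all names and scores below are hypothetical, not taken from the paper): a matcher assigns a similarity score to each candidate mapping, and the question is how close the exact (human-chosen) mapping ranks to the algorithm's top-scoring one.

```python
# Toy illustration (hypothetical data): rank candidate mappings by matcher
# score and check where the human-judged "exact" mapping lands.

def rank_of_exact(candidates, exact_name):
    """1-based rank of the exact mapping when candidates are sorted
    by descending matcher score."""
    ordered = sorted(candidates, key=lambda m: m["score"], reverse=True)
    return [m["name"] for m in ordered].index(exact_name) + 1

# Hypothetical candidate mappings between two schemas.
candidates = [
    {"name": "m1", "score": 0.91},  # algorithm's best mapping
    {"name": "m2", "score": 0.88},  # the exact (human) mapping
    {"name": "m3", "score": 0.40},
]

print(rank_of_exact(candidates, "m2"))  # prints 2
```

A matcher satisfying the monotonicity condition would keep this rank small, so the exact mapping sits near the top of the ranked list.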
International Joint Conference on Artificial Intelligence, 2003
In this paper we present the monotonicity principle, a sufficient condition to ensure that the exact mapping, i.e., the mapping a human observer would perform, is ranked close to the best mapping, as generated automatically by a matching algorithm. The research is motivated by the introduction of the Semantic Web vision and the shift toward machine-understandable Web resources ...
The introduction of the Semantic Web vision and the shift toward machine-understandable Web resources have underscored the importance of automatic semantic reconciliation. Consequently, new tools for automating the process have been proposed. In this work we present a formal model of semantic reconciliation and systematically analyze the properties of the process outcome, primarily the inherent uncertainty of the matching process and how it is reflected in the resulting mappings. An important feature of this research is the identification and analysis of factors that affect the effectiveness of algorithms for automatic semantic reconciliation, leading, it is hoped, to the design of better algorithms that reduce the uncertainty of existing ones. Against this background, we empirically study the aptitude of two algorithms to correctly match concepts. This research is both timely and practical in light of recent attempts to develop and utilize methods for automatic semantic reconciliation.
The paper provides a conceptual framework for designing and executing business processes using semantic Web services. We envision a world in which a designer defines a "virtual" Web service as part of a business process, while requiring the system to seek actual Web services that match the designer's specifications and can be invoked whenever the virtual Web service is activated. Taking a conceptual modeling approach, we identify the relationships between ontology concepts and syntactic Web services. We then propose a generic algorithm for ranking the top-K Web services in decreasing order of their benefit vis-à-vis the semantic Web service. We conclude with an extension of the framework to handle the uncertainty that results from concept mismatch, and with the desired properties of a schema matching algorithm to support Web service identification.
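The top-K ranking step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the benefit function and all service names and concept sets are invented placeholders, standing in for whatever benefit measure the framework defines.

```python
import heapq

def top_k_services(spec, services, k, benefit):
    """Return the k services with the highest benefit with respect to
    the designer's virtual-service specification (illustrative only)."""
    return heapq.nlargest(k, services, key=lambda s: benefit(spec, s))

# Hypothetical benefit: overlap between required and offered concepts.
def concept_overlap(spec, service):
    return len(spec["concepts"] & service["concepts"])

spec = {"concepts": {"price", "availability", "shipping"}}
services = [
    {"name": "svcA", "concepts": {"price", "availability"}},
    {"name": "svcB", "concepts": {"price"}},
    {"name": "svcC", "concepts": {"price", "availability", "shipping"}},
]

best = top_k_services(spec, services, k=2, benefit=concept_overlap)
print([s["name"] for s in best])  # prints ['svcC', 'svcA']
```

Separating the benefit function from the ranking step mirrors the framework's intent: the ranking machinery stays generic while the benefit measure can incorporate ontology-based concept matching.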
Papers by Avigdor Gal