Search Results (124)

Search Parameters:
Keywords = XML

15 pages, 1106 KiB  
Article
GPU@SAT DevKit: Empowering Edge Computing Development Onboard Satellites in the Space-IoT Era
by Gionata Benelli, Giovanni Todaro, Matteo Monopoli, Gianluca Giuffrida, Massimiliano Donati and Luca Fanucci
Electronics 2024, 13(19), 3928; https://doi.org/10.3390/electronics13193928 - 4 Oct 2024
Abstract
Advancements in technology have driven the miniaturization of embedded systems, making them more cost-effective and energy-efficient for wireless applications. As a result, the number of connectable devices in Internet of Things (IoT) networks has increased significantly, creating the challenge of linking them effectively and economically. The space industry has long recognized this challenge and invested in satellite infrastructure for IoT networks, exploiting the potential of edge computing technologies. In this context, it is of critical importance to enhance the onboard computing capabilities of satellites and develop enabling technologies for their advancement. This is necessary to ensure that satellites are able to connect devices while reducing latency, bandwidth utilization, and development costs, and improving privacy and security measures. This paper presents the GPU@SAT DevKit: an ecosystem for testing a high-performance, general-purpose accelerator designed for FPGAs and suitable for edge computing tasks on satellites. This ecosystem provides a streamlined way to exploit GPGPU processing in space, enabling faster development times and more efficient resource use. Designed for FPGAs and tailored to edge computing tasks, the GPU@SAT accelerator mimics the parallel architecture of a GPU, allowing developers to leverage its capabilities while maintaining flexibility. Its compatibility with OpenCL simplifies the development process, enabling faster deployment of satellite-based applications. The DevKit was implemented and tested on a Zynq UltraScale+ MPSoC evaluation board from Xilinx, integrating the GPU@SAT IP core with the system’s embedded processor. A client/server approach is used to run applications, allowing users to easily configure and execute kernels through a simple XML document. This intuitive interface provides end-users with the ability to run and evaluate kernel performance and functionality without dealing with the underlying complexities of the accelerator itself. By making the GPU@SAT IP core more accessible, the DevKit significantly reduces development time and lowers the barrier to entry for satellite-based edge computing solutions. The DevKit was also compared with other onboard processing solutions, demonstrating similar performance. Full article
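
A minimal sketch of the XML-driven kernel configuration idea described above: a hypothetical kernel descriptor is parsed into the kind of request a client could submit to the on-board execution server. The element and attribute names are assumptions for illustration; the DevKit's actual schema is not given in the abstract.

    # Illustrative sketch only: tag and attribute names are assumed, not the
    # GPU@SAT DevKit's published kernel-descriptor schema.
    import xml.etree.ElementTree as ET

    KERNEL_XML = """
    <kernel name="vector_add" source="vector_add.cl">
      <workgroup global="1024" local="64"/>
      <buffer id="a" direction="in"  size="4096"/>
      <buffer id="b" direction="in"  size="4096"/>
      <buffer id="c" direction="out" size="4096"/>
    </kernel>
    """

    def parse_kernel_descriptor(xml_text: str) -> dict:
        """Turn a kernel-descriptor document into a plain request dictionary."""
        root = ET.fromstring(xml_text)
        wg = root.find("workgroup")
        return {
            "kernel": root.get("name"),
            "source": root.get("source"),
            "global_size": int(wg.get("global")),
            "local_size": int(wg.get("local")),
            "buffers": [
                {"id": b.get("id"), "direction": b.get("direction"), "size": int(b.get("size"))}
                for b in root.findall("buffer")
            ],
        }

    if __name__ == "__main__":
        print(parse_kernel_descriptor(KERNEL_XML))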

23 pages, 10171 KiB  
Article
Multidisciplinary Reliability Design Optimization Modeling Based on SysML
by Qiang Zhang, Jihong Liu and Xu Chen
Appl. Sci. 2024, 14(17), 7558; https://doi.org/10.3390/app14177558 - 27 Aug 2024
Viewed by 424
Abstract
Model-Based Systems Engineering (MBSE) supports the system-level design of complex products effectively. Currently, system design and optimization for complex products are two distinct processes that must be executed using different software or platforms, involving intricate data conversion processes. Applying multidisciplinary optimization to validate system optimization often necessitates remodeling the optimization objects, which is time-consuming, labor-intensive, and highly error-prone. A critical activity in systems engineering is identifying the optimal design solution for the entire system. Multidisciplinary Design Optimization (MDO) and reliability analysis are essential methods for achieving this. This paper proposes a SysML-based multidisciplinary reliability design optimization modeling method. First, by analyzing the definitions and mathematical models of multidisciplinary reliability design optimization, the SysML extension mechanism is employed to represent the optimization model based on SysML. Next, model transformation techniques are used to convert the SysML optimization model generated in the first stage into an XML description model readable by optimization solvers. Finally, the proposed method’s effectiveness is validated through an engineering case study of an in-vehicle environmental control integration system. The results demonstrate that this method fully utilizes SysML to express MDO problems, enhancing the efficiency of design optimization for complex systems. Engineers and system designers working on complex, multidisciplinary projects can particularly benefit from these advancements, as they simplify the integration of design and optimization processes, facilitating more reliable and efficient product development. Full article
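
The solver-readable XML description mentioned above is not specified in the abstract, so the sketch below only illustrates the general shape of such a model transformation: an optimization problem (design variables, objective, constraints) serialized to XML. Element names are assumptions, loosely themed on the in-vehicle environmental control case study.

    # Hedged sketch: emits a hypothetical solver-readable XML description of an
    # optimization problem. Element names are illustrative, not the paper's schema.
    import xml.etree.ElementTree as ET

    def build_mdo_xml(variables, objective, constraints):
        root = ET.Element("OptimizationProblem")
        vars_el = ET.SubElement(root, "DesignVariables")
        for name, lower, upper in variables:
            ET.SubElement(vars_el, "Variable", name=name, lower=str(lower), upper=str(upper))
        ET.SubElement(root, "Objective", expression=objective, sense="minimize")
        cons_el = ET.SubElement(root, "Constraints")
        for expr in constraints:
            ET.SubElement(cons_el, "Constraint", expression=expr)
        return ET.tostring(root, encoding="unicode")

    if __name__ == "__main__":
        xml_text = build_mdo_xml(
            variables=[("duct_diameter", 0.05, 0.30), ("fan_speed", 500, 3000)],
            objective="power_consumption(duct_diameter, fan_speed)",
            constraints=["cabin_temperature(duct_diameter, fan_speed) <= 26"],
        )
        print(xml_text)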

21 pages, 4493 KiB  
Article
Formal Language for Objects’ Transactions
by Mo Adda
Standards 2024, 4(3), 133-153; https://doi.org/10.3390/standards4030008 - 15 Aug 2024
Viewed by 440
Abstract
The gap between software design and implementation often results in a lack of clarity and precision. Formal languages, based on mathematical rules, logic, and symbols, are invaluable for specifying and verifying system designs. Various semi-formal and formal languages, such as JSON, XML, predicate logic, and regular expressions, along with formal models like Turing machines, serve specific domains. This paper introduces a new specification formal language, ObTFL (Object Transaction Formal Language), developed for general-purpose distributed systems, such as specifying the interactions between servers and IoT devices and their security protocols. The paper details the syntax and semantics of ObTFL and presents three real case studies—federated learning, blockchain for crypto and bitcoin networks, and the industrial PCB board with machine synchronization—to demonstrate its versatility and effectiveness in formally specifying the interactions and behaviors of distributed systems. Full article

75 pages, 1896 KiB  
Article
Complete Subhedge Projection for Stepwise Hedge Automata
by Antonio Al Serhali and Joachim Niehren
Algorithms 2024, 17(8), 339; https://doi.org/10.3390/a17080339 - 2 Aug 2024
Viewed by 454
Abstract
We demonstrate how to evaluate stepwise hedge automata (Shas) with subhedge projection while completely projecting irrelevant subhedges. Since this requires passing finite state information top-down, we introduce the notion of downward stepwise hedge automata. We use them to define in-memory and streaming evaluators with complete subhedge projection for Shas. We then tune the evaluators so that they can decide on membership at the earliest time point. We apply our algorithms to the problem of answering regular XPath queries on Xml streams. Our experiments show that complete subhedge projection of Shas can indeed speed up earliest query answering on Xml streams so that it becomes competitive with the best existing streaming tools for XPath queries. Full article
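
The stepwise hedge automata and subhedge projection cannot be reconstructed from this abstract; the sketch below only illustrates the underlying setting of streaming query answering, using Python's xml.etree.ElementTree.iterparse to answer a simple descendant query over an XML stream while discarding already-processed content.

    # Not the paper's SHA algorithm: just a streaming illustration of answering
    # a simple descendant query (//book/title) over an XML stream with bounded memory.
    import io
    import xml.etree.ElementTree as ET

    XML_STREAM = io.BytesIO(
        b"<library>"
        b"<book><title>Automata on Hedges</title><year>2024</year></book>"
        b"<magazine><title>Ignored</title></magazine>"
        b"<book><title>Streaming XPath</title><year>2023</year></book>"
        b"</library>"
    )

    def stream_book_titles(stream):
        """Yield //book/title text as soon as each title element closes."""
        path = []
        for event, elem in ET.iterparse(stream, events=("start", "end")):
            if event == "start":
                path.append(elem.tag)
            else:  # "end"
                if path[-2:] == ["book", "title"] and elem.text:
                    yield elem.text
                path.pop()
                elem.clear()  # discard processed content to keep memory bounded

    if __name__ == "__main__":
        for title in stream_book_titles(XML_STREAM):
            print(title)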

19 pages, 5250 KiB  
Article
Research on Parameter Correction of Distribution Network Model Based on CIM Model and Measurement Data
by Ke Zhou, Lifang Wu, Biyun Zhang, Ruotian Yao, Hao Bai, Weichen Yang and Min Xu
Energies 2024, 17(15), 3611; https://doi.org/10.3390/en17153611 - 23 Jul 2024
Viewed by 477
Abstract
The construction of an energy distribution network can improve the system’s ability to absorb new energy sources, and its stable and efficient operation has become increasingly important. Security and stability analysis of a distribution network requires accurate network model parameters as support. At present, the deployment of synchronous phasor measurement units (PMUs) in China’s distribution networks is limited, which makes parameter correction of the distribution network challenging. In this paper, an automatic parameter correction algorithm based on the CIM model and measurement data is proposed for distribution networks without PMUs. Firstly, a distribution network topology construction technique based on XML files and key distribution network fields is proposed, in which zero- or small-impedance devices (such as switches) are merged and reduced. A breadth-first traversal algorithm is used to verify the connectivity of the constructed topology. Then, based on the constructed topology and the least squares method, an iterative parameter correction technique is developed. Finally, the accuracy and effectiveness of the proposed algorithm are verified on a standard IEEE 33-bus distribution network and an example from the China Southern Power Grid. The topology connections constructed from the CIM model significantly enhance the efficiency of parameter correction. Full article
(This article belongs to the Section F: Electrical Engineering)
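
The topology-construction and connectivity-check steps can be pictured with the toy sketch below: branches are read from a hypothetical, highly simplified XML fragment (not the CIM RDF/XML schema) into an adjacency list, and breadth-first traversal checks that every bus is reachable.

    # Illustrative only: a minimal, hypothetical topology XML plus the
    # breadth-first connectivity check the abstract describes.
    import xml.etree.ElementTree as ET
    from collections import defaultdict, deque

    TOPOLOGY_XML = """
    <network>
      <branch from="Bus1" to="Bus2" r="0.05" x="0.12"/>
      <branch from="Bus2" to="Bus3" r="0.04" x="0.10"/>
      <branch from="Bus3" to="Bus4" r="0.06" x="0.15"/>
    </network>
    """

    def build_adjacency(xml_text):
        adjacency = defaultdict(set)
        for branch in ET.fromstring(xml_text).findall("branch"):
            a, b = branch.get("from"), branch.get("to")
            adjacency[a].add(b)
            adjacency[b].add(a)
        return adjacency

    def is_connected(adjacency):
        """Breadth-first traversal: connected iff every bus is reachable."""
        if not adjacency:
            return True
        start = next(iter(adjacency))
        seen, queue = {start}, deque([start])
        while queue:
            node = queue.popleft()
            for neighbour in adjacency[node] - seen:
                seen.add(neighbour)
                queue.append(neighbour)
        return len(seen) == len(adjacency)

    if __name__ == "__main__":
        print(is_connected(build_adjacency(TOPOLOGY_XML)))  # True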

18 pages, 721 KiB  
Article
A Packet Content-Oriented Remote Code Execution Attack Payload Detection Model
by Enbo Sun, Jiaxuan Han, Yiquan Li and Cheng Huang
Future Internet 2024, 16(7), 235; https://doi.org/10.3390/fi16070235 - 2 Jul 2024
Viewed by 646
Abstract
In recent years, various Remote Code Execution vulnerabilities on the Internet have been exposed frequently; thus, more and more security researchers have begun to pay attention to the detection of Remote Code Execution attacks. In this paper, we focus on three kinds of common Remote Code Execution attacks: XML External Entity, Expression Language Injection, and Insecure Deserialization. We propose a packet content-oriented Remote Code Execution attack payload detection model. For the XML External Entity attack, we propose an algorithm to construct the use-definition chain of XML entities, and implement detection based on the integrity of the chain and the behavior of the chain’s tail node. For the Expression Language Injection and Insecure Deserialization attack, we extract 34 features to represent the string operation and the use of sensitive classes/methods in the code, and then train a machine learning model to implement detection. At the same time, we build a dataset to evaluate the effect of the proposed model. The evaluation results show that the model performs well in detecting XML External Entity attacks, achieving a precision of 0.85 and a recall of 0.94. Similarly, the model performs well in detecting Expression Language Injection and Insecure Deserialization attacks, achieving a precision of 0.99 and a recall of 0.88. Full article
(This article belongs to the Section Cybersecurity)
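
The use-definition-chain idea for XML External Entity detection can be pictured roughly as follows; this is not the paper's algorithm or feature set, just a simplified sketch that links entity definitions in an inline DTD to their uses and flags chains ending in an external SYSTEM resource.

    # Simplified illustration (not the paper's model): follow entity
    # definition-to-use links and flag chains reaching an external resource.
    import re

    SUSPECT_PAYLOAD = """<?xml version="1.0"?>
    <!DOCTYPE data [
      <!ENTITY wrapper "&leak;">
      <!ENTITY leak SYSTEM "file:///etc/passwd">
    ]>
    <data>&wrapper;</data>
    """

    ENTITY_DEF = re.compile(r'<!ENTITY\s+(\w+)\s+(SYSTEM\s+)?"([^"]*)"', re.IGNORECASE)

    def find_external_entity_chains(xml_text):
        definitions = {}          # entity name -> (is_external, replacement text)
        for name, system, value in ENTITY_DEF.findall(xml_text):
            definitions[name] = (bool(system), value)

        body = xml_text.split("]>", 1)[-1]
        chains = []
        for used in re.findall(r"&(\w+);", body):
            chain, current = [used], used
            while current in definitions:
                is_external, value = definitions[current]
                if is_external:
                    chains.append((chain, value))   # chain tail reaches an external resource
                    break
                nested = re.findall(r"&(\w+);", value)
                if not nested:
                    break
                current = nested[0]
                chain.append(current)
        return chains

    if __name__ == "__main__":
        for chain, target in find_external_entity_chains(SUSPECT_PAYLOAD):
            print(" -> ".join(chain), "resolves to", target)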

18 pages, 5509 KiB  
Article
Establishing a Generic Geographic Information Collection Platform for Heterogeneous Data
by Songcao Liu, Bozhao Li, Yuqiao Chen and Zhongliang Cai
Appl. Sci. 2024, 14(13), 5416; https://doi.org/10.3390/app14135416 - 21 Jun 2024
Cited by 1 | Viewed by 598
Abstract
Geographic information collection platforms are widely used for acquiring geographic information. However, existing geographic information collection platforms have limited adaptability and configurability, negatively affecting their usability. They do not support complete field collection workflows or capture data with complex nested structures. To address these limitations, this paper proposes a generic geographic information collection platform based on a comprehensive XML schema definition and a corresponding XML toolkit. This platform includes professional and non-professional versions of collection software, as well as a management system. Users can configure controls and define nested tables within this platform to collect heterogeneous and complex nested data. Moreover, the platform supports functions such as task assignment, local deployment servers, multitasking parallelism, and summary statistics of heterogeneous data, ensuring complete workflow support for field data collection. The platform has been applied in agriculture, forestry, and related fields. This paper uses the agricultural industry structure survey as a case study. Practical applications and our case study show that this platform can reduce software development costs, lower user knowledge prerequisites, and fulfill 95% of geographic information collection scenarios. Full article
(This article belongs to the Special Issue Software Engineering: Computer Science and System—Second Edition)
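
The platform's XML schema definition is not reproduced in the abstract; the sketch below uses a hypothetical form-definition document to show how configurable controls and nested tables might be declared and walked.

    # Hypothetical form-definition XML (not the platform's published schema):
    # a hedged sketch of declaring and reading nested tables and controls.
    import xml.etree.ElementTree as ET

    FORM_XML = """
    <form name="orchard_survey">
      <control type="text"   field="farmer_name"/>
      <control type="select" field="crop" options="apple,pear,walnut"/>
      <table name="plots">
        <control type="number" field="area_mu"/>
        <table name="trees">
          <control type="number" field="age_years"/>
          <control type="photo"  field="canopy_photo"/>
        </table>
      </table>
    </form>
    """

    def walk(element, depth=0):
        """Recursively print the control layout, including nested tables."""
        for child in element:
            if child.tag == "control":
                print("  " * depth + f"- {child.get('field')} ({child.get('type')})")
            elif child.tag == "table":
                print("  " * depth + f"+ repeated table: {child.get('name')}")
                walk(child, depth + 1)

    if __name__ == "__main__":
        walk(ET.fromstring(FORM_XML))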

25 pages, 1951 KiB  
Article
IR.WoT: An Architecture and Vision for a Unified Web of Things Search Engine
by Cristyan Manta-Caro and Juan M. Fernández-Luna
Sensors 2024, 24(11), 3302; https://doi.org/10.3390/s24113302 - 22 May 2024
Viewed by 705
Abstract
The revolution of the Internet of Things (IoT) and the Web of Things (WoT) has brought new opportunities and challenges for the information retrieval (IR) field. The exponential number of interconnected physical objects and real-time data acquisition requires new approaches and architectures for IR systems. Research and prototypes can be crucial in designing and developing new systems and refining architectures for IR in the WoT. This paper proposes a unified and holistic approach for IR in the WoT, called IR.WoT. The proposed system contemplates the critical indexing, scoring, and presentation stages applied to some smart cities’ use cases and scenarios. Overall, this paper describes the research, architecture, and vision for advancing the field of IR in the WoT and addresses some of the remaining challenges and opportunities in this exciting area. The article also describes the design considerations, cloud implementation, and experimentation based on a simulated collection of synthetic XML documents with technical efficiency measures. The experimentation results show promising outcomes, whereas further studies are required to improve IR.WoT effectiveness, considering the WoT dynamic characteristics and, more importantly, the heterogeneity and divergence of WoT modeling proposals in the IR domain. Full article
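
The abstract does not detail the indexing and scoring stages, so the sketch below is only a toy stand-in: text is pulled from synthetic XML "thing descriptions", placed in an inverted index, and ranked with a simple TF-IDF score.

    # Toy illustration, not IR.WoT itself: inverted index over synthetic XML
    # thing descriptions with a simple TF-IDF ranking.
    import math
    import re
    import xml.etree.ElementTree as ET
    from collections import Counter, defaultdict

    THINGS = [
        '<thing id="t1"><name>air quality sensor</name><location>plaza</location></thing>',
        '<thing id="t2"><name>parking occupancy sensor</name><location>garage</location></thing>',
        '<thing id="t3"><name>air temperature sensor</name><location>park</location></thing>',
    ]

    def tokens(xml_text):
        text = " ".join(ET.fromstring(xml_text).itertext())
        return re.findall(r"[a-z]+", text.lower())

    def build_index(docs):
        index = defaultdict(dict)            # term -> {doc_id: term frequency}
        for doc_id, xml_text in enumerate(docs):
            for term, tf in Counter(tokens(xml_text)).items():
                index[term][doc_id] = tf
        return index

    def search(query, index, n_docs):
        scores = defaultdict(float)
        for term in re.findall(r"[a-z]+", query.lower()):
            postings = index.get(term, {})
            idf = math.log((1 + n_docs) / (1 + len(postings))) + 1
            for doc_id, tf in postings.items():
                scores[doc_id] += tf * idf
        return sorted(scores.items(), key=lambda item: -item[1])

    if __name__ == "__main__":
        index = build_index(THINGS)
        print(search("air sensor", index, len(THINGS)))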

11 pages, 558 KiB  
Article
Fuzzy Classification Approach to Select Learning Objects Based on Learning Styles in Intelligent E-Learning Systems
by Ibtissam Azzi, Abdelhay Radouane, Loubna Laaouina, Adil Jeghal, Ali Yahyaouy and Hamid Tairi
Informatics 2024, 11(2), 29; https://doi.org/10.3390/informatics11020029 - 15 May 2024
Viewed by 942
Abstract
In e-learning systems, even though the automatic detection of learning styles is considered the key element in the adaptation process, it does not represent the main goal of this process at all. Indeed, to accomplish the task of adaptation, it is also necessary to be able to automatically select the learning objects according to the detected styles. The classification techniques are the most used techniques to automatically select the learning objects by processing data derived from learning object metadata. By using these classification techniques, considerable results are obtained via several approaches and consist of mapping the learning objects into different teaching strategies and then mapping these strategies into the identified learning styles. However, these approaches have some limitations related to robustness. Indeed, a common feature of these approaches is that they do not directly map learning object metadata elements to learning style dimensions. Moreover, they do not consider the fuzzy nature of learning objects. Indeed, any learning object can be suitable for different learning styles at varying degrees of suitability. This highlights the need to find a way to remedy this shortcoming. Our work is part of the automatic selection of learning objects. So, we will propose an approach that uses the fuzzy classification technique to select learning objects based on learning styles. In this approach, the metadata of each learning object that complies with the Institute of Electrical and Electronics Engineers (IEEE) standard are stored in a database as an Extensible Markup Language (XML) file. The Fuzzy C Means algorithm is used, on one hand, to assign fuzzy suitability rates to the stored learning objects and, on the other hand, to cluster them into the Felder and Silverman learning styles model categories. The experiment results show the performance of our approach. Full article
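
The fuzzy suitability rates mentioned above come from Fuzzy C-Means; the sketch below implements the standard FCM update on toy feature vectors. The two clusters standing in for Felder-Silverman categories and the features themselves are illustrative assumptions, not the paper's metadata-derived features.

    # Hedged sketch of Fuzzy C-Means on toy learning-object feature vectors.
    import numpy as np

    def fuzzy_c_means(X, n_clusters=2, m=2.0, n_iter=100, seed=0):
        rng = np.random.default_rng(seed)
        u = rng.random((len(X), n_clusters))
        u /= u.sum(axis=1, keepdims=True)          # memberships sum to 1 per object
        for _ in range(n_iter):
            w = u ** m
            centers = (w.T @ X) / w.sum(axis=0)[:, None]
            dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-10
            u = 1.0 / (dist ** (2 / (m - 1)))
            u /= u.sum(axis=1, keepdims=True)
        return u, centers

    if __name__ == "__main__":
        # columns: [fraction of visual material, fraction of interactive exercises]
        learning_objects = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8], [0.5, 0.5]])
        memberships, centers = fuzzy_c_means(learning_objects)
        print(np.round(memberships, 2))            # suitability degree per style cluster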

17 pages, 6019 KiB  
Article
Digital Guardianship: Innovative Strategies in Preserving Armenian’s Epigraphic Legacy
by Hamest Tamrazyan and Gayane Hovhannisyan
Heritage 2024, 7(5), 2296-2312; https://doi.org/10.3390/heritage7050109 - 30 Apr 2024
Viewed by 826
Abstract
In the face of geopolitical threats in Artsakh, the preservation of Armenia’s epigraphic heritage has become a mission of both historical and cultural urgency. This project delves deep into Armenian inscriptions, employing advanced digital tools and strategies like the Oxygen text editor and EpiDoc guidelines to efficiently catalogue, analyze, and present these historical treasures. Amidst the adversities posed by Azerbaijan’s stance towards Armenian heritage in Artsakh, the digital documentation and preservation of these inscriptions have become a beacon of cultural resilience. The XML-based database ensures consistent data, promoting scholarly research and broadening accessibility. Integrating the Grabar Armenian dictionary addressed linguistic challenges, enhancing data accuracy. This initiative goes beyond merely preserving stone and text; it is a testament to the stories, hopes, and enduring spirit of the Armenian people in the face of external threats. Through a harmonious blend of technology and traditional knowledge, the project stands as a vanguard in the fight to ensure that Armenia’s rich epigraphic legacy and the narratives it enshrines remain undiminished for future generations. Full article
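
To make the encoding workflow above concrete, here is a minimal EpiDoc-style (TEI) record and a namespace-aware extraction of its edition text; the structure follows general EpiDoc conventions, while the sample inscription content is invented for illustration.

    # Minimal EpiDoc-style (TEI) record with invented sample content, plus
    # namespace-aware extraction of the edition text.
    import xml.etree.ElementTree as ET

    TEI_NS = {"tei": "http://www.tei-c.org/ns/1.0"}

    EPIDOC_XML = """
    <TEI xmlns="http://www.tei-c.org/ns/1.0">
      <teiHeader>
        <fileDesc>
          <titleStmt><title>Dedicatory inscription (sample)</title></titleStmt>
          <publicationStmt><p>Digital corpus entry</p></publicationStmt>
          <sourceDesc><p>Khachkar, monastery wall</p></sourceDesc>
        </fileDesc>
      </teiHeader>
      <text>
        <body>
          <div type="edition" xml:lang="xcl">
            <ab><lb n="1"/>This holy cross was erected in memory of the builders</ab>
          </div>
        </body>
      </text>
    </TEI>
    """

    def edition_text(xml_text):
        root = ET.fromstring(xml_text)
        edition = root.find(".//tei:div[@type='edition']", TEI_NS)
        return " ".join(" ".join(edition.itertext()).split())

    if __name__ == "__main__":
        print(edition_text(EPIDOC_XML))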

27 pages, 1266 KiB  
Article
A Meta Algorithm for Interpretable Ensemble Learning: The League of Experts
by Richard Vogel, Tobias Schlosser, Robert Manthey, Marc Ritter, Matthias Vodel, Maximilian Eibl and Kristan Alexander Schneider
Mach. Learn. Knowl. Extr. 2024, 6(2), 800-826; https://doi.org/10.3390/make6020038 - 9 Apr 2024
Viewed by 1484
Abstract
Background. The importance of explainable artificial intelligence and machine learning (XAI/XML) is increasingly being recognized, aiming to understand how information contributes to decisions, the method’s bias, or sensitivity to data pathologies. Efforts are often directed to post hoc explanations of black box models. These approaches add additional sources for errors without resolving their shortcomings. Less effort is directed into the design of intrinsically interpretable approaches. Methods. We introduce an intrinsically interpretable methodology motivated by ensemble learning: the League of Experts (LoE) model. We establish the theoretical framework first and then deduce a modular meta algorithm. In our description, we focus primarily on classification problems. However, LoE applies equally to regression problems. Specific to classification problems, we employ classical decision trees as classifier ensembles as a particular instance. This choice facilitates the derivation of human-understandable decision rules for the underlying classification problem, which results in a derived rule learning system denoted as RuleLoE. Results. In addition to 12 KEEL classification datasets, we employ two standard datasets from particularly relevant domains—medicine and finance—to illustrate the LoE algorithm. The performance of LoE with respect to its accuracy and rule coverage is comparable to common state-of-the-art classification methods. Moreover, LoE delivers a clearly understandable set of decision rules with adjustable complexity, describing the classification problem. Conclusions. LoE is a reliable method for classification and regression problems with an accuracy that seems to be appropriate for situations in which underlying causalities are in the center of interest rather than just accurate predictions or classifications. Full article
(This article belongs to the Special Issue Advances in Explainable Artificial Intelligence (XAI): 2nd Edition)
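
The LoE meta algorithm itself is not reproducible from this abstract; the sketch below only illustrates the general idea behind a derived rule learner such as RuleLoE, reading a fitted scikit-learn decision tree out as if-then rules.

    # Not the LoE/RuleLoE algorithm: a generic sketch of deriving readable
    # if-then rules from a fitted decision tree (toy iris data).
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier

    def tree_to_rules(tree, feature_names, class_names):
        t = tree.tree_
        rules = []

        def walk(node, conditions):
            if t.children_left[node] == -1:            # leaf node
                label = class_names[t.value[node][0].argmax()]
                rules.append("IF " + " AND ".join(conditions or ["TRUE"]) + f" THEN {label}")
                return
            name, thr = feature_names[t.feature[node]], t.threshold[node]
            walk(t.children_left[node], conditions + [f"{name} <= {thr:.2f}"])
            walk(t.children_right[node], conditions + [f"{name} > {thr:.2f}"])

        walk(0, [])
        return rules

    if __name__ == "__main__":
        data = load_iris()
        clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(data.data, data.target)
        for rule in tree_to_rules(clf, data.feature_names, data.target_names):
            print(rule)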

19 pages, 2635 KiB  
Article
Man-in-the-Loop Control and Mission Planning for Unmanned Underwater Vehicles
by Mengxue Han, Jialun Wang, Jianya Yuan, Zhao Wang, Dan Yu, Qianqian Zhang and Hongjian Wang
J. Mar. Sci. Eng. 2024, 12(3), 420; https://doi.org/10.3390/jmse12030420 - 27 Feb 2024
Viewed by 1040
Abstract
UUVs (unmanned underwater vehicles) perform tasks in the marine environment under direction from a commander through a mother ship control system. In cases where communication is available, a UUV task re-planning system was designed to ensure task completion despite uncertain events faced by UUVs. First, the XML language standardizes the expression of UUV task elements. Second, considering the time sequence and spatial path planning requirements of human-supervised UUV control tasks, time sequence planning based on a genetic algorithm and spatial path planning based on an improved genetic algorithm were designed to plan near-optimal approximate spatial paths for control tasks. Third, uncertainties encountered during UUV task execution were classified so that the commander could adjust according to the situation or invoke the control task re-planning algorithm to re-plan. Finally, a simulation platform was built using the QT development environment to simulate human-supervised UUV control task planning and re-planning, verifying the algorithm’s design effectiveness. Full article
(This article belongs to the Section Ocean Engineering)
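
The task-element XML and planning algorithms are not specified in the abstract; the sketch below shows a hypothetical mission document parsed into tasks, plus the kind of order-dependent route-length fitness a genetic time-sequence planner could minimize.

    # Hypothetical mission XML (not the paper's task-element schema) and a
    # simple route-length fitness for an ordering of tasks.
    import math
    import xml.etree.ElementTree as ET

    MISSION_XML = """
    <mission id="survey_01">
      <task name="sample_A"  x="100" y="250" duration="300"/>
      <task name="inspect_B" x="400" y="120" duration="600"/>
      <task name="sample_C"  x="220" y="480" duration="300"/>
    </mission>
    """

    def load_tasks(xml_text):
        return [
            {"name": t.get("name"), "x": float(t.get("x")), "y": float(t.get("y")),
             "duration": float(t.get("duration"))}
            for t in ET.fromstring(xml_text).findall("task")
        ]

    def route_length(order, tasks, start=(0.0, 0.0)):
        """Total travel distance for one visiting order: the quantity a
        GA-style sequence planner would try to minimise."""
        x, y = start
        total = 0.0
        for i in order:
            total += math.hypot(tasks[i]["x"] - x, tasks[i]["y"] - y)
            x, y = tasks[i]["x"], tasks[i]["y"]
        return total

    if __name__ == "__main__":
        tasks = load_tasks(MISSION_XML)
        print(route_length([0, 2, 1], tasks))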

20 pages, 454 KiB  
Article
Using Large Language Models to Enhance the Reusability of Sensor Data
by Alberto Berenguer, Adriana Morejón, David Tomás and Jose-Norberto Mazón
Sensors 2024, 24(2), 347; https://doi.org/10.3390/s24020347 - 6 Jan 2024
Cited by 1 | Viewed by 1914
Abstract
The Internet of Things generates vast data volumes via diverse sensors, yet its potential remains unexploited for innovative data-driven products and services. Limitations arise from sensor-dependent data handling by manufacturers and user companies, hindering third-party access and comprehension. Initiatives like the European Data Act aim to enable high-quality access to sensor-generated data by regulating accuracy, completeness, and relevance while respecting intellectual property rights. Despite data availability, interoperability challenges impede sensor data reusability. For instance, sensor data shared in HTML formats requires an intricate, time-consuming processing to attain reusable formats like JSON or XML. This study introduces a methodology aimed at converting raw sensor data extracted from web portals into structured formats, thereby enhancing data reusability. The approach utilises large language models to derive structured formats from sensor data initially presented in non-interoperable formats. The effectiveness of these language models was assessed through quantitative and qualitative evaluations in a use case involving meteorological data. In the proposed experiments, GPT-4, the best performing LLM tested, demonstrated the feasibility of this methodology, achieving a precision of 93.51% and a recall of 85.33% in converting HTML to JSON/XML, thus confirming its potential in obtaining reusable sensor data. Full article
(This article belongs to the Section Sensor Networks)
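
A hedged sketch of the conversion step with the OpenAI Python client: a scraped HTML snippet is prompted into JSON. The prompt wording, target fields, and model identifier are assumptions rather than the paper's exact setup, and a valid API key is required.

    # Hedged sketch: prompt a large language model to turn an HTML snippet into
    # structured JSON. Prompt, fields and model name are illustrative assumptions.
    import json
    from openai import OpenAI   # requires the `openai` package and an API key

    HTML_SNIPPET = """
    <table>
      <tr><th>Station</th><th>Temperature (°C)</th><th>Humidity (%)</th></tr>
      <tr><td>Alicante</td><td>21.4</td><td>63</td></tr>
    </table>
    """

    PROMPT = (
        "Convert the following HTML sensor readings into a JSON array of objects "
        "with keys 'station', 'temperature_c' and 'humidity_pct'. "
        "Return only JSON.\n\n" + HTML_SNIPPET
    )

    def html_to_json(prompt: str) -> list:
        client = OpenAI()
        response = client.chat.completions.create(
            model="gpt-4",
            messages=[{"role": "user", "content": prompt}],
            temperature=0,
        )
        return json.loads(response.choices[0].message.content)

    if __name__ == "__main__":
        print(html_to_json(PROMPT))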

16 pages, 3034 KiB  
Article
Optimizing RTL Code Obfuscation: New Methods Based on XML Syntax Tree
by Hanwen Yi, Jin Zhang and Sheng Liu
Appl. Sci. 2024, 14(1), 243; https://doi.org/10.3390/app14010243 - 27 Dec 2023
Viewed by 874
Abstract
As the most widely used description code in digital circuits and systems on chip (SoCs), register transfer level (RTL) code is extremely security-critical. Code obfuscation is a typical method for protecting RTL code, but popular obfuscation methods are not fully applicable to it, and some RTL code obfuscation tools suffer from incomplete functionality or obfuscation errors. In view of these issues, this paper studies RTL code security, with obfuscation as the representative technique. Based on the extensible markup language (XML) syntax tree generated by parsing RTL code, a complete RTL code refactoring model is constructed, and four targeted RTL code obfuscation methods are proposed: layout obfuscation, parameter obfuscation, critical path obfuscation, and code increment obfuscation. Using the developed obfuscation tool, the performance and effectiveness of the obfuscation methods are assessed, and the equivalence between the obfuscated code and the source code is tested. The experimental results show that the proposed obfuscation methods offer high practicability and reliability, achieving obfuscation coverage that remains stable above 98% while preserving compiler-indicative comments. Full article
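
The paper's XML syntax-tree format and obfuscation tool are not public in this abstract; the sketch below only conveys the flavour of identifier-level obfuscation over a hypothetical RTL AST fragment, renaming declared signals consistently wherever they appear.

    # Illustrative only: rename signal identifiers in a hypothetical XML AST
    # fragment; the tree format and renaming scheme are assumptions.
    import hashlib
    import xml.etree.ElementTree as ET

    RTL_AST_XML = """
    <module name="counter">
      <decl kind="reg"  name="count_value" width="8"/>
      <decl kind="wire" name="overflow_flag" width="1"/>
      <assign lhs="overflow_flag" rhs="count_value == 8'hFF"/>
    </module>
    """

    def obfuscated_name(original: str) -> str:
        """Deterministic, meaningless replacement name for an identifier."""
        return "n_" + hashlib.sha1(original.encode()).hexdigest()[:8]

    def obfuscate_identifiers(xml_text: str) -> str:
        root = ET.fromstring(xml_text)
        mapping = {d.get("name"): obfuscated_name(d.get("name")) for d in root.findall("decl")}
        for element in root.iter():
            for attr, value in list(element.attrib.items()):
                for old, new in mapping.items():
                    if old in value:
                        value = value.replace(old, new)
                element.set(attr, value)
        return ET.tostring(root, encoding="unicode")

    if __name__ == "__main__":
        print(obfuscate_identifiers(RTL_AST_XML))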

26 pages, 2738 KiB  
Article
Semantic Modelling Approach for Safety-Related Traffic Information Using DATEX II
by J. Javier Samper-Zapater, Julián Gutiérrez-Moret, Jose Macario Rocha, Juan José Martinez-Durá and Vicente R. Tomás
Information 2024, 15(1), 3; https://doi.org/10.3390/info15010003 - 19 Dec 2023
Viewed by 1733
Abstract
The significance of Linked Open Data datasets for traffic information extends beyond just including open traffic data. It incorporates links to other relevant thematic datasets available on the web. This enables federated queries across different data platforms from various countries and sectors, such as transport, geospatial, environmental, weather, and more. Businesses, researchers, national operators, administrators, and citizens at large can benefit from having dynamic traffic open data connected to heterogeneous datasets across Member States. This paper focuses on the development of a semantic model that enhances the basic service to access open traffic data through a LOD-enhanced Traffic Information System in alignment with the ITS Directive (2010/40/EU). The objective is not limited to just viewing or downloading data but also to improve the extraction of meaningful information and enable other types of services that are only achievable through LOD. By structuring the information using the RDF format meant for machines and employing SPARQL for querying, LOD allows for comprehensive and unified access to all datasets. Considering that the European standard DATEX II is widely used in many priority areas and services mentioned in the ITS Directive, LOD DATEX II was developed as a complementary approach to DATEX II XML. This facilitates the accessibility and comprehensibility of European traffic data and services. As part of this development, an ontological model called dtx_srti, based on the DATEX II Ontology, was created to support these efforts. Full article
(This article belongs to the Special Issue Knowledge Representation and Ontology-Based Data Management)
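
A small rdflib sketch in the spirit of the LOD DATEX II approach described above: a few safety-related triples and a SPARQL query over them. The dtx_srti class and property names (and the namespace IRI) are placeholders, not the published ontology terms.

    # Hedged sketch with rdflib: placeholder classes/properties, not the
    # published dtx_srti ontology.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF, XSD

    DTX = Namespace("http://example.org/dtx_srti#")   # placeholder namespace IRI

    g = Graph()
    g.bind("dtx", DTX)

    situation = URIRef("http://example.org/situations/accident-42")
    g.add((situation, RDF.type, DTX.SafetyRelatedSituation))
    g.add((situation, DTX.situationType, Literal("accident")))
    g.add((situation, DTX.road, Literal("A-7")))
    g.add((situation, DTX.severity, Literal(3, datatype=XSD.integer)))

    QUERY = """
    PREFIX dtx: <http://example.org/dtx_srti#>
    SELECT ?situation ?road WHERE {
      ?situation a dtx:SafetyRelatedSituation ;
                 dtx:road ?road ;
                 dtx:severity ?severity .
      FILTER (?severity >= 3)
    }
    """

    if __name__ == "__main__":
        for row in g.query(QUERY):
            print(row.situation, row.road)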