Measuring the effort of the people dedicated to software development has seldom been addressed in software engineering. The SEI proposed a set of disciplined practices for time management and for improving the personal productivity of programmers or software engineers in software development and maintenance tasks, by tracking planned performance against actual performance; Watts Humphrey named it the Personal Software Process (PSP) (Humphrey, 2000) in the 1990s. The studies reviewed show that, worldwide, the skills with which engineers enter the software engineering industry do not meet industry expectations, and how PSP is part of strengthening those skills.
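PSP's core feedback loop, tracking planned versus actual effort, can be illustrated with a minimal sketch. The metric and the task log below are illustrative examples, not data from the paper:

```python
# Illustrative sketch of PSP-style tracking: compare planned vs. actual
# effort per task and compute the percent estimation error, the basic
# feedback signal PSP uses to improve personal estimates over time.

def estimation_error(planned_minutes: float, actual_minutes: float) -> float:
    """Percent error of a plan against the actual outcome (PSP-style)."""
    return 100.0 * (actual_minutes - planned_minutes) / planned_minutes

# A small hypothetical log of (planned, actual) pairs for successive tasks.
log = [(120, 150), (60, 55), (90, 120)]
errors = [estimation_error(p, a) for p, a in log]
# Mean absolute error summarizes how far off the engineer's estimates are.
mean_abs_error = sum(abs(e) for e in errors) / len(errors)
```

Over successive tasks, a shrinking mean absolute error is the kind of measurable improvement PSP aims for.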
Voice over IP (VoIP) is the transmission of voice and multimedia content over Internet Protocol (IP) networks. This paper reviews the models, frameworks and auditing standards proposed to date for managing VoIP security, through a literature review that describes their historical and philosophical evolution. Three research questions are raised here: RQ1. What requirements must a security audit model for VoIP systems meet to achieve its goals? RQ2. Today, are there additional attacks that previous works have not considered? RQ3. Which security requirements of VoIP systems are covered (and which are not) by security frameworks? After a discussion of VoIP protocols, attacks on VoIP, Information Technology (IT) audit, IT security audits, and frameworks and auditing standards, we present a unified view of VoIP security requirements, as well as considering the contributions and disadvantages of frameworks.
Efforts have recently been made to construct ontologies for network security. The proposed ontologies address specific aspects of network security; therefore, it is necessary to identify which aspects are covered by the existing ontologies. A review and analysis of the principal issues, challenges, and extent of progress of the distinct ontologies was performed. Each example was classified according to the typology of ontologies for network security. The aspects considered include threat identification, intrusion detection systems (IDS), alerts, attacks, countermeasures, security policies, and network management tools. The research performed here proposes the use of three stages: 1. Inputs; 2. Processing; and 3. Outputs. The analysis resulted in the introduction of new challenges and aspects that may be used as the basis for future research. One major issue that was discovered is the need to develop new ontologies that relate to distinct aspects of network security.
The increasing complexity of new applications means GUIs are also getting more complex, and generating test cases for them manually becomes harder. Generating automatic, good-quality GUI test cases is a growing concern in application testing. Actions performed by the user on the GUI can be regarded as events, which can be performed in sequences, forming a graph of event sequences; therefore, multiple execution paths or routes, known as test cases, are possible. The quality of a set of test cases is measured by the coverage criteria (all actions or events must be performed at least once in the set), which depend on the length and partial coverage of each execution path. Finding feasible paths while complying with the coverage criteria is a highly combinatorial problem. For such problems, given the computing power it would take to find an exact solution, it is well justified to use heuristic and metaheuristic algorithms, allowing us to find approximate solutions of good quality.
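The event-graph view of a GUI described above can be sketched concretely: enumerate bounded-length event sequences as candidate test cases and check the event-coverage criterion. The toy graph, the bound, and the function names are illustrative assumptions, not the paper's algorithm:

```python
# Sketch: GUI actions as nodes of an event graph; candidate test cases
# are bounded-length event sequences, and the coverage criterion asks
# that every event be performed at least once across the suite.

def paths(graph, start, max_len):
    """All event sequences from `start` up to `max_len` events (DFS)."""
    stack = [[start]]
    out = []
    while stack:
        path = stack.pop()
        out.append(path)
        if len(path) < max_len:
            for nxt in graph.get(path[-1], []):
                stack.append(path + [nxt])
    return out

def covers_all_events(test_suite, events):
    """Coverage criterion: each event appears in at least one test case."""
    return set(e for p in test_suite for e in p) >= set(events)

# Toy event graph: open -> {edit, close}, edit -> {save}, save -> {close}
graph = {"open": ["edit", "close"], "edit": ["save"], "save": ["close"]}
suite = paths(graph, "open", 4)
```

Even this toy shows the combinatorial growth: the number of paths grows quickly with the length bound, which is why heuristics are used to pick a small suite that still covers all events.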
... Title: Global water pollution monitoring using a nanosatellite constellation (WAPOSAT) Primary POC: Jhosep Guzman, German Comina, Tarsila Tuesta, Gilberto García, Carlos Negron, ... rivers and water reserves worldwide, linked to the Amazonas rain forest and its monitoring is ...
Software testing is a key factor in any software project; testing costs are significant in relation to development costs. Therefore, it is essential to select the most suitable testing techniques for a given project in order to find defects at the lowest possible cost at the different testing levels. However, in many projects, testing practitioners do not have a deep understanding of the full array of techniques available, and they adopt the same techniques that were used in prior projects, or any available technique, without taking into consideration the attributes of each testing technique. There is existing research oriented to supporting the selection of software testing techniques; nevertheless, it is based on static catalogues, whose adaptation to any niche software application may be slow and expensive. In this work, we introduce a content-based recommender system that offers a ranking of software testing techniques based on a characterization of the target project and the evaluation of testing techniques in similar projects. The repository of projects and techniques was populated through the collaborative effort of a community of practitioners. It has been found that the difference between the recommendations of SoTesTeR and those of a human expert is similar to the difference between the recommendations of two different human experts.
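The content-based recommendation idea can be sketched with a generic similarity-weighted ranking. This is a hedged illustration of the general technique, not SoTesTeR's actual algorithm; the feature vectors, technique names, and scores are hypothetical:

```python
# Sketch of a content-based recommendation step: characterize projects
# as feature vectors, weight past projects by cosine similarity to the
# target, and rank techniques by their similarity-weighted evaluations.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def rank_techniques(target, past_projects):
    """past_projects: list of (features, {technique: score}) pairs."""
    scores = {}
    for features, evals in past_projects:
        w = cosine(target, features)        # similarity-weighted vote
        for tech, s in evals.items():
            scores[tech] = scores.get(tech, 0.0) + w * s
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical repository: two past projects with technique evaluations.
past = [([1, 0, 1], {"exploratory": 4, "boundary": 2}),
        ([0, 1, 0], {"boundary": 5})]
ranking = rank_techniques([1, 0, 1], past)
```

The target project here matches the first past project exactly, so that project's evaluations dominate the ranking.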
Honeynets originated as a security tool designed to be tracked, attacked and compromised by hypothetical intruders. They consist of network environments and sets of applications, and after being installed and configured with all of these components, the Honeynet is ready to be attacked with the purpose of maintaining a controlled environment for the study of the events that occurred. Through the analysis of these events, it is possible to understand the objectives, tactics and interests that the attackers have for the proposed environment. This paper describes the state of the art of Honeynets, referring to architectures, Honeynet types, tools used in Honeynets, Honeynet models and applications in the real world that are focused on capturing information.
Requirements elicitation is recognized as one of the most important activities in the software development process, as it has a direct impact on its success. Although there are many proposals for improving this task, there are still issues that have to be solved. This paper aims to identify the current status of the latest research on software requirements elicitation through a general framework for literature review, in order to answer the following research questions: Q1) What aspects have been covered by the different proposals for requirements elicitation? Q2) What activities of the requirements elicitation process have been covered? And Q3) What factors influence requirements elicitation, and how? A cross-analysis of the outcome was performed. One of the results showed that the requirements elicitation process needs improvements.
The PREDECI model is a model for the preservation of digital evidence in criminal investigation institutions, where it is essential to preserve evidence in a way that accounts for the characteristics of the environment, with the purpose of increasing the rate of admissibility of the evidence in court. The aim of this study was to establish a deployment guide to facilitate the implementation of the PREDECI model. This guide includes organizational, operational and administrative aspects and establishes an implementation plan, a development plan and a plan for evaluating the implementation, based on the reviewed bibliography and the model's characteristics. It is necessary to have a political framework, an initial framework, an organizational framework, an administrative framework, an infrastructure and security framework, a framework for institutions of criminal investigation, a framework for the generation of tools and an evaluation framework.
The model to evaluate is a model for the preservation of digital evidence in criminal investigation institutions, where it is essential to preserve evidence in a way that accounts for the characteristics of the environment, with the purpose of increasing the rate of admissibility of the evidence in court. This article aims to evaluate the model and its impact in terms of security, admissibility, and long-term preservation characteristics. We respond to the following research question: Does the model, implemented in a software application for a case study, raise the admissibility of digital evidence in court? Thus, a software application is developed, the unit of study is defined, and the results are analysed. The study determined that the model, when implemented properly and following its implementation guide, raises the admissibility of digital evidence in court.
This article's objective is to screen and analyse the common models of digital preservation that exist, their elements, their degree of compliance with the general guidelines, their use of techniques and compliance with specific requirements, as well as to evaluate the need for a solution for the environment of criminal investigation institutions, a scenario that lacks a specific model. The importance of the preservation of digital objects is currently heavily analysed. Several factors can render digital objects worthless, such as hardware obsolescence, the inability of obsolete computing formats to support their use, human error and malicious software. The majority of crimes currently have a digital component, such that governments and the police are obliged by law to hold digital evidence indefinitely for a case's history. Until the presentation of the digital evidence in court, the evidence must be collected, preserved and properly distributed. The systems currently used often involve multiple steps that do not meet the demands of the growing digital world. The volume of digital evidence continues to grow, and these steps will soon become operationally and economically unfeasible for the agencies responsible for performing these tasks.
Any digital device generates information that may become valuable evidence in the event of a cybercrime incident, security incident, or cyber-attack, but often the collection, management and preservation of this information is not done properly. In the legal field, once information has been obtained from the devices, it is very important to maintain and preserve it from the initial moment, through the investigation, until the trial or investigation is concluded, and to preserve it for long-term use, in order to prevent it from being tainted, damaged, changed or manipulated, thus assuring reliability throughout the whole process. Preservation of digital evidence is an important aspect when deciding its admissibility in a trial in process, or in any future process, whether reopened on appeal or as a source of historical information. This paper contains a review of the state of the art of digital preservation in institutions dedicated to criminal investigation, analyzing concepts, related projects, tools and legal support in this area. The motivation of this paper is to find out how close we are to having a framework useful for preserving digital evidence, ensuring integrity, hence increasing its admissibility, and supported by long-term preservation techniques.
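One concrete safeguard behind "ensuring integrity" is fixing a cryptographic digest at collection time so that later tampering is detectable. This is a minimal standard-library illustration of that general idea, not a procedure from the paper; a real chain of custody would also sign and timestamp the digests:

```python
# Minimal integrity check for preserved digital evidence: record a
# SHA-256 fingerprint at collection time, then re-hash and compare
# whenever the evidence is produced again (e.g. at trial).
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 fingerprint recorded when the evidence is collected."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, recorded: str) -> bool:
    """Re-hash and compare with the recorded fingerprint."""
    return digest(data) == recorded

evidence = b"disk-image-bytes"            # hypothetical evidence payload
recorded = digest(evidence)               # stored with the case file
intact = verify(evidence, recorded)       # unchanged evidence passes
tampered = verify(evidence + b"x", recorded)  # any change is detected
```

Because any single-bit change produces a different digest, the recorded fingerprint lets a court check that the evidence presented is the evidence collected.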
A problem common to many industries is minimizing waste when cutting rectangular pieces of variable quantity and size from sheets of fixed size. The case we address is guillotine cutting, where the rectangular piece has no orientation requirement, so the possible combinations increase with the possibility of 90-degree rotation. Other researchers have previously addressed this problem with many methods: linear programming, exact methods, heuristics and assorted metaheuristics. In this article we present a GRASP algorithm with a grouping heuristic that solves this problem. Its results compare favourably, in waste minimization, with the results of other research.
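The GRASP metaheuristic named above follows a standard pattern: repeated greedy-randomized construction using a restricted candidate list (RCL), keeping the best solution found. The sketch below shows that generic skeleton on a deliberately simplified 1-D stand-in (packing piece areas into a sheet of fixed area); it is not the paper's grouping heuristic or its 2-D guillotine placement:

```python
# Generic GRASP skeleton: greedy-randomized construction with an RCL,
# applied to a toy waste-minimization stand-in (1-D area packing).
import random

def grasp_pack(areas, capacity, alpha=0.5, iters=50, seed=0):
    rng = random.Random(seed)
    best = []
    for _ in range(iters):
        remaining = sorted(areas, reverse=True)
        sol, cap = [], capacity
        while remaining:
            feasible = [a for a in remaining if a <= cap]
            if not feasible:
                break
            lo, hi = min(feasible), max(feasible)
            # RCL: candidates within alpha of the greedy-best value.
            rcl = [a for a in feasible if a >= hi - alpha * (hi - lo)]
            pick = rng.choice(rcl)          # randomized greedy choice
            sol.append(pick)
            cap -= pick
            remaining.remove(pick)
        if sum(sol) > sum(best):            # keep the least-waste solution
            best = sol
    return best, capacity - sum(best)       # (pieces used, waste)

pieces, waste = grasp_pack([5, 4, 3, 2, 2], capacity=10)
```

The randomization in the RCL is what lets GRASP escape the pure-greedy choice (which here wastes one unit) and reach lower-waste solutions across iterations.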
The planning problem is very frequent in the production and logistics areas of industry, because it involves costs, times, demand and profitability. Berth allocation in a maritime port aims to support port business processes in order to maximize profits and reduce costs, meeting demand and avoiding excessive waiting of vessels in the bay. In this work, other studies on maritime ports, airports and freight trains have been taken as references, and we have chosen to attack the problem with a multi-objective GRASP method to assign berths at a given moment and to plan the assignment over a time interval.
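A multi-objective method needs a way to score candidate assignments across competing objectives. The sketch below shows one common approach, a weighted sum of vessel waiting time and operating cost, as a search like GRASP might use to compare candidate berth plans; the weights, data, and field names are hypothetical, not the paper's formulation:

```python
# Illustrative multi-objective evaluation for a berth assignment:
# combine total vessel waiting time and operating cost with weights.

def evaluate(assignment, weights=(0.7, 0.3)):
    """assignment: list of per-vessel dicts with 'wait' (hours) and 'cost'.
    Lower score is better; weights trade waiting time against cost."""
    total_wait = sum(v["wait"] for v in assignment)
    total_cost = sum(v["cost"] for v in assignment)
    w_wait, w_cost = weights
    return w_wait * total_wait + w_cost * total_cost

# Two hypothetical candidate plans for the same pair of vessels.
plan_a = [{"wait": 2, "cost": 10}, {"wait": 0, "cost": 14}]
plan_b = [{"wait": 5, "cost": 8}, {"wait": 1, "cost": 9}]
better = min([plan_a, plan_b], key=evaluate)
```

Plan A costs more to operate but keeps vessels waiting far less, so under these weights it scores better; shifting the weights toward cost would flip the preference, which is exactly the trade-off a multi-objective search explores.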
Digital preservation is considered a vital process for keeping information over the long term; it has a short history and a fairly coherent and well-justified foundation. In this article, a comparison of models and standards of digital preservation is developed, along with the current situation of digital preservation and an analysis of the most outstanding projects, which can serve as a basis for applying an existing preservation model in a specific setting, or for proposing a new one. A comparison of the existing models and their application in the specific environment of criminal investigation institutions is performed by determining their applicability to the relevant aspects required in this environment. It is important to bear in mind that for every application domain and its reality, there is a very high probability of requiring a particular model, which can be based on the existing ones.
Exploration of high-risk areas is a topic that has motivated the development of mobile robotics in recent years. Moreover, the incursion of multi-agent systems into this field has opened up many solutions and applications. In this paper we propose a strategy of exploration and mapping for multi-robot systems in underground mines where the concentration of toxic gases (e.g., CO2, CO, Sb) is unknown. The principal algorithm is the behavior control, which evaluates the status of each agent and makes decisions that maximize system performance and minimize cost. We use scanning algorithms based on dynamic graphs to reduce bandwidth consumption and memory use. The system has been tested by simulating several situations, such as partial loss of communications or of agents.
This research covers the technical and soft skills required in software development, a globally in-demand activity. These skills are required to perform efficiently in a very globalized, competitive field. Previous studies show the existence of a gap between industry and academia. Training centers provide professionals with technical skills, but as of late, the need for soft skills has become apparent, making it important to study and find ways to embed them into the professionals who join the industry. The application of the Personal Software Process (PSP) methodology helps students to strengthen their skills in the software development process; however, the method is not widespread in academia or industry. This article shows the importance of soft skills, according to Latin American companies, and how PSP can reduce the skills gap.
Currently, software products are proliferating rapidly and are used in almost all activities of human life. Consequently, measuring and evaluating the quality of a software product has become a critical task for many companies. Several models have been proposed to help diverse types of users with quality issues. The development of techniques for building software has influenced the creation of models to assess quality. Since 2000, the construction of software has come to depend on generated or manufactured components, giving rise to new challenges for assessing quality. These components introduce new concepts such as configurability, reusability, availability, better quality and lower cost. Consequently, the models are classified into basic models, which were developed until 2000, and component-based models, called tailored quality models. The purpose of this article is to describe the main models with their strengths and to point out some deficiencies. In this work, we conclude that, in the present age, communications aspects play an important role in the quality of software.
A bibliographic review of empirical studies of Web 2.0 tools for collaborative learning: wikis, blogs, social networks and multimedia repositories. Collaborative learning involves making decisions to support the educational field with Web 2.0 tools. Several studies described in this review, conducted under the parameters of a research protocol, reveal relevant empirical evidence that supports this claim. The main objective of this work is to identify the effective contribution of Web 2.0 tools to academic performance through collaborative learning, in order to contribute to the efficiency and effectiveness of education. To achieve this objective, we considered databases such as Science Direct, IEEE and Google Scholar. We included empirical studies that specifically reflect the significant contribution of Web 2.0 tools to students' collaborative learning. None of the studies reported a detrimental effect on learning; however, the positive effects are not necessarily attributed to the technologies themselves, but to how the technologies are used and conceived as a learning approach. The results of this review found that blogs and social networks appear to be the Web 2.0 tools best suited to collaborative learning, because they enable collaboration and social, affective and cognitive processes, but more comparative research is needed. Keywords: collaborative learning, Web 2.0, empirical studies, research methods, learning styles.
Bring your own device (BYOD) is adopted for the benefits of allowing the use of mobile devices to perform business tasks, but the following questions should be analysed before any organisation adopts a BYOD environment: What threats do mobile devices used in BYOD environments currently face? What are the characteristics of the security models proposed so far to manage BYOD? Are there integral security models with the minimum requirements needed to take full advantage of BYOD? What are the basic controls that should be taken into account when designing security policies for a business with a BYOD environment? What are the main differences between security in BYOD environments and security in corporate-owned mobile device environments, regarding how easy it is to meet security needs? By answering these questions, we will have a clear vision of what it takes to adopt BYOD in an organisation.
Software regression testing (RT) verifies previous features of a software product when it is modified or new features are added. Because of its nature, regression testing is a costly process. Different approaches have been proposed to reduce the costs of this activity, among which are minimization, prioritization and selection of test cases. Recently, soft computing techniques, such as data mining, machine learning and others, have been used to make regression testing more efficient and effective. Currently, in different contexts and to a greater or lesser extent, software products have access to databases (DBs). Given this situation, it is necessary to consider regression testing also for software products, such as information systems, that are usually integrated with or connected to DBs. In this work, we present a regression test selection approach that utilizes a combination of unsupervised clustering with random values, unit tests and the DB schema to determine the test cases related to modifications or new features added to software products connected to DBs. Our proposed approach is empirically evaluated with two database software applications in a production context. Effectiveness metrics such as test suite reduction, fault detection capability, recall, precision and the F-measure are examined. Our results suggest that the proposed approach is sufficiently effective with the resulting clusters of test cases.
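The clustering step can be illustrated with a generic k-means over test-case feature vectors, for instance which DB tables each test touches, so that clusters overlapping modified tables can be selected for re-execution. This is a textbook sketch of unsupervised clustering with random initial values, not the paper's exact procedure, and the feature data is hypothetical:

```python
# Generic k-means over test-case feature vectors: random initial
# centroids, assign each point to its nearest centroid, recompute
# centroids, repeat.
import random

def kmeans(points, k, iters=20, seed=1):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)          # random initial values
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: sum((a - b) ** 2
                    for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # Recompute each centroid as its cluster mean (keep old if empty).
        centroids = [tuple(sum(xs) / len(xs) for xs in zip(*cl))
                     if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return clusters

# Each test case as (touches_table_orders, touches_table_users).
tests = [(1, 0), (1, 0.1), (0, 1), (0.1, 1)]
clusters = kmeans(tests, k=2)
```

On this toy data the two groups of tests (orders-related vs. users-related) separate cleanly; a selection step would then re-run only the cluster matching the modified schema objects.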
This paper presents a model for the preservation of digital evidence in criminal research institutions. The objective of the model is to include features for the admissibility of evidence in court and to provide support for fulfilling the legal requirements and performance needs of this sector. In addition to the model, there is an implementation guide, which includes an implementation plan, a development plan, and an evaluation plan. It is intended for criminal research institutions and can be used as a basis and reference for the preservation of digital evidence, enabling them to align with business strategies and meet the needs of the institution. This research proposes a framework to sustain the preservation of digital evidence, which ensures better integrity and increases the admissibility of the evidence, supported by long-term preservation techniques based on the OAIS preservation model.
Software regression testing techniques verify previous functionality each time software is modified or new features are added. With the aim of gaining a better understanding of this subject, in this work we present a survey of software regression testing techniques applied over the last 15 years, taking into account their application domains, the kinds of metrics they use, their application strategies and the phase of the software development process where they are applied. From an initial set of 460 papers, 25 papers describing the use of 31 software regression testing techniques were identified. The results of this survey suggest that, when applying a regression testing technique, metrics such as cost and fault-detection efficiency are the most relevant. Most of the techniques were assessed with instrumented programs (experimental cases) in academic settings. Conversely, we observed a minimal set of software regression techniques applied in industrial settings, mainly...
Using FDTD to study wave propagation in large urban areas requires a huge amount of memory and processing time. This research addresses the memory problem using a special formulation of 2-D FDTD. A system of linear equations is presented as a mathematical equivalent of Yee's equations. A recursive algorithm using Gaussian back-substitution is presented; it was adapted to parallel processing on a network of workstations and used to solve this system. Results that validate this approach are presented, and the advantages of and future improvements to this method are discussed.
This paper deals with the simulation of large problems using the FDTD method. FDTD consumes a lot of memory and computing time. Most research focuses on processing time, but memory is the first obstacle to simulating large problems. The common FDTD equations are reformulated into a large system of linear equations; no factorization is required. A new recursive algorithm is proposed, and a parallel version of the algorithm is also introduced. Numerical comparisons are presented, along with conclusions about the advantages of the algorithm and the improvements required to make it practical.
The widespread adoption of wireless personal communications in recent years has brought new interest in the study of electromagnetic wave propagation and scattering over large areas, such as the urban areas served by telecommunication companies. One of the simulation methods for this kind of study is the FDTD method, but it is not convenient for large-scale problems. In this paper we present a new approach, which transforms Yee's formulas and algorithm into an equivalent representation as a system of linear equations. The coefficient matrix is triangular, and no factorization is required. An iterative, quasi-recursive algorithm is shown to be able to deal with 2-D problems on the order of 100,000 cells per side. Besides an explanation of the nature of the algorithm, a numerical example is given: a simulation of electromagnetic propagation in an urban environment.
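The key property exploited in these papers, a triangular coefficient matrix that can be solved by substitution with no factorization step, can be illustrated with a minimal sketch. The 3x3 matrix and right-hand side below are invented for illustration and are not the papers' actual 2-D FDTD coefficients:

```python
import numpy as np

def solve_lower_triangular(L, b):
    """Forward substitution: solve L x = b for a lower-triangular L.
    No factorization is needed; each unknown is recovered in one pass,
    which is what makes a triangular formulation memory-friendly."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n):
        # Subtract the contribution of already-solved unknowns, then divide.
        x[i] = (b[i] - L[i, :i] @ x[:i]) / L[i, i]
    return x

# Hypothetical small system (not from the papers).
L = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 1.0, 5.0]])
b = np.array([2.0, 5.0, 11.0])
x = solve_lower_triangular(L, b)
```

Because each row depends only on previously computed entries, the same recurrence parallelizes naturally across blocks of rows, which is the idea behind the papers' network-of-workstations version.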
The Center for Technological Information and Communications (CTIC) at the National University of Engineering (UNI, for its initials in Spanish) in Lima, Peru, is in charge of designing a national concept for the implementation of a network of ground stations for small-satellite missions. CTIC is currently investigating how to carry out this challenging project for the
In satellite missions, there are many complex factors requiring complex software or hardware design, for example orbital calculation and Doppler shift correction. In optimization and computer-aided design, the use of surrogate models has been increasing lately. These models replace a complex calculation or simulation with a simpler one that provides a good approximation. Neural networks, support vector machines and DACE models have been used, but genetic programming is another way to create surrogate models, and little research has been done on it. An advantage of using simpler models in small-satellite missions, such as CubeSats, is that they are less demanding regarding circuits (both in cost and in power consumption) and memory. If the approximation is good, the surrogate model may be enough. These savings could be multiplied by a factor of 20 or more if the surrogate models are applied to constellations of small satellites with 20 or more individual satellites. In this paper, genetic programming is compared against neural networks for creating surrogate models for orbital calculations and Doppler shift. The models are created by machine learning; that is, the method takes a set of experimental or calculated samples and uses them to create a model that approximates those samples. Genetic programming uses an evolutionary approach that evolves trees representing unstructured mathematical functions formed from an alphabet of basic operations (in this paper: constants, +, −, ∗, /, sin, cos, log, exp). The main metrics of success are the maximum absolute error, the MAE (mean absolute error) and the RMSE (root mean square error) against a larger set of validation samples.
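The three validation metrics named in the abstract are standard and easy to state precisely. A minimal sketch, with made-up sample values in place of the paper's orbital data:

```python
import math

def surrogate_error_metrics(y_true, y_pred):
    """Maximum absolute error, MAE (mean absolute error) and RMSE
    (root mean square error) of a surrogate against validation samples."""
    residuals = [abs(t - p) for t, p in zip(y_true, y_pred)]
    max_abs = max(residuals)
    mae = sum(residuals) / len(residuals)
    rmse = math.sqrt(sum(r * r for r in residuals) / len(residuals))
    return max_abs, mae, rmse

# Hypothetical validation samples (not from the paper's data set).
y_true = [1.0, 2.0, 3.0, 4.0]
y_pred = [1.1, 1.9, 3.2, 3.8]
max_abs, mae, rmse = surrogate_error_metrics(y_true, y_pred)
```

Note that RMSE is always at least as large as MAE, so a gap between the two signals a few large residuals, which is why the maximum absolute error is reported separately.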
The increasing complexity of new applications means GUIs are also getting more complex, and generating test cases for them manually becomes harder. Generating automatic, good-quality GUI test cases is a growing concern in application testing. Actions performed by the user on the GUI can be regarded as events, which can be performed in sequences, forming a graph of event sequences; therefore, multiple execution paths or routes, known as test cases, are possible. The quality of a set of test cases is measured by the coverage criteria (all actions or events must be performed at least once in the set), which depend on the length and partial coverage of each execution path. Finding feasible paths that comply with the coverage criteria is a highly combinatorial problem. For such problems, due to the high computing power it would take to find an exact solution, it is well justified to use heuristic and metaheuristic algorithms, which allow us to find approximate solutions of good quality. Those methods have been successfully used in chemistry, physics, biology and, recently, in software engineering. In this paper, the use of a metaheuristic known as the Ant Colony Optimization (ACO) algorithm for generating test cases is proposed. The ACO metaheuristic has been adapted to find individual routes that could lead to a set of test cases of good quality. An individual test, path or route is desirable if it is long (it tests many events or actions) and does not share events (or shares few events) with other paths. After an appropriate number of candidate test cases are generated, we express the problem of generating a set of test cases as a set covering problem, and then we apply a greedy algorithm to solve it. The result is a set of paths (test cases) with full coverage of events and a small number of test cases. We also present a problem solved by our method, generating test cases for Windows WordPad, and discuss the results.
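The second stage described above, selecting a small subset of candidate paths that still covers every event, is a classic greedy set-cover step. A minimal sketch, with invented GUI events and candidate paths standing in for the ACO-generated routes:

```python
def greedy_set_cover(universe, candidates):
    """Greedy set-cover heuristic: repeatedly pick the candidate path
    that covers the most still-uncovered events, until all events in
    the universe are covered or no candidate adds coverage."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(candidates, key=lambda path: len(uncovered & set(path)))
        if not uncovered & set(best):
            break  # remaining events are not coverable by any candidate
        chosen.append(best)
        uncovered -= set(best)
    return chosen

# Hypothetical GUI events and candidate test paths (illustrative only).
events = {"open", "type", "save", "print", "close"}
paths = [("open", "type", "save"),
         ("open", "print"),
         ("type", "close"),
         ("save", "close")]
selected = greedy_set_cover(events, paths)
```

The greedy rule does not guarantee the minimum number of paths, but it gives a logarithmic approximation, which is typically good enough for keeping the final test suite small.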

Software testing is a key factor in any software project; testing costs are significant relative to development costs. Therefore, it is essential to select the most suitable testing techniques for a given project to find defects at the lowest possible cost across the different testing levels. However, in many projects, testing practitioners do not have a deep understanding of the full array of techniques available, and they adopt the same techniques that were used in prior projects, or any available technique, without taking into consideration the attributes of each testing technique. There is existing research aimed at supporting the selection of software testing techniques; nevertheless, it is based on static catalogues, whose adaptation to any niche software application may be slow and expensive. In this work, we introduce a content-based recommender system that offers a ranking of software testing techniques based on a target-project characterization and the evaluation of testing techniques in similar projects. The repository of projects and techniques was populated through the collaborative effort of a community of practitioners. We found that the difference between the recommendations of SoTesTeR and those of a human expert is similar to the difference between the recommendations of two different human experts.
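The core of a content-based recommender of this kind is a similarity score between the target project's characterization and the projects in the repository, with each technique scored by how well it fared in similar projects. A minimal sketch; the feature vectors, technique names and ratings are invented for illustration and are not SoTesTeR's actual schema:

```python
import math

def cosine(u, v):
    """Cosine similarity between two project-characterization vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def rank_techniques(target, repository):
    """Weight each technique's past rating by the similarity of the
    project where it was evaluated, accumulate, and rank descending."""
    scores = {}
    for technique, project_vec, rating in repository:
        scores[technique] = scores.get(technique, 0.0) + cosine(target, project_vec) * rating
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical repository entries: (technique, project features, rating 0-1).
repo = [("boundary-value", [1, 0, 1], 0.9),
        ("random testing", [0, 1, 0], 0.8),
        ("pairwise",       [1, 1, 1], 0.7)]
ranking = rank_techniques([1, 0, 1], repo)
```

Here the technique evaluated in the project most similar to the target ends up first, independent of its raw rating, which is the behavior a content-based recommender is meant to have.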
The PREDECI model is a model for the preservation of digital evidence in criminal investigation institutions, where it is essential to preserve evidence together with the characteristics of its environment, with the purpose of increasing the rate of admissibility of the evidence in court. The aim of this study was to establish a deployment guide to facilitate the implementation of the PREDECI model. This guide includes organizational, operational and administrative aspects and establishes an implementation plan, a development plan and a plan for evaluating the implementation, based on a bibliographic review and the model's characteristics. It is necessary to have a political framework, an initial framework, an organizational framework, an administrative framework, an infrastructure and security framework, a criminal investigation institutions framework, a tool-generation framework and an evaluation framework.
Software engineering (SE) practitioners must follow mature, quality processes so that they can compete in an industry that demands quality products. To address this need, the SEI proposed a personal methodology called PSP to improve the SE skill set of practitioners. In Latin America, there are more than 1,200 professionals certified in this methodology, but the number should be higher if PSP indeed helps improve SE skills. This work uses data from job advertisement portals and a survey of recruiters to find out which soft skills are requested by companies in the Latin American software industry, and compares them with the responses of PSP professionals regarding the strengths of PSP training. The comparison brought favorable results. A proposal for integrating PSP into the curriculum of a software engineering program is then presented.
In an iterative and incremental development environment, software regression testing plays an important role; it helps to ensure reliability in the building process of a software product. The cost of a regression test depends on the size of the test suite to be executed. Regression testing helps to verify modifications to existing functionality (bug fixes) or new features added to a software product. When regression testing is applied to database applications, it is necessary to consider aspects of the database along with the product code in order to guarantee proper verification of the product. In this paper we present an approach for conducting regression tests on database applications in an iterative and incremental development setting.
Voice over IP (VoIP) is the transmission of voice and multimedia content over Internet Protocol (IP) networks. This paper reviews the models, frameworks and auditing standards proposed to date to manage VoIP security, through a literature review with descriptions of both the historical and philosophical evolution, reflecting an adequate knowledge of related research. Three research questions are raised here: RQ1. What requirements must a security audit model for VoIP systems meet to achieve its goals? RQ2. Are there additional attacks today that previous works have not considered? RQ3. Which security requirements of VoIP systems are covered (and which are not) by security frameworks? After some discussion of VoIP protocols, attacks on VoIP, Information Technology (IT) audit, IT security audits, frameworks and auditing standards, we present a unified view of VoIP security requirements, and we consider the contributions and disadvantages of frameworks and auditing standards toward achieving those requirements through a comparative evaluation. It was determined that no security framework considers social engineering attacks, despite these being an important aspect of VoIP security management; there is also no specific framework that covers all categories of security requirements for VoIP systems, and therefore a more extensive model is needed.
Efforts have recently been made to construct ontologies for network security. The proposed ontologies are related to specific aspects of network security; therefore, it is necessary to identify the specific aspects covered by existing ontologies for network security. A review and analysis of the principal issues, challenges and the extent of progress of the distinct ontologies was performed. Each example was classified according to a typology of ontologies for network security. Some of the aspects covered include identifying threats, intrusion detection systems (IDS), alerts, attacks, countermeasures, security policies and network management tools. The research performed here proposes the use of three stages: 1. Inputs; 2. Processing; and 3. Outputs. The analysis resulted in the identification of new challenges and aspects that may serve as the basis for future research. One major finding is the need to develop new ontologies that address distinct aspects of network security, thereby facilitating management tasks.

Any digital device generates information that may become valuable evidence in the event of a cybercrime, security incident, or cyber-attack, but the collection, management and preservation of this information is often not done properly. In the legal field, once information has been obtained from the devices, it is very important to maintain and preserve it from the initial moment, through the investigation, until the trial or investigation is concluded, and to preserve it for the long term, in order to prevent it from being tainted, damaged, changed or manipulated, thus assuring reliability throughout the whole process. Preservation of digital evidence is an important aspect when deciding its admissibility in an ongoing trial, in any future process reopened on appeal, or as a source of historical information. This paper reviews the state of the art of digital preservation in institutions dedicated to criminal investigation, analyzing concepts, related projects, tools and legal support in this area. The motivation for this paper is to find out how close we are to having a framework useful for preserving digital evidence, ensuring integrity, hence increasing its admissibility, and supported by long-term preservation techniques.