Christian M. Adriano
Hasso Plattner Institute, System Analysis and Modeling, Graduate Student
— Fault localization is an intensively investigated field with a plethora of useful methods and tools. Most tools rely on the analysis of code and bug repositories by means of automated solutions and metrics. Suspiciousness is one such metric, demonstrated to localize possibly faulty code with acceptable precision. Our aim is to investigate a method in which a crowd can be enlisted to improve fault localization. Such a general strategy is not new and has been in place for a number of years in the form of beta tests. Although both approaches have proven applicability and efficacy on their own, combining them raises interesting questions of feasibility and efficacy. Our approach consists of preparing a system for acceptance testing in order to collect information on user satisfaction (ok, not ok) together with code execution log traces. Moreover, we define two metrics for suspiciousness at the level of function calls. In this paper, we demonstrate our approach and propose a quantitative evaluation.
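The abstract does not specify the two function-call-level metrics, but a minimal sketch of how a spectrum-based suspiciousness score could be computed from acceptance-test verdicts (ok, not ok) and execution traces is shown below, using the well-known Ochiai formula as a stand-in. All function names and counts here are hypothetical illustrations, not the paper's actual metrics.

```python
import math

def ochiai(failed_cover, passed_cover, total_failed):
    """Suspiciousness of a function: how concentrated it is in failing runs.

    failed_cover: number of "not ok" runs whose trace contains the function
    passed_cover: number of "ok" runs whose trace contains the function
    total_failed: total number of "not ok" runs
    """
    if total_failed == 0 or (failed_cover + passed_cover) == 0:
        return 0.0
    return failed_cover / math.sqrt(total_failed * (failed_cover + passed_cover))

# Hypothetical data: 10 acceptance tests, 3 marked "not ok" by users.
# coverage[fn] = (failing runs that called fn, passing runs that called fn)
coverage = {"parse": (3, 1), "render": (1, 6), "save": (0, 5)}

scores = {fn: ochiai(f, p, total_failed=3) for fn, (f, p) in coverage.items()}
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)  # functions ordered from most to least suspicious
```

A function called by every failing run but few passing runs (here, `parse`) rises to the top of the ranking, which is the intuition behind using user verdicts plus log traces for localization.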
Software debugging consumes most of software maintenance time and is notorious for requiring high-level skills and application-specific knowledge. Crowdsourcing software debugging could lower those barriers by having each programmer perform small, self-contained, and parallelizable tasks, hence accommodating different levels of availability and expertise. Such an approach might enable society to tackle massive software development efforts, for instance, assembling a task force of hundreds of programmers to debug and adapt existing software for use in an emergency response to a natural catastrophe. This type of effort is unimaginable today due to the high latency in mobilizing the right programmers and organizing their work. Crowdsourcing helps overcome these challenges by making a large base of contributors available to work towards a common goal. Debugging, however, is not a sequential task, which leads to the primary issue: dividing the debugging task into microtasks and asking appropriate questions, based on those microtasks, for the crowd's analysis of the software. Our paper proposes a solution that divides the main task into several microtasks by leveraging the structure of the task and then associates template questions with each microtask. This can reduce the overhead on individual developers during the debugging process and make crowd debugging a reality.
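The decomposition described above can be sketched as follows, assuming (hypothetically) that the "structure of the task" is the set of functions appearing in a failing trace, and that each microtask pairs one function with a filled-in template question. The names, template wording, and data are illustrative, not the paper's actual scheme.

```python
from dataclasses import dataclass

@dataclass
class Microtask:
    """One self-contained unit of crowd work: a function plus a question."""
    function: str
    question: str

# A hypothetical template question bound to each microtask.
TEMPLATE = "Given the failure '{bug}', could the function '{fn}' be responsible?"

def decompose(bug_report, functions_in_trace):
    """Split one debugging task into parallel microtasks, one per function.

    Each microtask can be answered in isolation, so workers with different
    availability and expertise can contribute independently.
    """
    return [
        Microtask(function=fn, question=TEMPLATE.format(bug=bug_report, fn=fn))
        for fn in functions_in_trace
    ]

tasks = decompose("report is saved empty", ["parse", "render", "save"])
for t in tasks:
    print(t.question)
```

Because the microtasks share no state, they can be distributed to hundreds of contributors at once; aggregating their answers (e.g., by majority vote per function) would then yield a crowd-sourced suspiciousness ranking.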