Noises in Interaction Traces Data and their Impact on Previous Research Studies. Soh, Z., Drioul, T., Rappe, P., Khomh, F., Guéhéneuc, Y., & Habra, N. In Carver, J. & Dieste, O., editors, Proceedings of the 9th International Symposium on Empirical Software Engineering and Measurement (ESEM), pages 1–10, October 2015. IEEE CS Press.
Context: Developers' interaction traces (ITs) are commonly used in software engineering to understand how developers maintain and evolve software systems. Researchers make several assumptions when mining ITs, e.g., edit events are considered to be change activities and the time mined from ITs is considered to be the time spent by the developers performing the maintenance task. Goal: We investigate the extent to which these assumptions are correct. We examine noises in developers' ITs data and the impact of these noises on previous results derived from these traces. Approach: We perform an experiment with 15 participants, whom we asked to perform bug-fixing activities, and collect Mylyn ITs and VLC video captures. We then investigate noises between the two data sets and propose an approach to correct noises in ITs. Results: We find that Mylyn ITs can miss on average about 6% of the time spent performing a task and contain on average about 28% of false edit-events. We report that these noises may have led researchers to mislabel some participants' editing styles in about 34% of the cases and that the numbers of edit-events performed by developers and the times that they spent on tasks are correlated, when they were considered not to be. Conclusion: We show that ITs must be carefully cleaned before being used in research studies.
