
Results of a Comparative Study of Code Coverage Tools in Computer Vision

Publication from DIGITAL

I. Nica, G. Jakob, K. Juhart and F. Wotawa

2017 IEEE International Conference on Software Testing, Verification and Validation Workshops (ICSTW), 1/2017

Abstract:

The high quality of computer vision (CV) software has a great impact on the usability of the overall CV systems in real-world scenarios. As the use of standardized quality assurance methods, metrics and tools can ease the work of any CV developer and quickly improve the overall process, we report here on the introduction of code coverage analysis in the field. Hence, one of our goals was to identify a coverage-based testing tool capable of quickly finding deficiencies in the available test suites. The highly varying results reported by different tools for the example application made us wonder what the reasons for this variation might be. Another important question for industrial testers is which of the computed values better reflects the real quality of the code. Answering these questions requires an in-depth analysis of both the tools and the chosen software. In this extended abstract, due to space limitations, we inspect four code coverage tools only with respect to their dissenting coverage results and the correlations between them.
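To illustrate what a coverage-based testing tool measures and why different tools can legitimately report different numbers, the following minimal C++ sketch (not taken from the paper; the function, file name and build steps are illustrative assumptions, using GCC's gcov as one example tool) shows a test suite that leaves one branch unexercised, so line coverage and branch coverage already diverge depending on how a tool counts partially covered code:

```cpp
// Minimal sketch (illustrative, not from the paper): a toy function whose
// "test suite" exercises only two of three branches, so different coverage
// metrics -- and tools that weight them differently -- report different values.
//
// Example build/run steps with GCC's gcov (assumed toolchain):
//   g++ --coverage -O0 coverage_demo.cpp -o coverage_demo
//   ./coverage_demo
//   gcov coverage_demo.cpp
#include <cassert>
#include <cstdlib>

// Clamps a pixel intensity to the valid 8-bit range.
int clamp_intensity(int value) {
    if (value < 0) {
        return 0;        // never reached by the tests below
    }
    if (value > 255) {
        return 255;
    }
    return value;
}

int main() {
    // A deliberately incomplete test suite: the negative-value branch stays
    // uncovered, so line coverage is below 100% and branch coverage is lower
    // still, depending on how a given tool counts partial branch outcomes.
    assert(clamp_intensity(300) == 255);
    assert(clamp_intensity(42) == 42);
    return EXIT_SUCCESS;
}
```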