We present a range of testing techniques for the Abstract Behavioral Specification (ABS) language and apply them to an industrial case study. ABS is a formal modeling language for highly variable, concurrent, component-based systems. The nature of these systems makes them susceptible to subtle bugs that are hard to detect as the systems are continually adapted. While static analysis techniques are available for an abstract language such as ABS, testing remains indispensable and complements analytic methods. We focus on fully automated testing techniques, including black-box and glass-box test generation as well as runtime assertion checking, which are shown to be effective in an industrial setting.
Additional Metadata
Keywords Automated testing, Industrial case study, Black-box testing, Glass-box testing, Runtime assertion checking
ACM Testing and Debugging (acm D.2.5), Specifying and Verifying and Reasoning about Programs (acm F.3.1)
THEME Software (theme 1)
Publisher Springer
Stakeholder Unspecified
Persistent URL dx.doi.org/10.1007/s10009-014-0301-x
Journal International Journal on Software Tools for Technology Transfer
Project Highly Adaptable and Trustworthy Software using Formal Methods
Citation
Wong, P. Y. H., Bubel, R., de Boer, F. S., de Gouw, C. P. T., Gómez-Zamalloa, M., Haehnle, R., … Sindhu, M. A. (2015). Testing abstract behavioral specifications. International Journal on Software Tools for Technology Transfer, 17(1), 107–119. doi:10.1007/s10009-014-0301-x