As the software engineering research community continues to propose novel approaches to automated test case generation for REST APIs, researchers face the labor-intensive task of empirically validating their methodologies and comparing them with the state-of-the-art. This process requires assembling a benchmark of case studies (notoriously difficult to find in the context of REST API testing), building and running each API, gathering competitor tools, conducting experimental testing sessions to collect effectiveness and efficiency metrics, and processing the results. These extensive engineering efforts consume time that could be otherwise spent on more research-oriented tasks.
This paper introduces RESTgym, a flexible empirical infrastructure designed to assess the performance of REST API testing tools and facilitate comparative analysis with state-of-the-art approaches. By providing a standardized comparison environment (currently consisting of 11 benchmark APIs and 6 state-of-the-art tools packaged as containers, and easily extensible with new APIs and tools) together with an orchestration engine, RESTgym significantly reduces the time and effort required for researchers to evaluate REST API testing methodologies. The paper details the architecture and components of RESTgym and demonstrates its utility through a practical example, highlighting its potential to speed up research and development in automated REST API testing. Video: http://tiny.cc/restgym-video