
Electronic data

  • 2303.14256 — Other version, 420 KB, PDF document. Available under license: None.

Links

Text available via DOI:


Automated Identification of Performance Changes at Code Level

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper (peer-reviewed)

Published
Publication date: 20/03/2023
Host publication: Proceedings - 2022 IEEE 22nd International Conference on Software Quality, Reliability and Security, QRS 2022
Publisher: IEEE
Pages: 916-925
Number of pages: 10
ISBN (electronic): 9781665477048
Original language: English

Publication series

Name: IEEE International Conference on Software Quality, Reliability and Security, QRS
Volume: 2022-December
ISSN (Print): 2693-9177

Abstract

To develop software with optimal performance, even small performance changes need to be identified. Identifying performance changes is challenging since the performance of software is influenced by non-deterministic factors; therefore, not every performance change is measurable with reasonable effort. In this work, we discuss which performance changes are measurable at code level with reasonable measurement effort and how to identify them. We present (1) an analysis of the boundaries of measuring performance changes, (2) an approach for determining a configuration for reproducible performance change identification, and (3) an evaluation of how well our approach identifies performance changes in the application server Jetty compared with Jetty's own performance regression benchmarks. Thereby, we find (1) that small performance differences are only measurable by fine-grained measurement workloads, (2) that performance changes caused by the change of one operation can be identified using a unit-test-sized workload definition and a suitable configuration, and (3) that our approach identifies small performance regressions more efficiently than Jetty's performance regression benchmarks.
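The core decision the abstract describes — judging whether repeated, noisy timing measurements of an operation before and after a code change indicate a real performance change — can be illustrated with a minimal sketch. This is not the paper's tool or configuration; the class name, the Welch's t-statistic decision rule, the critical value, and the synthetic nanosecond timings below are all illustrative assumptions:

```java
/**
 * Hedged sketch: flag a performance change when two measurement
 * series (per-call durations of one operation, before vs. after a
 * code change) differ significantly. Uses Welch's t-statistic; the
 * critical value and sample data are illustrative only.
 */
public class ChangeDetector {

    static double mean(double[] xs) {
        double sum = 0;
        for (double x : xs) sum += x;
        return sum / xs.length;
    }

    /** Sample variance (n - 1 denominator). */
    static double variance(double[] xs) {
        double m = mean(xs), sum = 0;
        for (double x : xs) sum += (x - m) * (x - m);
        return sum / (xs.length - 1);
    }

    /** Welch's t-statistic for two independent samples. */
    static double welchT(double[] a, double[] b) {
        double se = Math.sqrt(variance(a) / a.length + variance(b) / b.length);
        return (mean(a) - mean(b)) / se;
    }

    /** Report a change if |t| exceeds a chosen critical value. */
    static boolean changed(double[] before, double[] after, double tCritical) {
        return Math.abs(welchT(before, after)) > tCritical;
    }

    public static void main(String[] args) {
        // Synthetic per-call durations in nanoseconds (illustrative):
        // a small but consistent slowdown of a single operation.
        double[] before = {100, 102, 98, 101, 99, 100, 103, 97};
        double[] after  = {105, 107, 104, 106, 105, 108, 104, 106};
        System.out.println(changed(before, after, 2.0)); // prints "true"
    }
}
```

In this spirit, a fine-grained, unit-test-sized workload keeps the measured operation small and repeatable, so the measurement noise stays low enough for a small mean shift (here about 5%) to clear the significance threshold.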