Lazy Spilling for a Time-Predictable Stack Cache: Implementation and Analysis


Affiliations: 1) DTU Compute, Department of Applied Mathematics and Computer Science, Lyngby; 2) ENSTA ParisTech, U2IS, Unité d'Informatique et d'Ingénierie des Systèmes

Abstract: The growing complexity of modern computer architectures increasingly complicates the prediction of the run-time behavior of software. For real-time systems, where a safe estimation of a program's worst-case execution time is needed, time-predictable computer architectures promise to resolve this problem. A stack cache, for instance, allows the compiler to efficiently cache a program's stack while keeping static analysis of its behavior easy. Likewise, its implementation requires little hardware overhead. This work introduces an optimization of the standard stack cache that avoids redundant spilling of cache content to main memory when that content has not been modified in the meantime. At first sight, this appears to be an average-case optimization. Indeed, measurements show that the number of cache blocks spilled is reduced to between about 17% and 30% on average, depending on the stack cache size. Furthermore, we show that lazy spilling can be analyzed with little extra effort, which benefits the worst-case spilling behavior that is relevant for a real-time system.
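The core idea of the abstract, skipping write-backs of stack blocks whose main-memory copy is still valid, can be illustrated with a coarse software model. The sketch below is a simplifying illustration, not the paper's actual hardware design: it tracks a single counter of "coherent" blocks instead of the cache's real state, models blocks rather than bytes, and pessimistically dirties everything on any store. The operation names mirror the stack cache control operations (reserve, free, ensure); all class and variable names are invented for this example.

```python
class LazyStackCache:
    """Coarse model of a stack cache with lazy spilling.

    A sketch under simplifying assumptions: 'coherent' counts the oldest
    cached blocks whose main-memory copy is known to be up to date, so
    evicting them needs no write-back.
    """

    def __init__(self, size_blocks):
        self.size = size_blocks   # cache capacity in blocks
        self.occupied = 0         # blocks currently held in the cache
        self.coherent = 0         # oldest blocks with a valid memory copy
        self.lazy_spills = 0      # blocks actually written back
        self.eager_spills = 0     # blocks a standard stack cache would write

    def reserve(self, n):
        """Reserve n blocks; evict the oldest blocks on overflow."""
        assert n <= self.size
        overflow = max(0, self.occupied + n - self.size)
        if overflow:
            self.eager_spills += overflow
            # Lazy spilling: evicted blocks that are still coherent with
            # main memory are skipped; only dirty blocks are written back.
            self.lazy_spills += max(0, overflow - self.coherent)
            self.coherent = max(0, self.coherent - overflow)
            self.occupied -= overflow
        self.occupied += n

    def modify(self):
        """A store to cached stack data invalidates the memory copies
        (coarsest possible model: everything becomes dirty)."""
        self.coherent = 0

    def free(self, n):
        """Release n blocks; no memory traffic is needed."""
        self.occupied = max(0, self.occupied - n)
        self.coherent = min(self.coherent, self.occupied)

    def ensure(self, n):
        """Refill until at least n blocks are cached; refilled blocks
        are coherent by construction (they just came from memory)."""
        if self.occupied < n:
            self.coherent += n - self.occupied
            self.occupied = n


# Usage: a reserve/free/ensure sequence where refilled (coherent) blocks
# are evicted again without modification, so lazy spilling saves traffic.
sc = LazyStackCache(4)
sc.reserve(4)   # fills the cache, no spill
sc.reserve(2)   # evicts 2 dirty blocks: both written back
sc.free(2)
sc.ensure(4)    # refills 2 blocks, now coherent
sc.reserve(2)   # evicts 2 coherent blocks: no write-back needed
print(sc.lazy_spills, sc.eager_spills)  # → 2 4
```

In this toy trace the lazy scheme writes back only 2 blocks where an eager stack cache writes back 4, mirroring the abstract's point that coherent content need not be spilled twice; the worst case (all evicted blocks dirty) degenerates to the standard behavior, which is why the analysis overhead stays small.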

Keywords: Lazy Spilling; Stack Cache; Real-Time Systems; Program Analysis

Authors: Sahar Abbaspour, Alexander Jordan, Florian Brandner


