Sciweavers

386 search results for "A tool for average and worst-case execution time analysis"
CF 2006 (ACM)
Intermediately executed code is the key to find refactorings that improve temporal data locality
The growing speed gap between memory and processor makes efficient use of the cache ever more important for reaching high performance. One of the most important ways to improve cac...
Kristof Beyls, Erik H. D'Hollander
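
Not from the paper itself, but as a minimal example of the kind of refactoring the title refers to: fusing two traversals of the same array lets each element be reused while it is still in cache, improving temporal locality without changing the result. The array size and function names below are invented.

#include <stddef.h>

#define N 1000000

/* Before: two separate traversals of a[]; by the time the second loop
 * runs, the early elements of a[] have typically been evicted. */
void scale_then_sum(double *a, double *out, double k)
{
    double sum = 0.0;
    for (size_t i = 0; i < N; i++)
        a[i] *= k;
    for (size_t i = 0; i < N; i++)
        sum += a[i];
    *out = sum;
}

/* After: loop fusion reuses a[i] immediately after it is written,
 * improving temporal locality while producing the same result. */
void scale_then_sum_fused(double *a, double *out, double k)
{
    double sum = 0.0;
    for (size_t i = 0; i < N; i++) {
        a[i] *= k;
        sum += a[i];
    }
    *out = sum;
}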
JCB 2006
The Average Common Substring Approach to Phylogenomic Reconstruction
We describe a novel method for efficient reconstruction of phylogenetic trees, based on sequences of whole genomes or proteomes, whose lengths may greatly vary. The core of our me...
Igor Ulitsky, David Burstein, Tamir Tuller, Benny ...
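
For orientation only (the paper's exact distance formula and its efficient suffix-tree implementation are not reproduced here): the core quantity in average-common-substring comparison is, for each starting position in one sequence, the length of the longest substring starting there that also occurs somewhere in the other sequence, averaged over all positions. A naive sketch of that average, on toy strings:

#include <stdio.h>
#include <string.h>

/* Length of the longest common prefix of a+i and any position of b (naive). */
static size_t longest_match(const char *a, size_t i, const char *b)
{
    size_t n = strlen(a), m = strlen(b), best = 0;
    for (size_t j = 0; j < m; j++) {
        size_t k = 0;
        while (i + k < n && j + k < m && a[i + k] == b[j + k])
            k++;
        if (k > best)
            best = k;
    }
    return best;
}

/* Average common substring length ACS(a, b): mean over all start positions
 * i in a of the longest substring of a starting at i that also appears in b. */
static double acs(const char *a, const char *b)
{
    size_t n = strlen(a);
    double total = 0.0;
    for (size_t i = 0; i < n; i++)
        total += (double)longest_match(a, i, b);
    return total / (double)n;
}

int main(void)
{
    const char *x = "ACGTACGTGACG";  /* toy sequences, not real genomes */
    const char *y = "ACGTGGTACGAA";
    printf("ACS(x,y) = %.3f  ACS(y,x) = %.3f\n", acs(x, y), acs(y, x));
    return 0;
}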
TC 1998
Analysis of Checkpointing Schemes with Task Duplication
This paper suggests a technique for analyzing the performance of checkpointing schemes with task duplication. We show how this technique can be used to derive the average execut...
Avi Ziv, Jehoshua Bruck
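
The paper analyzes average execution time analytically and includes task duplication, neither of which is reproduced below. As a rough way to get a feel for the quantity, this sketch Monte-Carlo-simulates a task split into equal checkpoint intervals under exponentially distributed transient faults that roll execution back to the last checkpoint; all parameters are invented.

#include <math.h>
#include <stdio.h>
#include <stdlib.h>

/* Exponentially distributed time to the next fault with rate lambda. */
static double exp_sample(double lambda)
{
    double u = (rand() + 1.0) / ((double)RAND_MAX + 2.0);  /* in (0,1) */
    return -log(u) / lambda;
}

/* Wall-clock time to finish `work` seconds of computation, checkpointing
 * every `interval` seconds at cost `ckpt_cost`; a fault during an interval
 * restarts that interval from the last checkpoint. */
static double run_once(double work, double interval, double ckpt_cost,
                       double lambda)
{
    double done = 0.0, elapsed = 0.0;
    while (done < work) {
        double chunk = fmin(interval, work - done);
        double fault = exp_sample(lambda);
        if (fault < chunk) {
            elapsed += fault;            /* progress lost, retry interval */
        } else {
            elapsed += chunk + ckpt_cost;
            done += chunk;
        }
    }
    return elapsed;
}

int main(void)
{
    const int trials = 100000;
    double sum = 0.0;
    srand(42);
    for (int t = 0; t < trials; t++)
        sum += run_once(3600.0, 300.0, 5.0, 1.0 / 7200.0);
    printf("estimated average execution time: %.1f s\n", sum / trials);
    return 0;
}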
RTAS 2010 (IEEE)
DARTS: Techniques and Tools for Predictably Fast Memory Using Integrated Data Allocation and Real-Time Task Scheduling
Hardware-managed caches introduce large amounts of timing variability, complicating real-time system design. One alternative is a memory system with scratchpad memories which im...
Sangyeol Kang, Alexander G. Dean
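
Independent of DARTS itself: a common baseline for static scratchpad data allocation is a greedy knapsack that ranks data objects by accesses per byte and places them in the scratchpad until it is full. A sketch of that baseline, with invented object names, sizes, and access counts:

#include <stdio.h>
#include <stdlib.h>

struct object {
    const char *name;
    unsigned size;      /* bytes */
    unsigned accesses;  /* profiled access count (made-up numbers) */
};

/* Sort by access density (accesses per byte), highest first. */
static int by_density(const void *pa, const void *pb)
{
    const struct object *a = pa, *b = pb;
    double da = (double)a->accesses / a->size;
    double db = (double)b->accesses / b->size;
    return (db > da) - (db < da);
}

int main(void)
{
    struct object objs[] = {
        { "sensor_buf",   512, 12000 },
        { "fir_coeffs",   128,  9000 },
        { "log_ring",    4096,  1500 },
        { "task_state",   256,  4000 },
    };
    unsigned spm_free = 2048;  /* hypothetical scratchpad size in bytes */

    qsort(objs, sizeof objs / sizeof objs[0], sizeof objs[0], by_density);
    for (size_t i = 0; i < sizeof objs / sizeof objs[0]; i++) {
        if (objs[i].size <= spm_free) {
            spm_free -= objs[i].size;
            printf("place %-12s in scratchpad (%u bytes)\n",
                   objs[i].name, objs[i].size);
        } else {
            printf("leave %-12s in main memory\n", objs[i].name);
        }
    }
    return 0;
}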
SIGMETRICS 2003 (ACM)
Data cache locking for higher program predictability
Caches have become increasingly important with the widening gap between main memory and processor speeds. However, they are a source of unpredictability due to their characteristi...
Xavier Vera, Björn Lisper, Jingling Xue
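
Generic illustration, not the lock-content selection of this paper: once cache contents are locked, accesses to them can be counted as guaranteed hits in a worst-case bound, while all other accesses are conservatively charged as misses. The sketch below locks the most frequently accessed blocks and compares the resulting bound to an all-miss bound; block names, counts, and latencies are invented, each block is assumed to fit in one lockable line, and the one-time cost of loading the locked contents is ignored.

#include <stdio.h>
#include <stdlib.h>

struct block {
    const char *name;
    unsigned accesses;   /* worst-case access count along the analyzed path */
};

static int by_accesses_desc(const void *pa, const void *pb)
{
    const struct block *a = pa, *b = pb;
    return (b->accesses > a->accesses) - (b->accesses < a->accesses);
}

int main(void)
{
    struct block blocks[] = {
        { "loop_matrix_row", 800 },
        { "lookup_table",    600 },
        { "config_struct",    50 },
        { "error_path_buf",   10 },
    };
    const size_t nblocks = sizeof blocks / sizeof blocks[0];
    const size_t lockable_lines = 2;   /* hypothetical number of lockable lines */
    const unsigned hit_cycles = 1, miss_cycles = 40;

    unsigned long bound_all_miss = 0, bound_locked = 0;
    qsort(blocks, nblocks, sizeof blocks[0], by_accesses_desc);
    for (size_t i = 0; i < nblocks; i++) {
        /* All-miss bound: every access pays the miss latency. */
        bound_all_miss += (unsigned long)blocks[i].accesses * miss_cycles;
        /* Locking the hottest blocks turns their accesses into guaranteed hits. */
        unsigned lat = (i < lockable_lines) ? hit_cycles : miss_cycles;
        bound_locked += (unsigned long)blocks[i].accesses * lat;
    }
    printf("WCET bound, no locking: %lu cycles\n", bound_all_miss);
    printf("WCET bound, locking %zu blocks: %lu cycles\n",
           lockable_lines, bound_locked);
    return 0;
}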