Abstract. Gradients are distributed distance estimates used as a building block in many sensor network applications. In large or long-lived deployments, it is important for the estimate to self-stabilize in response to changes in the network or ongoing computations, but existing algorithms may repair very slowly, produce distorted estimates, or suffer large transients. The CRF-Gradient algorithm [1] addresses these shortcomings, and in this paper we prove that it self-stabilizes in O(diameter) time—more specifically, in 4·diameter/c + k seconds, where k is a small constant and c is the minimum speed of multi-hop message propagation.

1 Context

A common building block for distributed computing systems is a gradient—a biologically inspired operation in which each device estimates its distance to the closest device designated as a source of the gradient (Figure 1). Gradients are commonly used in systems with multi-hop wireless communication, where the network diameter is likely to...
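
To make the building block concrete, the following Python sketch shows the generic gradient computation: synchronous rounds in which each device takes the minimum over its neighbors of the neighbor's estimate plus the distance to that neighbor, with sources pinned at distance 0. The function and variable names and the synchronous-round model are illustrative assumptions; this is not the CRF-Gradient algorithm itself, whose self-healing update rule is given in [1].

    # Sketch of the generic gradient operation (illustrative, not CRF-Gradient).
    def gradient(neighbors, distances, sources, rounds):
        """neighbors: dict node -> list of neighboring nodes
        distances: dict (node, neighbor) -> edge length
        sources:   set of nodes designated as gradient sources
        rounds:    number of synchronous update rounds to run"""
        INF = float("inf")
        # Sources start at 0; every other device starts with no estimate.
        est = {v: (0.0 if v in sources else INF) for v in neighbors}
        for _ in range(rounds):
            new = {}
            for v in neighbors:
                if v in sources:
                    new[v] = 0.0  # a source is always at distance 0
                else:
                    # Triangle-inequality relaxation over current neighbors.
                    new[v] = min((est[u] + distances[(v, u)] for u in neighbors[v]),
                                 default=INF)
            est = new
        return est

    # Example: a 4-node line network with unit-length edges, source at "a".
    nbrs = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
    dist = {(v, u): 1.0 for v in nbrs for u in nbrs[v]}
    print(gradient(nbrs, dist, sources={"a"}, rounds=4))
    # -> {'a': 0.0, 'b': 1.0, 'c': 2.0, 'd': 3.0}

In a static network this relaxation converges after a number of rounds proportional to the network diameter; the self-stabilization question addressed in this paper is how quickly the estimate recovers when the network or the source set changes after convergence.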