A distributed adaptive algorithm for estimating a time-varying signal measured by a wireless sensor network is designed and analyzed, taking into account both measurement noise and packet losses. Each node of the network locally computes adaptive weights that minimize the estimation error variance. Decentralized conditions on the weights that ensure the stability of the estimates throughout the network are also derived. A theoretical performance analysis of the scheme is carried out for both perfect and lossy links, and numerical simulations illustrate the performance for various network topologies and packet loss probabilities.
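As a rough illustration of the kind of scheme the abstract describes, the sketch below simulates a ring network of nodes that track a slowly varying scalar signal from noisy measurements, fusing each node's own measurement with whichever neighbor estimates survive random packet drops. The topology, signal, inverse-variance weighting rule, and all parameter values are illustrative assumptions, not the paper's actual algorithm; the row-stochastic normalization of the weights stands in for the decentralized stability conditions mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the paper): n nodes on a ring track the
# slowly varying signal d(t) = sin(0.01 t) from noisy measurements.
n, T, p_loss = 10, 500, 0.2
sigma2 = rng.uniform(0.5, 2.0, size=n)   # per-node measurement noise variance
nbrs = [[(i - 1) % n, (i + 1) % n] for i in range(n)]

x = np.zeros(n)                          # current local estimates
err_est = err_meas = 0.0
for t in range(T):
    d = np.sin(0.01 * t)
    y = d + rng.normal(0.0, np.sqrt(sigma2))   # noisy local measurements
    new_x = np.empty(n)
    for i in range(n):
        # Inverse-variance weights over the node's own measurement and the
        # neighbor estimates that actually arrive (each packet is lost with
        # probability p_loss). Normalizing the weights keeps the update
        # row-stochastic, a standard decentralized sufficient condition
        # for keeping the estimates stable.
        vals = [y[i]]
        wts = [1.0 / sigma2[i]]
        for j in nbrs[i]:
            if rng.random() > p_loss:    # packet from neighbor j received
                vals.append(x[j])
                wts.append(1.0 / sigma2[j])
        wts = np.asarray(wts)
        new_x[i] = wts @ np.asarray(vals) / wts.sum()
    x = new_x
    if t >= 100:                         # accumulate errors after transient
        err_est += np.mean(np.abs(x - d))
        err_meas += np.mean(np.abs(y - d))

# Fusing neighbor information should reduce the error variance relative
# to using the raw measurements alone.
print(err_est / (T - 100), err_meas / (T - 100))
```

Even under 20% packet loss, the fused estimates track the signal with a noticeably smaller average error than the raw measurements, which is the qualitative behavior the performance analysis quantifies.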