Abstract— In time-varying packet-switched networks, delivering data with high reliability while using a limited amount of network resources is highly desirable. Capturing the trade-off between the experienced application quality and the resource usage requires good adaptation algorithms. In this paper, we investigate a simple feedback error-correction scheme that aims to optimize the amount of redundancy without relying on an accurate model of the loss processes of the network. This extremum-seeking controller is shown to converge to a limit cycle in a neighborhood of the optimum. These results are also evaluated using packet-based simulation with data from real wireless sensor experiments.
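To illustrate the extremum-seeking idea mentioned above, the following is a minimal sketch of a classical sinusoidal-perturbation extremum-seeking loop on a static cost map. The cost function, its optimum, and all gains (`a`, `k`, `omega`) are illustrative assumptions, not values from the paper; the paper's controller operates on packet-loss feedback rather than this hypothetical quadratic cost.

```python
import math

def cost(r):
    # Hypothetical quality/resource trade-off: too little redundancy causes
    # losses, too much wastes bandwidth; the assumed optimum is r = 0.3.
    return (r - 0.3) ** 2

def extremum_seek(steps=20000, r0=0.5, a=0.05, k=0.05, omega=0.5):
    """Sinusoidal-perturbation extremum seeking (model-free gradient descent)."""
    r_hat = r0
    for n in range(steps):
        d = a * math.sin(omega * n)          # dither added to the input
        y = cost(r_hat + d)                  # measured cost at perturbed input
        grad_est = y * math.sin(omega * n)   # demodulate: correlates cost with dither
        r_hat -= k * grad_est                # descend the estimated gradient
    return r_hat

print(extremum_seek())  # settles in a small limit cycle around the optimum
```

Because the gradient estimate never vanishes exactly (the dither persists), the iterate oscillates in a neighborhood of the optimum rather than converging to a point, which mirrors the limit-cycle behavior described in the abstract.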