We consider a widely applicable model of resource allocation in which two sequences of events are coupled: on a continuous time axis (t), the network dynamics evolve over time, while on a discrete time axis [t], control laws update the resource allocation variables according to a proposed algorithm. The algorithmic updates, together with exogenous events outside the algorithm's control, change the network dynamics, which in turn alter the trajectory of the algorithm, forming a loop that couples the two sequences of events. Between the algorithmic updates at [t − 1] and [t], the network dynamics continue to evolve randomly, influenced by the variable settings chosen at time [t − 1]. The standard way to avoid the resulting analytic difficulty is to assume a separation of timescales, which in turn unrealistically requires either slow network dynamics or high-complexity algorithms. In this paper, we develop an approach that does not require a separation of timescales.
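As a minimal sketch of this coupling (the notation here is illustrative and is not the paper's formal model), let $x(t)$ denote the network state on the continuous axis, let $z[t]$ denote the allocation variables set at the discrete update instants, let $\omega(t)$ capture the exogenous randomness, and let $\mathcal{A}$ be the algorithmic update rule:
\[
\dot{x}(t) = f\bigl(x(t),\, z[t-1],\, \omega(t)\bigr) \ \text{ between the updates at } [t-1] \text{ and } [t],
\qquad
z[t] = \mathcal{A}\bigl(z[t-1],\, x(t)\bigr).
\]
The state trajectory between updates thus depends on the previous setting $z[t-1]$, and the next update $z[t]$ depends on the state that trajectory produces, which is exactly the loop described above; a timescale-separation assumption would let one replace $x(t)$ in the update rule by its stationary value under $z[t-1]$, whereas our approach avoids this step.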