In this paper, the problem of scalable video delivery over a time-varying wireless channel is considered. Packet scheduling and buffer management at both the Application and Medium Access Control (MAC) layers are addressed jointly, under various levels of knowledge of the channel state. Control is performed via scalable layer filtering (some scalability layers may be dropped). In all cases, the problem is cast in the framework of Markov Decision Processes, which allows the design of foresighted policies maximizing a long-term reward. Without channel state observation, the controller has to rely solely on the observed MAC buffer level. Experimental results show that, even without knowledge of the channel state, the foresighted control policy incurs only a moderate loss in received video quality.
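To make the idea of a foresighted layer-filtering policy concrete, the sketch below runs value iteration on a toy Markov Decision Process whose states are MAC buffer occupancy levels and whose actions are the number of scalability layers kept. The buffer size, transition probabilities, and reward values are invented for illustration and are not the model or parameters used in the paper.

```python
import numpy as np

# Hypothetical illustration: value iteration for a foresighted layer-filtering
# policy. States = MAC buffer levels, actions = number of scalability layers
# kept. All numbers below are toy values, not taken from the paper.

B = 10            # buffer capacity (packets) -- assumed
A = 4             # actions: keep 1..4 scalability layers -- assumed
gamma = 0.9       # discount factor defining the long-term reward

rng = np.random.default_rng(0)

# P[a, s, s']: probability of moving from buffer level s to s' when keeping
# (a + 1) layers; drawn at random and normalized, purely for illustration.
P = rng.random((A, B + 1, B + 1))
P /= P.sum(axis=2, keepdims=True)

# R[a, s]: immediate reward, e.g. video quality from the kept layers minus a
# penalty growing with buffer occupancy (toy overflow-risk model).
quality = np.array([1.0, 1.8, 2.4, 2.8])            # more layers -> higher quality
overflow_penalty = np.linspace(0.0, 3.0, B + 1)     # fuller buffer -> higher risk
R = quality[:, None] - overflow_penalty[None, :] * np.linspace(0.2, 1.0, A)[:, None]

# Value iteration: iterate the Bellman optimality operator until convergence.
V = np.zeros(B + 1)
for _ in range(500):
    Q = R + gamma * P @ V          # Q[a, s] = R[a, s] + gamma * sum_s' P[a, s, s'] V[s']
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=0) + 1      # layers to keep at each buffer level
print("Foresighted policy (layers kept per buffer level):", policy)
```

The greedy policy extracted from the converged value function trades off immediate video quality against future buffer overflow, which is the kind of long-term optimization the foresighted policies in the paper target; the paper's actual formulation, including the case with only partial (buffer-level) observation, differs from this fully observed toy example.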