Modeling the operational rate-distortion characteristics of a signal can significantly reduce the computational complexity of an optimal bit-allocation algorithm: once the R-D behavior of each signal component is captured by a parametric model, the allocation can be computed in closed form or by a fast search instead of by exhaustively evaluating every operating point. This report studies such models.
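As a minimal illustration of this idea (not the specific models studied in the report), the sketch below assumes the classical high-rate exponential model D_i(R_i) = sigma_i^2 * 2^(-2 R_i) for each component, under which the rate-constrained allocation has a well-known closed form; the function name and example variances are hypothetical.

```python
import numpy as np

def model_based_allocation(variances, avg_rate):
    """Closed-form bit allocation under the assumed high-rate model
    D_i(R_i) = sigma_i^2 * 2^(-2 R_i).

    Returns per-component rates whose sum equals
    len(variances) * avg_rate. Rates can come out negative for
    very low-variance components; a practical coder would clip
    those to zero and re-allocate.
    """
    variances = np.asarray(variances, dtype=float)
    # log2 of the geometric mean of the variances
    log_geo_mean = np.mean(np.log2(variances))
    # Each component gets the average rate plus a correction
    # proportional to how its variance deviates from the geometric mean.
    return avg_rate + 0.5 * (np.log2(variances) - log_geo_mean)

# Example: allocate an average of 2 bits/sample across four subbands.
sigma2 = [16.0, 4.0, 1.0, 0.25]
rates = model_based_allocation(sigma2, avg_rate=2.0)
print(rates)        # [3.5, 2.5, 1.5, 0.5]
print(rates.sum())  # 8.0 = 4 components * 2 bits
```

The point of the sketch is the complexity contrast: with a model in hand the allocation costs O(N) arithmetic, whereas an operational (model-free) optimal allocation must search over the measured R-D points of every component.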