This paper addresses the problem of remote browsing of 3D scenes. Texture and geometry information are both available at the server side in the form of scalably compressed images and depth maps. We propose a framework for dynamically allocating the available transmission resources between geometry and texture, accounting both for the transmission of new images and for the refinement of those already transmitted. We also introduce a novel strategy for distortion-sensitive synthesis of both geometry and rendered imagery at the client side.