Multiplexing is a common technique for encoding high-dimensional image data into a single, two-dimensional image. Examples of spatial multiplexing include Bayer patterns to capture color channels, and integral images to encode light fields. In the Fourier domain, optical heterodyning has been used to acquire light fields. In this paper, we develop a general theory of multiplexing the dimensions of the plenoptic function onto an image sensor. Our theory enables a principled comparison of plenoptic multiplexing schemes, including noise analysis, as well as the development of a generic reconstruction algorithm. The framework also aids in the identification and optimization of novel multiplexed imaging applications.
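To make the multiplexing idea concrete, the following is a minimal sketch, not the paper's algorithm: it models plenoptic multiplexing as a linear system b = A x, where x stacks the latent plenoptic samples over a small pixel neighborhood (here, color channels under a Bayer-like 2x2 pattern, a hypothetical example) and A encodes the sensor's multiplexing pattern; reconstruction is plain least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: 3 color channels multiplexed over a 2x2 super-pixel.
# Each row of A is one sensor pixel's spectral sensitivity (R, G, G, B layout).
A = np.array([
    [1.0, 0.0, 0.0],   # red-filtered pixel
    [0.0, 1.0, 0.0],   # green-filtered pixel
    [0.0, 1.0, 0.0],   # green-filtered pixel
    [0.0, 0.0, 1.0],   # blue-filtered pixel
])

x_true = np.array([0.8, 0.5, 0.2])                  # latent plenoptic samples (RGB)
b = A @ x_true + 0.01 * rng.standard_normal(4)      # noisy multiplexed measurements

# Generic linear demultiplexing via least squares (pseudoinverse).
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print("recovered RGB:", x_hat)
```

Under this assumed linear model, comparing multiplexing schemes amounts to comparing the conditioning of their matrices A, which is one way noise amplification during reconstruction can be quantified.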