ARIA (ARchitecture for Interactive Arts) is a middleware for processing, filtering, and fusing sensory inputs and actuating responses in real time. An ARIA media processing workflow describes how the data sensed through media will be processed and which audio-visual responses will be actuated. Each object streamed between ARIA processing components undergoes transformations described by a media workflow graph. The media capture and processing components, such as media filters and fusion operators, are programmable and adaptable; i.e., the delay, size, frequency, and quality/precision characteristics of individual operators can be controlled through a number of parameters. In this paper, we present the underlying model that captures the dynamic nature of ARIA media processing workflows. This model enables design-time verification, optimization, and runtime adaptation.
K. Selçuk Candan, Gisik Kwon, Lina Peng, Ma
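To make the notion of programmable, parameterized operators more concrete, the following minimal Python sketch models a media-flow chain whose operators expose delay, size, frequency, and quality knobs that a runtime adapter could retune. The class and parameter names (MediaOperator, WorkflowGraph, and the specific knobs) are illustrative assumptions, not ARIA's actual interfaces.

```python
# A minimal, hypothetical sketch (not ARIA's actual API) of a media-flow
# graph whose operators expose the tunable characteristics named in the
# abstract: delay, output size, sampling frequency, and quality/precision.

from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class MediaOperator:
    """A programmable processing step (e.g., a filter or fusion operator)."""
    name: str
    transform: Callable[[List[float]], List[float]]
    # Adaptable characteristics; a runtime adapter could retune these.
    params: Dict[str, float] = field(default_factory=lambda: {
        "delay_ms": 10.0,      # processing latency budget
        "output_size": 64,     # size of the emitted object
        "frequency_hz": 30.0,  # how often the operator fires
        "quality": 1.0,        # quality/precision knob in [0, 1]
    })

    def apply(self, data: List[float]) -> List[float]:
        # Truncate to the configured output size to mimic a size/quality trade-off.
        return self.transform(data)[: int(self.params["output_size"])]


@dataclass
class WorkflowGraph:
    """A linear chain of operators; a real media workflow graph would be a DAG."""
    operators: List[MediaOperator] = field(default_factory=list)

    def run(self, sensed: List[float]) -> List[float]:
        for op in self.operators:
            sensed = op.apply(sensed)
        return sensed


if __name__ == "__main__":
    smooth = MediaOperator("smooth", lambda xs: [round(x, 2) for x in xs])
    downsample = MediaOperator("downsample", lambda xs: xs[::2])
    downsample.params["output_size"] = 8  # runtime adaptation of one operator
    graph = WorkflowGraph([smooth, downsample])
    print(graph.run([0.123 * i for i in range(32)]))
```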