Self-organization is a concept from nature that enables complex systems to adapt autonomously to their environment. In this paper, we present a self-organizing framework for multi-cue fusion in embedded imaging: several simple image filters are combined to achieve more robust system behavior. Human motion tracking serves as a showcase; the system adapts to changes in the environment while tracking a person. In addition, system customization is simplified: the designer only has to select a desired set of image filters for a given task, and the system then finds the appropriate parameters, e.g., the weighting of the different cues. With the option of partial reconfiguration, FPGAs support this type of customization. An FPGA-based prototype implementation demonstrates the feasibility of the approach. Tracking and adaptation run in real time at 25 FPS and a resolution of 640×480.
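The core idea of adaptively weighting several cues can be illustrated with a minimal software sketch. The following is an assumption-laden Python illustration, not the paper's FPGA implementation: the function names (`fuse_cues`, `update_weights`), the exponential-averaging update rule, and the quality scores are all hypothetical, chosen only to show how per-cue confidence maps might be combined and how the weighting could adapt online.

```python
import numpy as np

def fuse_cues(cue_maps, weights):
    # Hypothetical fusion step: weighted sum of per-pixel cue
    # confidence maps, with the weights normalized to sum to 1.
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * m for wi, m in zip(w, cue_maps))

def update_weights(weights, qualities, lr=0.2):
    # Hypothetical self-organizing update: shift weight toward
    # cues whose recent quality scores are higher. `lr` controls
    # how quickly the system adapts to environmental changes.
    w = np.asarray(weights, dtype=float)
    q = np.asarray(qualities, dtype=float)
    w = (1.0 - lr) * w + lr * q / q.sum()
    return w / w.sum()

# Example: two cue maps (e.g., color and motion) over a 2x2 frame.
color_cue = np.ones((2, 2))
motion_cue = np.zeros((2, 2))
fused = fuse_cues([color_cue, motion_cue], [3.0, 1.0])

# If the motion cue degrades (e.g., the person stops moving),
# its weight decays over subsequent frames.
new_w = update_weights([0.5, 0.5], qualities=[0.9, 0.1])
```

In this sketch the weights play the role the abstract assigns to the automatically found parameters: the designer only selects the cue set, and the weighting emerges from the quality feedback.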