Intraoperative assistance systems aim to improve the quality of surgery and to enhance the surgeon's capabilities. Ideally, such a system would provide support adapted to the surgical context and to the skills being performed, which requires the automated analysis and recognition of surgical skills during an intervention. In this paper, a robust method for tracking instruments in minimally invasive surgery based on endoscopic image sequences is presented. The instruments were not modified, and the tracking was evaluated on sequences acquired during a real intervention. The resulting instrument trajectories provide information that can be further used for surgical gesture interpretation.