Model checking is becoming an accepted technique for debugging hardware and software systems. Debugging is based on the “Check / Analyze / Fix” loop: check the system against a desired property, producing a counterexample when the property fails to hold; analyze the generated counterexample to locate the source of the error; and fix the flawed artifact – either the property or the model. The success of model-checking non-trivial systems critically depends on making this Check / Analyze / Fix loop as tight as possible. In this paper, we concentrate on the Analyze part of the debugging loop. To this end, we present a framework for generating, structuring, and exploring counterexamples, either interactively or with the help of user-specified strategies.
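To make the loop concrete, the sketch below renders it as code. It is a minimal illustration only: every name in it (Counterexample, check, analyze, fix, debug) is a hypothetical placeholder, not the interface of any particular model checker or of the framework presented in this paper.

```python
# Minimal sketch of the Check / Analyze / Fix debugging loop.
# All names (Counterexample, check, analyze, fix) are hypothetical
# placeholders, not the API of any real model checker.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Counterexample:
    trace: list  # hypothetical: execution trace violating the property

def check(model, prop) -> Optional[Counterexample]:
    """Check: run the model checker; return a counterexample if the
    property fails to hold, or None if it holds (stub)."""
    return None  # stub: assume the property holds

def analyze(cex: Counterexample):
    """Analyze: explore the counterexample to locate the error (stub)."""
    return cex.trace[-1] if cex.trace else None

def fix(model, prop, cause) -> Tuple[object, object]:
    """Fix: repair the flawed artifact, the model or the property (stub)."""
    return model, prop

def debug(model, prop, max_rounds: int = 10):
    """Iterate Check / Analyze / Fix until the property holds."""
    for _ in range(max_rounds):
        cex = check(model, prop)                # Check
        if cex is None:
            return model, prop                  # property holds: done
        cause = analyze(cex)                    # Analyze
        model, prop = fix(model, prop, cause)   # Fix
    raise RuntimeError("Check/Analyze/Fix did not converge")
```

In these terms, the tightness of the loop is governed largely by how quickly the Analyze step can pinpoint the cause of a violation; that step is precisely what the framework presented here targets.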