Abstract. Our participation in ResPubliQA 2010 was based on applying a high-performance Information Retrieval (IR) engine and a validation step for removing incorrect answers. The IR engine received additional information from the analysis of questions, which produced a slight improvement in results. However, the validation module sometimes discarded too many correct answers, which reduced overall performance. These errors were due to the application of overly strict constraints. Therefore, future work must focus on reducing the number of false negatives returned by the validation module. On the other hand, we observed that the IR ranking provides important information for selecting the final answer, but better results could be obtained if additional sources of information were also considered.