Information retrieval is an important problem in any evidence-based discipline. Evidence-Based Software Engineering (EBSE) is no exception, yet this question has not been examined at length. The goal of this paper is to analyse the optimality of search strategies for use in systematic reviews. We tried out 29 search strategies using different terms and combinations of terms, and evaluated their sensitivity and precision with a view to finding an optimum strategy. This study of search strategies also enabled us to analyse trends and weaknesses in terminology use in articles reporting experiments.
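The sensitivity and precision used to compare strategies can be sketched as follows; this is an illustrative computation over hypothetical article sets, not data from the study:

```python
# Sensitivity (recall) and precision of a search strategy against a
# gold standard of known relevant articles. The article identifiers
# below are assumptions for illustration only.

def sensitivity(retrieved: set, relevant: set) -> float:
    """Fraction of all relevant articles that the strategy retrieved."""
    return len(retrieved & relevant) / len(relevant)

def precision(retrieved: set, relevant: set) -> float:
    """Fraction of retrieved articles that are actually relevant."""
    return len(retrieved & relevant) / len(retrieved)

# Hypothetical example: a strategy retrieves 4 articles, 3 of which
# belong to a gold standard of 6 relevant articles.
gold = {"a1", "a2", "a3", "a4", "a5", "a6"}
hits = {"a1", "a2", "a3", "x9"}
print(sensitivity(hits, gold))  # 0.5
print(precision(hits, gold))    # 0.75
```

An optimum strategy would maximise sensitivity (missing few relevant experiments) while keeping precision high enough that screening the retrieved set remains feasible.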