In this paper, we propose a new framework for the computational learning of formal grammars from positive data. In this model, both syntactic and semantic information are taken into account, which seems cognitively relevant for modeling natural language learning. The syntactic formalism is that of Lambek categorial grammars, and meaning is represented by logical formulas. The principle of compositionality is assumed and defined as an isomorphism on trees, which allows sentences to be translated automatically into their semantic representation(s). Simple simulations of a learning algorithm are developed and discussed in detail.
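
As a minimal illustration of this compositional mechanism (the lexical entries below are hypothetical and chosen only for exposition), assume a lexicon assigning John the type $np$ with meaning $john'$ and runs the type $np\backslash s$ with meaning $\lambda x.\,\mathit{run}(x)$. The syntactic derivation and the semantic translation then proceed in parallel:

\[
\frac{John : np \qquad runs : np\backslash s}{John\ runs : s}\ [\backslash_e]
\qquad\rightsquigarrow\qquad
(\lambda x.\,\mathit{run}(x))(john') = \mathit{run}(john').
\]

At each node of the derivation tree, the meaning of the functor type is applied to the meaning of its argument, mirroring the tree isomorphism mentioned above.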