Abstract--Discrete-input two-dimensional (2-D) Gaussian channels with memory represent an important class of systems, which appears extensively in communications and storage. Despite their widespread use, the properties of 2-D channels remain largely unexplored. In this work, we study these channels from the perspectives of estimation theory and information theory. At the heart of our approach is the mapping of a 2-D channel onto an undirected graphical model and the inference of its a posteriori probabilities (APPs) using generalized belief propagation (GBP). The derived probabilities are shown to be accurate in practice, thus enabling optimal maximum a posteriori (MAP) estimation of the transmitted symbols. In addition, the Shannon-theoretic information rates are deduced either via the vector-wise Shannon