A simple multivariate version of Costa's entropy power inequality is proved. In particular, it is shown that if independent white Gaussian noise is added to an arbitrary multivariate signal, the entropy power of the resulting random variable is a jointly concave function of the individual variances of the signal components. As a side result, we also give an expression for the Hessian matrix of the entropy and entropy power functions with respect to these variances, which is of independent interest.
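As a rough sketch of the statement (the parameterization below, in which the variances enter as a diagonal scaling of the signal, and the choice of notation are assumptions made here for illustration): let $x$ be an arbitrary random vector in $\mathbb{R}^n$, let $z \sim \mathcal{N}(0, I_n)$ be white Gaussian noise independent of $x$, and let $\Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$ collect the variances of the signal components. The claimed concavity is that the entropy power
\begin{equation*}
N\bigl(\Lambda^{1/2} x + z\bigr) \;=\; \frac{1}{2\pi e}\,
\exp\!\Bigl(\tfrac{2}{n}\, h\bigl(\Lambda^{1/2} x + z\bigr)\Bigr)
\end{equation*}
is concave in $(\lambda_1, \dots, \lambda_n)$ over $\lambda_i > 0$, where $h(\cdot)$ denotes differential entropy. Under this reading, the case of a single common variance parallels Costa's original inequality, which states that $N(x + \sqrt{t}\, z)$ is concave in the scalar noise variance $t$, up to a change of variables based on the scaling identity $N(a\,y) = a^2 N(y)$.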