We consider the problem of detecting a subspace signal in white Gaussian noise when the noise power may differ between the null hypothesis, where it is assumed known, and the alternative hypothesis. This situation occurs when the presence of the signal of interest (SOI) triggers an increase in the noise power. In particular, it may be relevant in the case of a mismatch between the actual SOI subspace and its presumed value, i.e., a modelling error, since the unmodelled part of the signal then acts as additional noise under the alternative. We derive the generalized likelihood ratio test (GLRT) for the problem at hand and contrast it with the GLRT that assumes a known, equal noise power under both hypotheses. A performance analysis is carried out, and the distributions of the two test statistics are derived. From this analysis, we discuss the differences between the two detectors and explain the improved performance of the new detector. Numerical simulations attest to the validity of the analysis.
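For concreteness, the following Python sketch contrasts the two statistics under a simplified real-valued version of the model, H0: y ~ N(0, sigma0^2 I) versus H1: y = H theta + n with n ~ N(0, sigma1^2 I), where sigma0^2 is known and theta, sigma1^2 are unknown. All dimensions, variable names, and the unconstrained maximization over sigma1^2 are our own illustrative assumptions; the paper's exact statistic (e.g., if a constraint such as sigma1^2 >= sigma0^2 is enforced, or for complex-valued data) may differ.

```python
# Minimal sketch (not the paper's implementation) of two subspace detectors.
import numpy as np

rng = np.random.default_rng(0)
N, p = 64, 4             # observation length and subspace dimension (illustrative)
sigma0 = 1.0             # noise standard deviation, known under H0
H = rng.standard_normal((N, p))
Q, _ = np.linalg.qr(H)   # orthonormal basis of the presumed signal subspace
PH = Q @ Q.T             # orthogonal projector onto that subspace

def glrt_equal_power(y):
    """Classical matched-subspace statistic assuming known, equal noise power
    under both hypotheses: energy in the subspace normalized by sigma0^2.
    Under H0 it is chi-squared distributed with p degrees of freedom."""
    return np.sum((PH @ y) ** 2) / sigma0 ** 2

def glrt_unequal_power(y):
    """Sketch of the GLRT when sigma1^2 is unknown under H1. With the MLEs
    theta_hat (least squares) and sigma1_hat^2 = ||(I - PH) y||^2 / N,
    standard algebra gives
        2 log GLR = N log(sigma0^2 / sigma1_hat^2) + ||y||^2 / sigma0^2 - N."""
    r2 = np.sum(y ** 2) - np.sum((PH @ y) ** 2)   # residual energy ||(I-PH) y||^2
    sigma1_hat2 = r2 / N                          # unconstrained MLE of sigma1^2
    return N * np.log(sigma0 ** 2 / sigma1_hat2) + np.sum(y ** 2) / sigma0 ** 2 - N

# Toy comparison: a signal observed with elevated noise power under H1.
theta = rng.standard_normal(p)
y1 = H @ theta + 1.5 * sigma0 * rng.standard_normal(N)   # sigma1 > sigma0
print(glrt_equal_power(y1), glrt_unequal_power(y1))
```

Note that the second statistic depends on the residual energy outside the subspace as well as the total energy, so, unlike the equal-power detector, it can react to an increase in the apparent noise level under H1.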