We propose a modified discrete HMM that includes a feature weighting discrimination component. We assume that the feature space is partitioned into subspaces and that the relevance weights of the different subspaces depend on both the symbols and the states. In particular, we associate a partial probability with each symbol in each subspace. The overall observation state probability is then computed by aggregating the partial probabilities according to their relevance weights. We consider two aggregation models: the first is based on a linear combination, while the second is based on a geometric combination. For both models, we reformulate the Baum-Welch learning algorithm and derive the update equations for the relevance weights and the partial state probabilities. The proposed approach is validated on synthetic and real data sets, where both models outperform the baseline HMM.
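The two aggregation schemes can be illustrated with a minimal sketch. The exact functional forms are not given in the abstract, so the code below assumes the standard formulations for weighted aggregation: a linear (arithmetic) mixture, b_j(o) = sum_m w_jm * p_jm(o_m), and a geometric mixture, b_j(o) = prod_m p_jm(o_m)^w_jm, where the w_jm are the state- and subspace-dependent relevance weights. The function names and example values are hypothetical.

```python
import numpy as np

def linear_aggregation(partials, weights):
    """Linear model: weighted arithmetic mean of the partial probabilities.
    partials[m] is the partial symbol probability in subspace m;
    weights[m] is the relevance weight of subspace m for this state."""
    return float(np.dot(weights, partials))

def geometric_aggregation(partials, weights):
    """Geometric model: weighted geometric mean of the partial probabilities."""
    return float(np.prod(partials ** weights))

# Hypothetical example: 3 feature subspaces, relevance weights summing to 1.
partials = np.array([0.2, 0.5, 0.3])  # partial probabilities per subspace
weights = np.array([0.5, 0.3, 0.2])   # relevance weights for one state

b_linear = linear_aggregation(partials, weights)     # 0.5*0.2 + 0.3*0.5 + 0.2*0.3 = 0.31
b_geometric = geometric_aggregation(partials, weights)
```

The geometric model penalizes low partial probabilities more severely than the linear one (a near-zero partial probability drives the whole product toward zero), which is one common motivation for comparing the two combinations.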