We propose a new Continuous Hidden Markov Model (CHMM) structure that integrates a feature weighting component. We assume that each feature vector may comprise different subsets of features originating from different sources of information or different feature extractors. We modify the probability density function that characterizes the standard CHMM to include state- and component-dependent feature relevance weights. To learn the optimal feature weights from the training data, we modify the maximum likelihood-based Baum-Welch algorithm and derive the necessary update equations. The proposed approach is validated on synthetic and real data sets, and the results show that it outperforms the standard CHMM.
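For concreteness, the following is a minimal sketch of how a state- and component-dependent feature-weighted emission density might be written for state $i$ with $M$ mixture components and $S$ feature subsets; the exponent-weighted multi-stream form, the symbols $w_{iks}$, $c_{ik}$, $\mathbf{x}^{(s)}$, and the simplex constraint are illustrative assumptions and not necessarily the exact formulation derived here.
\begin{equation*}
% Illustrative only: our notation, not necessarily the paper's.
% w_{iks}: relevance weight of feature subset s in component k of state i;
% c_{ik}: mixing coefficient; x^{(s)}: the s-th feature subset of x.
b_i(\mathbf{x}) \;=\; \sum_{k=1}^{M} c_{ik}
\prod_{s=1}^{S}
\mathcal{N}\!\bigl(\mathbf{x}^{(s)};\,\boldsymbol{\mu}_{iks},\,\boldsymbol{\Sigma}_{iks}\bigr)^{\,w_{iks}},
\qquad
w_{iks} \ge 0,\quad \sum_{s=1}^{S} w_{iks} = 1 .
\end{equation*}
Under a sketch of this kind, the Baum-Welch re-estimation step would be extended with an additional maximization over the weights $w_{iks}$ subject to the simplex constraint, alongside the usual updates of the means, covariances, and mixing coefficients.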