Feature selection is an important problem in pattern classification systems. Mutual information is a good indicator of the relevance between variables and has been used as a measure in several feature selection algorithms. Because mutual information cannot be computed directly for continuous data in the Max-Relevance and Min-Redundancy (mRMR) algorithm, we combine the mRMR algorithm with fuzzy entropy, which avoids estimating probability densities. We test the new algorithm on several data sets with two different classifiers. Comparison with the Max-Dependency and the Max-Dependency and Min-Redundancy (mDMR) algorithms shows that the new algorithm is feasible and effective.
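The idea above can be sketched in code: compute a fuzzy entropy over membership degrees instead of estimating a probability density, then run the greedy mRMR loop that trades relevance to the class against redundancy with already-selected features. This is a minimal illustration, not the paper's exact method: the min-max membership function, the De Luca-Termini entropy, and the entropy-based dependency surrogate `fuzzy_mi` are all assumed choices for the sketch.

```python
import numpy as np

def fuzzy_entropy(mu, eps=1e-12):
    # De Luca-Termini fuzzy entropy of membership degrees in [0, 1]
    # (mean over samples; crisp memberships give ~0, mu = 0.5 gives ln 2)
    mu = np.clip(mu, eps, 1 - eps)
    return float(-np.mean(mu * np.log(mu) + (1 - mu) * np.log(1 - mu)))

def membership(x):
    # Min-max scaling used as a simple membership function (assumed choice;
    # any mapping of a continuous feature into [0, 1] would do here)
    lo, hi = float(x.min()), float(x.max())
    if hi <= lo:
        return np.full(x.shape, 0.5)
    return (x - lo) / (hi - lo)

def fuzzy_mi(mu_a, mu_b):
    # Entropy-based dependency surrogate H(A) + H(B) - H(A AND B),
    # with element-wise minimum as the fuzzy "AND" (an assumed surrogate,
    # standing in for the mutual-information term of mRMR)
    return (fuzzy_entropy(mu_a) + fuzzy_entropy(mu_b)
            - fuzzy_entropy(np.minimum(mu_a, mu_b)))

def mrmr_select(X, y, k):
    # Greedy mRMR: at each step pick the feature maximizing
    # relevance(feature, class) - mean redundancy(feature, selected)
    mus = [membership(X[:, j]) for j in range(X.shape[1])]
    mu_y = membership(y.astype(float))
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < k and remaining:
        def score(j):
            rel = fuzzy_mi(mus[j], mu_y)
            red = (np.mean([fuzzy_mi(mus[j], mus[s]) for s in selected])
                   if selected else 0.0)
            return rel - red
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

No density estimation appears anywhere: every quantity is computed directly from membership degrees, which is the practical advantage the abstract claims for the fuzzy-entropy variant.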