Heuristic measures for estimating the quality of attributes mostly assume that attributes are independent, so their performance is poor in domains with strong dependencies between attributes. Relief and its extension ReliefF are capable of correctly estimating the quality of attributes in classification problems with strong dependencies between attributes. By exploiting local information provided by different contexts, they provide a global view. We present an analysis of ReliefF which led us to its adaptation to regression (continuous class) problems. Experiments on artificial and real-world data sets show that Regressional ReliefF correctly estimates the quality of attributes under various conditions and can be used for non-myopic learning of regression trees. Regressional ReliefF and ReliefF provide a unified view of attribute quality estimation in regression and classification.
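To make the "local information, global view" idea concrete, the following is a minimal sketch of the classic Relief weight update for a classification problem with numeric attributes: for a randomly chosen instance, the weight of each attribute is decreased by its normalized difference to the nearest hit (same class) and increased by its difference to the nearest miss (different class). This is an illustrative simplification assumed from the standard Relief family, not the RReliefF procedure described in the paper; the function name `relief_sketch` and its parameters are hypothetical.

import numpy as np

def relief_sketch(X, y, n_iter=None, seed=None):
    """Toy Relief estimate of attribute quality (higher weight = better attribute)."""
    rng = np.random.default_rng(seed)
    n, a = X.shape
    m = n_iter or n
    span = X.max(axis=0) - X.min(axis=0)
    span[span == 0] = 1.0                       # avoid division by zero for constant attributes
    w = np.zeros(a)
    for _ in range(m):
        i = rng.integers(n)
        d = np.abs(X - X[i]) / span             # per-attribute normalized differences diff(A, R, .)
        dist = d.sum(axis=1)                    # Manhattan distance to the chosen instance R
        dist[i] = np.inf                        # exclude R itself from the neighbour search
        same = (y == y[i])
        hit = np.argmin(np.where(same & (dist < np.inf), dist, np.inf))   # nearest hit H
        miss = np.argmin(np.where(~same, dist, np.inf))                   # nearest miss M
        w += (d[miss] - d[hit]) / m             # W[A] := W[A] - diff(A,R,H)/m + diff(A,R,M)/m
    return w

Attributes whose values separate instances of different classes in the local neighbourhood accumulate positive weight, which is how the method detects dependent attributes that myopic, independence-based measures miss; RReliefF replaces the hit/miss distinction with a probabilistic model of the continuous class difference.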