We present several enhancements to our previously introduced algorithm for the sequential construction of a hybrid network of radial and perceptron hidden units [6]. At each stage, the algorithm subdivides the input space so as to reduce the entropy of the data conditioned on the clusters. It decides whether a radial or a perceptron unit is required in a given region of the input space by comparing the local likelihood of the model under each unit type. Given an error target, the algorithm also determines the number of hidden units, which often yields a final architecture much smaller than an RBF network or an MLP. We report a benchmark on six classification problems; the most striking performance improvement is achieved on the vowel data set [8].
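As an illustration of the unit-selection step described above, the sketch below compares the local likelihood of the data in one region under a radial (Gaussian) unit and under a perceptron (sigmoidal) unit, and picks the better-fitting type. It is a minimal, hypothetical sketch for binary labels, not the paper's actual procedure; the initialisation of the Gaussian centre/width and of the perceptron weights is assumed here purely for demonstration.

```python
import numpy as np

def gaussian_local_log_likelihood(X, y, center, width):
    # Radial unit: Gaussian bump around the region's centre, used as P(y=1 | x).
    act = np.exp(-np.sum((X - center) ** 2, axis=1) / (2.0 * width ** 2))
    p = np.clip(act, 1e-9, 1 - 1e-9)
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def perceptron_local_log_likelihood(X, y, w, b):
    # Perceptron unit: sigmoid of a linear projection, used as P(y=1 | x).
    act = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    p = np.clip(act, 1e-9, 1 - 1e-9)
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

def choose_unit(X_region, y_region):
    # Illustrative initialisations (assumptions, not the authors' method):
    # Gaussian centred on the region, perceptron oriented along the class-mean difference.
    center = X_region.mean(axis=0)
    width = X_region.std() + 1e-6
    mu1 = X_region[y_region == 1].mean(axis=0) if np.any(y_region == 1) else center
    mu0 = X_region[y_region == 0].mean(axis=0) if np.any(y_region == 0) else center
    w = mu1 - mu0
    b = -w @ center
    ll_rbf = gaussian_local_log_likelihood(X_region, y_region, center, width)
    ll_perc = perceptron_local_log_likelihood(X_region, y_region, w, b)
    # The unit type with the higher local likelihood is added for this region.
    return ("radial", ll_rbf) if ll_rbf >= ll_perc else ("perceptron", ll_perc)
```

In a sequential construction of this kind, a decision like `choose_unit` would be applied to each newly created region, and units would be added until the overall error falls below the given target.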