In this paper, we propose a novel nonparametric modeling technique, Space Kernel Analysis (SKA), which arises from the definition of the space kernel. We analyze the uncertainty of SKA and show that SKA is subject to the bias/variance dilemma. Nevertheless, we demonstrate that, with a proper choice of the space kernel matrix, SKA can balance robustness and accuracy and can thereby outperform other kernel-based learning methods. The cost function of SKA is derived, and it is shown that SKA minimizes a Weighted Least Squares cost function whose weight matrix is diagonal and determined by the space kernel matrix. The parallels between SKA and several other nonparametric modeling techniques are examined. The analysis shows that traditional Kernel Regression, the General Regression Neural Network, Similarity Based Modeling, and the Radial Basis Function Network are all special cases of SKA with specific choices of the space kernel matrix.
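To illustrate the cost-function result stated above, a generic Weighted Least Squares objective with a diagonal weight matrix can be sketched as follows; the notation here (design matrix $X$, target vector $y$, parameter vector $\theta$, and diagonal weights $w_i$, which in SKA would be determined by the space kernel matrix) is assumed for illustration and is not taken from the paper:
\[
  J(\theta) \;=\; (y - X\theta)^{\mathsf{T}} W \,(y - X\theta)
            \;=\; \sum_{i=1}^{n} w_i \bigl(y_i - x_i^{\mathsf{T}}\theta\bigr)^{2},
  \qquad W = \operatorname{diag}(w_1, \dots, w_n).
\]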