We propose a method for calibrating multi-camera systems that fits both the camera parameters and the 3D shape of observed objects to silhouette observations. Our method employs frontier points, geometrically meaningful points on an object's surface, to determine the geometric relations among the cameras. In contrast to conventional methods, the camera parameters and the 3D positions of the frontier points are estimated jointly by minimizing, over all cameras, the 2D reprojection errors between the projected frontier points and the observed silhouette contours. This makes it possible to obtain accurate calibration results without using any special instruments. Experimental results on real image data demonstrate the effectiveness of our method.
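The joint estimation described above can be illustrated with a minimal sketch. The following toy example (not the paper's implementation; all names and synthetic data are hypothetical) jointly refines one camera's translation and the 3D point positions by minimizing 2D reprojection errors over all cameras, where the "observed" contour samples stand in for the nearest silhouette-contour correspondences to the projected frontier points:

```python
import numpy as np
from scipy.optimize import least_squares

def project(K, R, t, X):
    """Pinhole projection of 3D points X (N,3) to 2D pixel coords (N,2)."""
    x = (K @ (R @ X.T + t[:, None])).T
    return x[:, :2] / x[:, 2:3]

# Synthetic two-camera setup (illustrative values only).
K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
R1, t1 = np.eye(3), np.array([0.0, 0.0, 5.0])
R2 = np.array([[0.0, 0, 1], [0, 1, 0], [-1.0, 0, 0]])  # 90-degree yaw
t2 = np.array([0.0, 0.0, 5.0])
X_true = np.array([[0.0, 0, 0], [1.0, 0.5, 0], [-0.5, 1.0, 0.5]])

# "Observed" 2D positions on the silhouette contours, one per frontier
# point per camera (here generated from the ground truth).
obs1 = project(K, R1, t1, X_true)
obs2 = project(K, R2, t2, X_true)

def residuals(params):
    """Stacked 2D projection errors for all cameras; the unknowns are
    camera 2's translation and all frontier-point positions jointly."""
    t2_est = params[:3]
    X_est = params[3:].reshape(-1, 3)
    r1 = project(K, R1, t1, X_est) - obs1
    r2 = project(K, R2, t2_est, X_est) - obs2
    return np.concatenate([r1.ravel(), r2.ravel()])

# Perturb the unknowns, then refine them jointly.
x0 = np.concatenate([t2 + 0.2, (X_true + 0.1).ravel()])
sol = least_squares(residuals, x0)
print(np.abs(residuals(sol.x)).max() < 1e-3)
```

In the full method, each residual would instead measure the distance from a projected frontier point to the nearest point on the observed silhouette contour, and all camera parameters (not just one translation) would enter the optimization.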