Camera sensors constitute an information-rich sensing modality with many potential applications in sensor networks. Their effectiveness in a sensor network setting, however, depends greatly on their ability to calibrate with respect to each other and to other sensors in the field. This paper examines node localization and camera calibration using the shared field of view of camera pairs. Using a new distributed camera sensor network, we compare two approaches from computer vision and propose an algorithm that combines a sparse set of distance measurements with image information to localize nodes accurately in 3D. We evaluate our algorithms on a network of iMote2 nodes equipped with COTS camera modules, in which sensor nodes identify themselves to cameras using modulated LED emissions. Our indoor experiments yielded errors of 2-7 cm in a 6×6 m room; our outdoor experiments in a 30×30 m field resulted in errors of 20-80 cm, depending on the method used.