We introduce, characterize, and test a vision-based dead-reckoning system for wearable computing that tracks the user's trajectory in an unknown, non-instrumented environment by integrating optical flow. Only a single inexpensive body-worn camera is required, which may be reused for other purposes such as HCI. Results show that distance estimates are accurate (6-12% error), while rotation tends to be underestimated. The accumulation of errors is compensated by identifying previously visited locations and "closing the loop", which greatly enhances accuracy; similar locations are identified through opportunistic use of wireless signatures. No a priori knowledge of the environment, such as a map, is needed, so the system is well suited to wearable computing. We identify the limitations of this approach and suggest future improvements.
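
To make the dead-reckoning idea concrete, the following is a minimal sketch of trajectory integration from optical flow, assuming a downward- or forward-facing body-worn camera. The Farneback dense flow from OpenCV stands in for whatever flow estimator the system actually uses, and the calibration constants `PX_TO_M` and `PX_TO_RAD` are hypothetical, not values from the paper.

```python
import cv2
import numpy as np

PX_TO_M = 0.002    # assumed scale: metres of travel per pixel of mean flow
PX_TO_RAD = 0.001  # assumed gain: radians of turn per pixel of lateral flow

def track(video_path):
    """Integrate frame-to-frame optical flow into a 2-D trajectory."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    x, y, heading = 0.0, 0.0, 0.0
    path = [(x, y)]
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Dense optical flow between consecutive frames (Farneback).
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        dx, dy = flow[..., 0].mean(), flow[..., 1].mean()
        # Assumption: mean lateral flow approximates rotation, and mean
        # vertical flow approximates forward translation.
        heading += dx * PX_TO_RAD
        step = dy * PX_TO_M
        x += step * np.cos(heading)
        y += step * np.sin(heading)
        path.append((x, y))
        prev = gray
    cap.release()
    return path
```

Because each step adds a small error, the estimated path drifts over time, which is what the loop-closing step corrects.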
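The opportunistic loop detection could look roughly like the sketch below. A "signature" here is a hypothetical mapping from access-point identifiers (e.g. WiFi BSSIDs) to signal strength; the Jaccard similarity and the `MATCH_THRESHOLD` constant are illustrative choices, not the paper's exact method.

```python
MATCH_THRESHOLD = 0.8  # assumed similarity above which two scans match

def similarity(sig_a, sig_b):
    """Jaccard overlap of the access points seen in two wireless scans."""
    seen_a, seen_b = set(sig_a), set(sig_b)
    if not seen_a or not seen_b:
        return 0.0
    return len(seen_a & seen_b) / len(seen_a | seen_b)

def find_revisit(history, current_sig):
    """Return the index of an earlier scan matching the current one, if any."""
    for i, past_sig in enumerate(history):
        if similarity(past_sig, current_sig) >= MATCH_THRESHOLD:
            return i  # candidate loop closure: correct accumulated drift here
    return None
```

When a revisit is detected, the accumulated drift between the two visits can be redistributed along the intervening trajectory, which is what "closing the loop" refers to.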