We propose a novel approach for recognizing hand gestures by analyzing the data streams generated by sensors attached to the human hand. We utilize the concept of ‘range of motion’ and exploit this characteristic to analyze the acquired data. We show that the relative ‘range of motion’ of each section of the hand involved in a gesture is a unique characteristic of that gesture, and therefore provides a signature for that gesture that is consistent across different users. Based on this observation, we propose a hand gesture recognition approach that addresses two major challenges: user-dependency and device-dependency. Furthermore, our approach requires neither calibration nor training. We apply it to recognizing American Sign Language (ASL) signs and show that static ASL signs can be recognized without any training phase. Our preliminary experiments demonstrate more than 75% accuracy in recognizing static ASL signs.
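The core idea, as stated above, is that the relative range of motion of each part of the hand forms a user-independent signature for a gesture. The abstract does not give the actual algorithm, but a minimal sketch of one plausible realization is shown below: compute each sensor channel's range of motion (max minus min) over a gesture window, normalize by the largest range so absolute sensor scale drops out, and match against stored per-gesture signatures by nearest neighbor. All function names and the signature format are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of range-of-motion based gesture matching
# (illustrative only; not the paper's actual method).

def range_of_motion(stream):
    """Per-channel range of motion (max - min) over a gesture window.

    `stream` is a list of samples; each sample is a list of sensor readings,
    one per hand section (e.g. per finger-joint sensor).
    """
    return [max(ch) - min(ch) for ch in zip(*stream)]

def normalize(rom):
    """Scale by the largest range so only *relative* motion remains.

    This is one way the signature could become user- and device-independent:
    absolute sensor units and hand size cancel out.
    """
    peak = max(rom) or 1.0
    return [r / peak for r in rom]

def classify(stream, signatures):
    """Return the gesture label whose stored signature is nearest
    (Euclidean distance) to the normalized ROM vector of `stream`."""
    sig = normalize(range_of_motion(stream))

    def dist(label):
        ref = signatures[label]
        return sum((a - b) ** 2 for a, b in zip(sig, ref)) ** 0.5

    return min(signatures, key=dist)
```

Because the signatures are relative ranges rather than raw sensor values, no per-user calibration or model training would be needed in this sketch: a single reference signature per gesture suffices.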