
Computer-Assisted Lip Reading Recognition for Hearing Impaired

In human communication, a speaker's facial expressions and lip movements carry rich linguistic information. The hearing impaired can therefore use lip reading as a communication tool, alongside whatever residual hearing they have. With a computer-assisted lip-reading system, they can practice lip reading freely, without constraints of time, place, or situation. We therefore propose a computer-assisted lip-reading system (CALRS) that recognizes whether the lip shape of a phonetic pronunciation is correct, built with image-processing methods, an object-oriented language, and a neural network. The system compares lip images of Mandarin phonetic pronunciations using a self-organizing map neural network (SOMNN) and extension theory, helping hearing-impaired users correct their pronunciation.
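The abstract does not give implementation details of the SOMNN component. As a rough illustration only, the sketch below trains a small self-organizing map on feature vectors (e.g., features extracted from lip images) and maps a new vector to its best-matching unit; all function names, grid sizes, and parameters here are hypothetical choices, not the authors' actual system.

```python
import numpy as np

def train_som(data, grid_h=4, grid_w=4, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small self-organizing map on the row vectors of `data`."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    weights = rng.random((grid_h, grid_w, dim))
    # Grid coordinates of each node, used for the neighborhood function.
    coords = np.stack(
        np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij"), axis=-1
    )
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)        # decaying learning rate
        sigma = sigma0 * np.exp(-t / epochs)  # shrinking neighborhood radius
        for x in data:
            bmu = best_matching_unit(weights, x)
            # Squared grid distance from every node to the winner.
            d2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
            h = np.exp(-d2 / (2 * sigma ** 2))[..., None]  # neighborhood weight
            weights += lr * h * (x - weights)  # pull nodes toward the sample
    return weights

def best_matching_unit(weights, x):
    """Return the (row, col) of the map node closest to vector x."""
    dists = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(dists), dists.shape)
```

In a lip-reading setting, pronunciations with similar lip shapes would land on nearby map nodes, so a learner's attempt can be compared against the node of the correct pronunciation.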
Yun-Long Lay, Hui-Jen Yang, Chern-Sheng Lin
Type Journal
Year 2009
Where HCI