Avatar-Based Sign Language Interpretations for Weather Forecast and Other TV Programs

Juhyun Oh, Seonggyu Jeon, Byungsun Kim, Minho Kim, Sangwook Kang, Hyukchul Kwon, Iktae Kim, Youngho Song

Whereas closed-caption broadcasting covers almost the entire broadcast schedule in Korea, sign language interpretation does not. By translating real-time closed captions, three-dimensional (3D) sign language interpretation can be provided for far more broadcast time. We propose a sign language broadcasting system for weather forecasts and extend it to all kinds of TV programs. To find the frequency of each word, we analyzed the last three years of weather forecast scripts and an open-domain corpus of about 1.2 million words from the Korean Broadcasting System. We use KorLex, the Korean WordNet, to build a sign language synonym dictionary and to perform word sense disambiguation, improving translation performance. Optically captured sign language motions drive the 3D avatar, which presents sign language with motion blending. We developed an on-demand mobile sign language weather forecast application and a real-time sign language interpretation system for all kinds of TV programs.

Electronic ISSN: 2160-2492
Published: 2017-01
Content type: Original Research
Keywords: machine translation, sign language broadcasting
DOI: 10.5594/JMI.2016.2632278