Paper: SP-P10.4
Session: Topics in Speech Enhancement
Time: Wednesday, May 19, 15:30 - 17:30
Presentation: Poster
Topic: Speech Processing: Speech Enhancement
Title: AUTOMATED LIP-READING FOR IMPROVED SPEECH INTELLIGIBILITY
Authors: Kevin Brady; MIT Lincoln Laboratory
Matthew McClain; University of Illinois
Michael Brandstein; MIT Lincoln Laboratory
Thomas Quatieri; MIT Lincoln Laboratory
Abstract: Various psychoacoustic experiments have concluded that visual features strongly affect the perception of speech. This contribution is most pronounced in noisy environments, where the intelligibility of audio-only speech degrades quickly. This paper explores the effectiveness of extracted visual features, such as lip height and width, for improving speech intelligibility in noisy environments. The intelligibility content of these extracted visual features is investigated through an intelligibility test on an animated rendition of the video generated from the extracted visual features, as well as on the original video. These experiments demonstrate that the extracted video features contain important aspects of intelligibility that may be utilized in speech enhancement and coding applications. Alternatively, these extracted visual features can be transmitted in a bandwidth-efficient way to augment speech coders.
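
As a rough illustration of the kind of geometric lip features the abstract mentions, the sketch below computes lip height and width per video frame from 2-D mouth-landmark coordinates. The landmark layout, array names, and function are assumptions made for illustration only and do not represent the authors' actual feature-extraction pipeline.

    # Minimal sketch (assumption): lip height and width from hypothetical
    # 2-D mouth landmark points for one video frame. The landmark layout
    # is illustrative, not the paper's actual extraction method.
    import numpy as np

    def lip_height_width(mouth_landmarks: np.ndarray) -> tuple[float, float]:
        """mouth_landmarks: (N, 2) array of (x, y) points around the lips."""
        xs, ys = mouth_landmarks[:, 0], mouth_landmarks[:, 1]
        width = xs.max() - xs.min()    # horizontal opening, corner to corner
        height = ys.max() - ys.min()   # vertical opening, upper to lower lip
        return height, width

    # Example frame with four illustrative landmark points (pixels)
    frame = np.array([[10.0, 5.0],    # left mouth corner
                      [30.0, 5.0],    # right mouth corner
                      [20.0, 0.0],    # upper-lip midpoint
                      [20.0, 12.0]])  # lower-lip midpoint
    h, w = lip_height_width(frame)
    print(f"lip height = {h:.1f} px, lip width = {w:.1f} px")

Tracking such scalar features over time yields a very low-rate side channel, which is consistent with the abstract's point about transmitting visual features in a bandwidth-efficient way alongside a speech coder.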