Friday April 10, 2026 3:00pm - 5:00pm GMT+07

Authors - Y. C. A. Padmanabha Reddy, Panigrahi Srikanth, Kavita Goura
Abstract - Advances in Artificial Intelligence, Machine Learning and Internet of Things technologies have enabled wearable devices to sense, process and respond to human behaviour in real time. While most wearable devices today are used for health and fitness tracking, many people face communication challenges such as language barriers, difficulty understanding emotions or social cues, social anxiety and accessibility issues for individuals with hearing or speech impairments. Existing systems often collect data but fail to provide meaningful, real-time assistance during actual human interactions. This research paper presents a literature-based study on AI-powered wearable devices designed to support and enhance human communication. The reviewed papers focus on intelligent wearables that use multimodal inputs such as microphones, cameras and other on-body sensors. These systems apply AI techniques to interpret speech, gestures, facial expressions and emotional signals in real time. The wearable devices considered include everyday consumer-oriented systems such as smart eyewear that provides audio-visual assistance and wrist-worn wearables that offer haptic feedback. The key focus of this study is to examine how such devices can deliver subtle, real-time support through visual prompts, audio cues or vibrations to improve conversational awareness and user confidence. The expected outcome is to identify current capabilities, practical limitations and design considerations for developing human-centric wearable technologies that move beyond passive tracking toward meaningful communication support.
Paper Presenter
Virtual Room F Bangkok, Thailand

