Friday April 10, 2026 12:15pm - 2:15pm GMT+07

Authors - Sriram V.A, Rajkumar P.N, Babu M
Abstract - Smart glasses promise to transform activities from daily navigation to factory work, yet adoption is held back by technical limitations, usability problems, and market fragmentation. On the algorithmic side, YOLOv8 achieves strong object detection for visually impaired users at 92.7% mAP@0.5, with 94% precision, 91% recall, and a 0.93 F1-score, while DeepLabv3+ delivers segmentation at 89% accuracy, 93% precision, 0.82 IoU, and 0.18 RMSE. Commercial devices lag behind: Meta Ray-Bans run on 85-160 mAh batteries for roughly 30 minutes of active use, eye tracking achieves only 1.2° RMSE against a sub-0.5° target, and custom CNNs reach 96% navigation accuracy with 0.12 m MAE but consume 40% more power in slim form factors. Interaction studies show gestures cut task times 35-40% relative to voice control (gaze input reaching 88% precision with a 12% error rate), but older users experience 25% higher cognitive load, dropping acceptance to 47%; 60% of users report gaze-control fatigue; and Wang's healthcare applications, despite 95% diagnostic recall, lose 30% usability without standardized interfaces. Meanwhile, 1.2 million Ray-Ban Metas sold by 2025, powered by Llama-3.1 at 87% query accuracy under 2 s latency, coexist with privacy concerns among 80% of users and interoperability issues across 70% of 101 reviewed studies; Yoo's industrial deployments nonetheless show 91% productivity gains (15 s/task MAE). Together these findings call for 400+ mAh batteries, ethical AI with under 5% false positives, and shared standards to capture the projected $31.5B market.
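The detection metrics quoted above (precision, recall, F1, IoU) are standard quantities; as a minimal sketch of how they relate, the snippet below computes F1 from the abstract's reported 94% precision and 91% recall, and IoU for a pair of illustrative bounding boxes. The function names and the example boxes are assumptions for illustration, not from the paper.

```python
def f1_score(precision: float, recall: float) -> float:
    """F1 is the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

def iou(box_a: tuple, box_b: tuple) -> float:
    """Intersection-over-union of two (x1, y1, x2, y2) axis-aligned boxes."""
    # Overlap rectangle: max of the left/top edges, min of the right/bottom edges.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# Precision/recall values taken from the abstract; F1 comes out ≈ 0.92,
# in line with the ~0.93 the abstract reports after rounding.
print(round(f1_score(0.94, 0.91), 3))

# Hypothetical boxes: overlap area 1, union area 7.
print(round(iou((0, 0, 2, 2), (1, 1, 3, 3)), 3))
```

Note that mAP@0.5 extends this: a predicted box counts as a true positive only when its IoU with a ground-truth box exceeds 0.5, and average precision is then computed over the precision-recall curve.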
Paper Presenter
Virtual Room D Bangkok, Thailand

