
Yar - Assistive Wearable for the Visually Impaired

Yar is a small, wearable pendant that uses AI to help visually impaired people understand their surroundings, read text, navigate, and live more independently & freely in their daily lives.

  • This is Yar, an assistive wearable for the visually impaired designed to help them live independently

  • Users at the Singapore Association of the Visually Handicapped testing out prototype v6 of Yar on a bottle

  • Iterations of the Yar prototype that we have gone through over a period of 7 months

  • The latest upcoming version of Yar with LTE, smaller form factor & near instantaneous response times

What it does

Yar solves the problem of limited environmental awareness for visually impaired individuals. It's a compact wearable that recognizes objects, reads text, and describes scenes, enhancing users' independence in navigating their surroundings at an affordable price.


Your inspiration

Our journey with Yar began when we met Ajay, a visually impaired consultant at Deloitte & Yajat's close family friend. Hearing about Ajay's daily challenges opened our eyes to the struggles faced by the visually impaired community. Driven by a desire to make a difference, we researched existing AI solutions but found none that were seamlessly integrated, affordable, fast, and accurate. This gap inspired us to create something that could truly enhance independence and quality of life for people like Ajay. We named it yaR, meaning "friend" in Hindi, embodying our hope that it will become a trusted companion for those who need it most.


How it works

Yar is a small, pendant-like device worn around the neck. It houses a high-res camera, microphone, and powerful processor. When a user wants information about their surroundings, they activate Yar with a simple button press or voice command. The camera captures an image of the scene, which is then processed by multimodal LLMs. These LLMs can recognize objects, read text, and understand the overall layout of the environment. The microphone picks up the user's voice commands and questions. Yar's AI interprets the image and the user's query, combining visual understanding with natural language processing. It then generates a clear, concise answer that is conveyed to the user through the built-in speaker or connected headphones using natural-sounding text-to-speech, all within a few seconds. It connects to the internet via LTE, allowing remote use and access to up-to-date information. Yar also includes memory recall, enabling users to retrieve information from past interactions.
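To make that pipeline concrete, here is a minimal sketch of the capture-ask-answer loop, assuming an OpenCV-attached camera, an OpenAI-compatible multimodal endpoint, and pyttsx3 for text-to-speech. The model choice and all function names here are illustrative placeholders, not Yar's actual firmware.

    # Minimal sketch of Yar's capture-ask-answer loop (illustrative only).
    import base64
    import cv2
    import pyttsx3
    from openai import OpenAI

    client = OpenAI()          # assumes OPENAI_API_KEY is set in the environment
    tts = pyttsx3.init()
    history = []               # simple in-memory store backing "memory recall"

    def capture_frame() -> str:
        """Grab one frame from the camera and return it base64-encoded."""
        cam = cv2.VideoCapture(0)
        ok, frame = cam.read()
        cam.release()
        if not ok:
            raise RuntimeError("camera obstructed or unavailable")
        _, jpeg = cv2.imencode(".jpg", frame)
        return base64.b64encode(jpeg.tobytes()).decode()

    def answer(query: str) -> str:
        """Send the current image plus the user's question to a multimodal LLM."""
        image_b64 = capture_frame()
        messages = history + [{
            "role": "user",
            "content": [
                {"type": "text", "text": query},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }]
        reply = client.chat.completions.create(
            model="gpt-4o-mini",   # placeholder multimodal model
            messages=messages,
        ).choices[0].message.content
        # Store text only, so past answers can be recalled in later queries.
        history.append({"role": "user", "content": query})
        history.append({"role": "assistant", "content": reply})
        return reply

    def speak(text: str) -> None:
        tts.say(text)
        tts.runAndWait()

    if __name__ == "__main__":
        speak(answer("What is in front of me?"))

Keeping a running history list is one simple way to support the memory-recall feature described above: past questions and answers ride along with each new request, so the model can answer follow-ups like "what did that sign say earlier?"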


Design process

It all began when we met Yajat's close family friend Ajay, a consultant at Deloitte who has complete vision impairment. His story opened our eyes to the daily struggles of the visually impaired, & with AI and technology becoming so powerful, we felt compelled to act. Our first prototype was far from elegant - a bulky device with a camera - but it worked, & that sparked our drive to keep going. We gave it to Ajay, who helped us validate the pendant design. The real test came at the Singapore Association of the Visually Handicapped (SAVH), where user feedback proved invaluable. One tester pointed out how big the device was, leading us to miniaturize the design with a smaller board and camera. Jason, another tester, struggled with the buttons, so we redesigned for better tactile feedback. We also boosted battery life, expanded the camera's field of view after Linda mentioned missing objects just out of frame, & added vibration alerts for camera obstruction. The pendant design won out when users expressed a preference for wearable over handheld options. Throughout our development journey with users, we continually fine-tuned the AI's speed and memory recall, enhancing object recognition and text reading so that responses arrive within a few seconds and meet the diverse needs of our users.


How it is different

Yar stands out with its sleek, pendant-like design, which blends ease of use with seamless integration. Unlike mobile apps that require constantly opening an app, taking photos, & interacting with a screen, or bulky smart glasses that can be awkward to wear, Yar's pendant design offers a discreet, comfortable solution. It's activated with a simple voice command or button press, making it effortless for visually impaired users. Yar incorporates LTE connectivity for constant information access without Wi-Fi dependence. Its optimized multimodal AI models enable near-instantaneous responses, surpassing slower competitors. Yar combines object recognition, text reading, and scene description in one compact, water-resistant device with long battery life. Its memory feature for recalling past information is unique in the market. With its comprehensive features, affordability focus, and practical design, Yar offers a better solution for enhancing independence in daily life.


Future plans

As we continue to work with SAVH, we aim to further miniaturize yaR to enhance comfort and discretion. We also plan to implement constant object detection, allowing the AI to continuously analyze the environment and provide real-time updates rather than relying on button-triggered events; this idea came from SAVH users asking whether the device could discern the numbers of different buses arriving at a stop (a sketch of this mode follows below). We're also exploring the integration of LiDAR to accurately measure distances, addressing the current limitation in distance estimation and improving yaR's ability to understand complex scenes & dynamic environments.
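One way the planned always-on detection mode could work is a lightweight on-device detector polling the camera continuously; the sketch below assumes a small COCO-pretrained YOLOv8 model via the ultralytics package and uses the bus scenario from the SAVH feedback above. The names and thresholds are illustrative, not a committed design.

    # Sketch of an always-on detection loop (illustrative, not Yar's firmware).
    import time
    import cv2
    from ultralytics import YOLO

    model = YOLO("yolov8n.pt")   # small COCO-pretrained model; "bus" is a class

    def watch_for(label: str, announce) -> None:
        """Poll the camera and call announce() whenever `label` is detected."""
        cam = cv2.VideoCapture(0)
        try:
            while True:
                ok, frame = cam.read()
                if not ok:
                    continue
                result = model(frame, verbose=False)[0]
                names = [result.names[int(c)] for c in result.boxes.cls]
                if label in names:
                    # A full implementation would crop the detected region and
                    # OCR the route number before speaking it to the user.
                    announce(f"A {label} is approaching.")
                    time.sleep(5)    # debounce repeated announcements
        finally:
            cam.release()

    # Example: watch_for("bus", print)

Running a compact detector locally keeps the continuous loop responsive and off the LTE link, reserving the cloud multimodal models for on-demand questions such as reading the route number once a bus is detected.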


Awards

  • 1517 Medici Grant
  • Student Entrepreneurship Programme - iLab NTU
  • Microsoft Startup Founders Programme
  • Engineering Innovation Challenge

