‘AiSee’ – headphone-like wearable device for the visually impaired that detects objects and does not require smartphone pairing

The National University of Singapore (NUS) has developed ‘AiSee’, a discreet wearable that combines a 13-megapixel camera with a bone conduction speaker to detect objects in real time.

If the object is recognised, the user is told what it is, along with a brief description, using a computer-generated voice relayed back through the bone conduction headphones. Because bone conduction leaves the ears uncovered, the wearer can still hear the world around them, a feature of vital importance to someone who relies heavily on their sense of hearing to navigate.

NUS researchers develop AI-powered ‘eye’ for visually impaired people to ‘see’ objects

Shopping for groceries is a common activity for many of us, but for visually impaired people, identifying grocery items can be daunting. A team of researchers from the National University of Singapore’s School of Computing (NUS Computing) introduced AiSee, an affordable wearable assistive device that helps people with visual impairment ‘see’ objects around them with the help of artificial intelligence (AI). 

Individuals with visual impairment face daily hurdles, particularly with object identification, which is crucial for both simple and complex decision-making. While breakthroughs in AI have dramatically improved visual recognition capabilities, real-world application of these advanced technologies remains challenging and error-prone.

AiSee, which was first developed in 2018 and progressively upgraded over a span of five years, aims to overcome these limitations by leveraging state-of-the-art AI technologies. 

“With AiSee, our aim is to empower users with more natural interaction. By following a human-centred design process, we found reasons to question the typical approach of using glasses augmented with a camera. People with visual impairment may be reluctant to wear glasses to avoid stigmatisation. Therefore, we are proposing an alternative hardware that incorporates a discreet bone conduction headphone,” said lead researcher of Project AiSee Associate Professor Suranga Nanayakkara, who is from the Department of Information Systems and Analytics at NUS Computing. 

The user simply holds up an object and activates the in-built camera to capture an image of it. With the help of AI, AiSee identifies the object and provides more information when queried by the user.

How does AiSee work?

AiSee comprises three key components:

(1) The eye: Vision engine computer software

AiSee incorporates a micro-camera that captures the user’s field of view. The captured images are processed by AiSee’s software component, also referred to as the ‘vision engine computer’, which extracts features such as text, logos, and labels from the image for further processing.
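The article does not disclose AiSee’s actual software stack, but the vision engine step can be pictured with the following minimal Python sketch, which grabs one frame from a camera and extracts any readable text. OpenCV and Tesseract OCR are stand-ins chosen for illustration, not the components AiSee itself uses.

# Minimal sketch of a "vision engine" step: capture one frame and pull out
# readable text such as product labels. OpenCV and Tesseract are stand-ins;
# the article does not name AiSee's real stack.
import cv2                 # pip install opencv-python
import pytesseract         # pip install pytesseract (Tesseract binary required)

def capture_frame(camera_index: int = 0):
    """Capture a single frame from the wearable's camera."""
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("Could not read from camera")
    return frame

def extract_text(frame) -> str:
    """Extract visible text (labels, packaging) from the captured image."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return pytesseract.image_to_string(gray).strip()

if __name__ == "__main__":
    image = capture_frame()
    print("Detected text:", extract_text(image))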

(2) The brain: AI-powered image processing unit and interactive Q&A system

After the user snaps a photo of the object of interest, AiSee uses sophisticated cloud-based AI algorithms to process and analyse the captured image and identify the object. The user can also ask a range of questions to find out more about the object.
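The cloud service behind AiSee is not named in the article. Purely as an illustration, the hand-off of a captured image to a remote recognition service might look like the sketch below, where the endpoint URL and the response field are hypothetical placeholders.

# Sketch of the cloud hand-off: a JPEG-encoded frame is posted to a remote
# recognition service and an object label comes back. The URL and response
# schema are hypothetical; the article does not disclose AiSee's provider.
import cv2
import requests

RECOGNITION_URL = "https://example.com/aisee/recognize"  # hypothetical endpoint

def identify_object(frame) -> str:
    """Send the captured frame to a cloud recognition service."""
    ok, jpeg = cv2.imencode(".jpg", frame)
    if not ok:
        raise RuntimeError("Could not encode frame")
    resp = requests.post(
        RECOGNITION_URL,
        files={"image": ("frame.jpg", jpeg.tobytes(), "image/jpeg")},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["label"]   # assumed response schema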

AiSee employs advanced speech-to-text and text-to-speech technology to comprehend the user’s queries and deliver spoken responses. Powered by a large language model, AiSee excels in interactive question-and-answer exchanges, enabling the system to accurately understand and respond to the user’s queries in a prompt and informative manner.
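Again as an illustration only, the interactive question-and-answer loop could be wired up along these lines, using the SpeechRecognition and pyttsx3 libraries as stand-ins for AiSee’s speech components and a placeholder ask_llm() for whichever language model the system actually queries.

# Sketch of the Q&A loop: speech-to-text captures the user's question, a
# large language model answers it in the context of the identified object,
# and text-to-speech reads the answer back. Libraries and ask_llm() are
# stand-ins, not AiSee's actual components.
import speech_recognition as sr   # pip install SpeechRecognition (needs PyAudio)
import pyttsx3                    # pip install pyttsx3

def listen() -> str:
    """Convert the user's spoken question to text."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        audio = recognizer.listen(source)
    return recognizer.recognize_google(audio)

def speak(text: str) -> None:
    """Read the answer back through the (bone conduction) speaker."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

def ask_llm(prompt: str) -> str:
    """Placeholder for a call to a cloud-hosted large language model."""
    raise NotImplementedError("Wire this to an LLM provider of choice")

def answer_question(object_label: str) -> None:
    question = listen()
    prompt = f"The user is holding: {object_label}. Question: {question}"
    speak(ask_llm(prompt))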

Unlike most wearable assistive devices, which require smartphone pairing, AiSee operates as a self-contained system that can function independently without the need for any additional devices.

(3) The speaker: Bone conduction sound system

The headphone of AiSee utilises bone conduction technology, which enables sound transmission through the bones of the skull. This ensures that individuals with visual impairment can effectively receive auditory information while still having access to external sounds, such as conversations or traffic noise. This is particularly vital for visually impaired people as environmental sounds provide essential information for decision-making, especially in situations involving safety considerations.

“At present, visually impaired people in Singapore do not have access to assistive AI technology of this level of sophistication. Therefore, we believe that AiSee has the potential to empower visually impaired people to independently accomplish tasks that currently require assistance. Our next step is to make AiSee affordable and accessible to the masses. To achieve this, we are making further enhancements, including a more ergonomic design and a faster processing unit,” explained Assoc Prof Nanayakkara.

NUS student Mark Myres, who helped to test AiSee as a visually impaired user, commented, “A lot of time, assistive devices seem very targeted at totally blind people or visually impaired people. I think AiSee is a good balance. Both visually impaired and blind people could get a lot of benefits from this.”

User testing and further enhancements

Assoc Prof Nanayakkara and his team are currently in discussions with SG Enable in Singapore to conduct user testing with persons with visual impairment. The findings will help to refine and improve AiSee’s features and performance. In addition, B.P. De Silva Holdings Pte Ltd has made a generous gift of S$150,000 to support the project.

B.P. De Silva Holdings’ decision to contribute towards the development of AiSee is rooted in its commitment to corporate social responsibility and a genuine desire to make a positive impact on society, as part of a broader mission of fostering inclusivity and accessibility. Its philanthropic endeavour also reflects its belief in the transformative power of technology to address societal challenges and create a more equitable and inclusive world.

Ms Ku Geok Boon, Chief Executive Officer of SG Enable, said, “Innovative solutions enabled by assistive technologies can change the lives of persons with disabilities, whether in supporting them to live more independently or lowering barriers to employment. As the focal agency and sector enabler for disability and inclusion in Singapore, SG Enable is happy to work with partners like NUS and B.P. De Silva Holdings Pte Ltd to leverage technology to empower persons with disabilities.”

https://news.nus.edu.sg/