Remote Virtual Companion via Tactile Codes and Voices for The People With Visual Impairment
10.16476/j.pibb.2023.0053
- VernacularTitle:通过触觉编码和声音为视力障碍者提供远程虚拟陪伴
- Author:
Song GE (1); Xuan-Tuo HUANG (2); Yan-Ni LIN (1); Yan-Cheng LI (2); Wen-Tian DONG (3); Wei-Min DANG (3); Jing-Jing XU (4); Ming YI (5); Sheng-Yong XU (1)
Author Information
1. Key Laboratory for the Physics & Chemistry of Nanodevices, School of Electronics, Peking University, Beijing 100871, China
2. School of Electronics Engineering and Computer Science, Peking University, Beijing 100871, China
3. Peking University Sixth Hospital, Peking University Institute of Mental Health, NHC Key Laboratory of Mental Health (Peking University), National Clinical Research Center for Mental Disorders (Peking University Sixth Hospital), Beijing 100191, China
4. School of Microelectronics, Shandong University, Jinan 250100, China
5. Key Laboratory for Neuroscience, School of Basic Medical Sciences, Neuroscience Research Institute, Department of Neurobiology, School of Public Health, Peking University, Beijing 100191, China
- Publication Type:Journal Article
- Keywords:
artificial visual aid;
remote virtual companion;
tactile code;
visually impaired users;
navigation
- From:
Progress in Biochemistry and Biophysics
2024;51(1):158-176
- Country:China
- Language:English
-
Abstract:
Objective: Existing artificial vision devices fall into two types, implanted and extracorporeal, each with drawbacks. The former requires surgical implantation, which may cause irreversible trauma; the latter suffers from defects such as relatively simple instructions, limited application scenarios, and over-reliance on the judgment of artificial intelligence (AI), which cannot guarantee the user's safety. Here we propose a system that supports voice interaction and converts information about the surrounding environment into tactile commands delivered to the head and neck. Compared with existing extracorporeal devices, our device offers a larger information capacity, lower cost, lower risk, and suitability for a wide variety of life and work scenarios.
Methods: Using the latest remote wireless communication and chip technologies, the microelectronic devices, cameras, and sensors worn by the user, and the large databases and computing power of the cloud, backend staff can gain full, real-time insight into the user's scenario, environmental parameters, and status from a remote location (for example, across the city). Meanwhile, by querying the cloud and in-memory databases, and aided by AI recognition and manual analysis, they can quickly formulate the most reasonable action plan and send instructions to the user. In addition, the backend staff can provide humanistic care and emotional support through voice dialogs.
Results: This study originally proposes the concept of a "remote virtual companion" and demonstrates the related hardware and software as well as test results. The system not only performs basic guide functions, such as helping a person with visual impairment shop in supermarkets, find seats in cafes, walk on the streets, assemble complex puzzles, and play cards, but also meets the demands of fast-paced daily tasks such as cycling.
Conclusion: Experimental results show that this "remote virtual companion" is applicable to a wide range of scenarios and demands. It can help blind people with travel, shopping, and entertainment, or accompany elderly people on trips, wilderness explorations, and travels.
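The pipeline described in the Methods section (environment recognition in the cloud, then a command sent to a wearable tactile array) can be sketched as follows. This is an illustrative sketch only: the paper does not publish its command set or code, so the event names and one-byte tactile codes below are hypothetical assumptions, not the authors' actual protocol.

```python
# Hypothetical mapping from recognized environment events to one-byte
# tactile codes that a head/neck-worn vibrotactile array could render.
# All names and code values are illustrative, not from the paper.
TACTILE_CODES = {
    "turn_left": 0x01,
    "turn_right": 0x02,
    "stop": 0x03,
    "obstacle_ahead": 0x04,
}

def encode_command(event: str) -> bytes:
    """Encode a recognized event as a single tactile command byte."""
    if event not in TACTILE_CODES:
        raise ValueError(f"unknown event: {event}")
    return bytes([TACTILE_CODES[event]])

# Example: the backend (AI plus manual analysis) detects an obstacle
# and transmits the corresponding code to the user's device.
print(encode_command("obstacle_ahead").hex())
```

In the system described by the abstract, such codes would be only one channel; ambiguous situations are resolved by the backend staff over the parallel voice link.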