Special Session in IEEE EMBC 2006
Information-Based Support Technologies for Persons with Visual Disabilities

A special session has been organized at an international conference by a Japanese research group funded under the Grant-in-Aid for Scientific Research of the Japanese Ministry of Education, Culture, Sports, Science and Technology, project "Basic Research on the Communicative Functions of the Elderly and Persons with Disabilities." The full official title of the conference is the 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, abbreviated to IEEE EMBC 2006. In this session, four research groups from the larger Grant-in-Aid project will each present a paper on visual disabilities, the use of ICT, and the cognition of visual, auditory, and tactile information. In addition, we have invited two speakers from the USA and France who have long been engaged in research on cognition and persons with visual impairments. The session will be a good opportunity to obtain comprehensive information in this area, so please consider attending it in New York in August-September.

General information about the conference and the titles of the papers to be presented in the session are as follows.

* Conference Title: 28th Annual International Conference of the IEEE Engineering in Medicine and Biology Society
* Date: August 30-September 3, 2006
* Conference Site and Hotel: New York Marriott Marquis Hotel at Times Square, NYC, NY, USA
* Session Title: Information-Based Support Technologies for Persons with Visual Disabilities
* Session Time: Friday September 1, 10:45-12:15
* Session Room: Marquis C
* Papers to be presented in the session
(1) [Invited paper] E. Sampaio (Laboratoire Brigitte Frybourg, France):
"Cognitive Aspects of Electronic Visuo-Tactile Substitution for People with Visual Disability."
(2) [Invited paper] J. R. Marston (Univ. of California Santa Barbara, USA):
"Using Remote Infrared Audible Cues to Increase Spatial Knowledge Aquisition for Persons Who are Blind or Visually Impaired in a Multimodal Transportaion environment."
(3) M. Miyakawa, Y. Maeda, Y. Miyazawa, and J. Hori (Niigata Univ., Japan):
" A Smart Video Magnifier Controlled by the Visibility Signal of a Low Vision User."
(4) T. Watanabe, S. Oouchi, T. Yamaguchi (Nat. Inst. of Special Education, Japan), M. Shimojo (Univ. of Electro-Communications, Japan) and S. Shimada (Tokyo Metropolitan Industrial Technology Res. Inst.):
"Development of a Measuring System of Contact Force during Braille Reading using an Optical 6-Axis Force Sensor. "
(5) T. Nishimoto, S. Sako, S. Sagayama (Univ. of Tokyo, Japan), K. Ohshima, K. Oda, and T. Watanabe (Tokyo Woman's Christian Univ., Japan):
" Effect of Learning on Listening to Ultra-Fast Synthesized Speech."
(6) M. Miyagi, Y. Horiuchi, M. Nishida, and A. Ichikawa (Chiba Univ., Japan):
"Analysis of prosody in finger Braille using electromyography."
* Session Organizers: Tetsuya Watanabe (Nat. Inst. of Special Education, Japan) and Michio Miyakawa (Niigata Univ., Japan)

Abstracts of the papers to be presented in the session

(1)
Cognitive Aspects of Electronic Visuo-Tactile Substitution for People with Visual Disability

Eliana Sampaio (Laboratoire Brigitte Frybourg, France)

Abstract
A person who has suffered the total loss of a sensory system has, indirectly, suffered a brain lesion. The loss is usually due to the destruction of receptors and/or neural pathways rather than of the related brain areas. A number of laboratory studies of sensory substitution have demonstrated that information from artificial sensory receptors can be delivered to the brain and that, with training, subjective experiences of the lost sensory system can be evoked, but such systems have proven to be impractical. A newly developed human-machine interface, with an array of electrical stimulators on the tongue, offers the realistic possibility of practical, cosmetically acceptable devices for persons with sensory loss. Among the potential beneficiaries are blind, deaf, and tactile insensate (e.g., from spinal cord injury) persons, and persons with vestibular loss. (Reviewed paper)

(2)
Using Remote Infrared Audible Cues to Increase Spatial Knowledge Acquisition for Persons Who Are Blind or Visually Impaired in a Multimodal Transportation Environment

James R. Marston (Univ. of California Santa Barbara, USA)

Abstract
This paper reports on a field test with blind travelers at a large multi-modal train terminal. Participants explored the area and performed transfer-related tasks using either their regular methods of navigation or a form of auditory signage installed in the environment. Those who used the auditory signage exhibited a better understanding of the spatial relationships among locations. They were able to take more shortcuts, and their answers to questions showed that they understood the spatial layout and had a much better cognitive map than those who used their regular methods of navigation and orientation. (Reviewed paper)

(3)
A Smart Video Magnifier Controlled by the Visibility Signal of a Low Vision User

Michio Miyakawa, Yoshinobu Maeda, Youichi Miyazawa, and Junichi Hori (Niigata Univ., Japan)

Abstract
A smart video magnifier for people with visual disabilities is being developed to support stress-free reading. With a video magnifier, the user reads the page of a book displayed on a monitor screen, which requires eye movement. The difficulty of character recognition, which depends on the environmental conditions, is reflected in that eye movement. Accordingly, information on the user's visibility is extracted from physiological signals that accompany the gazing motion, and these signals are used to control the video magnifier. The advantages and usefulness of this adaptive video magnifier are discussed in this paper.
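To illustrate the control idea described in this abstract, here is a minimal sketch, in Python, of how a visibility signal derived from the user's eye movement might adjust magnification adaptively. The function name, thresholds, and step size are hypothetical placeholders, not the authors' implementation.

def adjust_magnification(current_mag, visibility,
                         low=0.3, high=0.7,
                         step=0.25, min_mag=1.0, max_mag=8.0):
    """Raise magnification when visibility is poor and lower it when it is good."""
    if visibility < low:            # user appears to struggle: zoom in
        current_mag = min(current_mag + step, max_mag)
    elif visibility > high:         # reading is easy: zoom out to show more text
        current_mag = max(current_mag - step, min_mag)
    return current_mag

# Example run with an arbitrary, simulated visibility trace in [0, 1].
mag = 2.0
for v in [0.20, 0.25, 0.50, 0.80, 0.90]:
    mag = adjust_magnification(mag, v)
    print(f"visibility={v:.2f} -> magnification={mag:.2f}x")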

(4)
Development of a Measuring System of Contact Force during Braille Reading using an Optical 6-Axis Force Sensor

Tetsuya Watanabe, Susumu Oouchi, Toshimitsu Yamaguchi (Nat. Inst. of Special Education, Japan), Makoto Shimojo (Univ. of Electro-Communications, Japan) and Shigenobu Shimada (Tokyo Metropolitan Industrial Technology Res. Inst.)

Abstract
A system using an optical 6-axis force sensor was developed to measure the contact force during Braille reading. In using this system, we dealt with two problems. One is the variability of the output values depending on the contact point, which was solved by using two transformation techniques. The other is that subjects have to read Braille in an irregular manner; we compared two manners of Braille reading, one-handed vs. two-handed, and found only a small reduction in reading speed. We have collected data from two Braille readers with this system and obtained more detailed contact-force trajectories than those reported in earlier studies.
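The abstract does not spell out the two transformation techniques; a closely related and widely used computation with a 6-axis force/torque sensor is estimating the contact point from the force and moment readings. The Python sketch below shows that idea under the assumption of a flat contact plate at height h above the sensor origin; the variable names and example values are ours, not the authors'.

def contact_point(force, moment, h=0.0):
    """Estimate the contact point (x, y) on a flat plate at height h above the
    sensor origin from one 6-axis reading, using M = r x F with r = (x, y, h)."""
    fx, fy, fz = force
    mx, my, mz = moment
    if abs(fz) < 1e-9:              # no normal force: contact point is undefined
        return None
    x = (h * fx - my) / fz
    y = (mx + h * fy) / fz
    return x, y

# Example reading (arbitrary values, forces in N, moments in N*m):
print(contact_point((0.10, 0.05, 1.20), (0.012, -0.018, 0.0)))   # approx. (0.015, 0.01)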

(5)
Effect of Learning on Listening to Ultra-Fast Synthesized Speech

Takuya Nishimoto, Shinji Sako, Shigeki Sagayama (The University of Tokyo), Kazue Ohshima, Koichi Oda, and Takayuki Watanabe (Tokyo Woman's Christian University)

Abstract
A text-to-speech synthesizer that produces easily understandable voices at very fast speaking rates is expected to help persons with visual disabilities acquire information effectively with screen-reading software. We investigated the intelligibility of Japanese text-to-speech systems at fast speaking rates, using four-digit random numbers as the vocabulary of a recall test. We also studied a fast and intelligible text-to-speech engine, using an HMM-based synthesizer trained on a corpus recorded at a fast speaking rate. As a result, the statistical models trained with the fast-speaking corpus were effective. The learning effect was significant in the early stage of the trials, and the effect was sustained for several weeks.
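As a small illustration of the evaluation described here, the sketch below generates four-digit random-number stimuli and scores a recall test as the fraction of items reproduced exactly. It is a simplified stand-in for the authors' protocol: the stimulus count, seed, and response set are made up, and in the actual experiment the stimuli are presented as synthesized speech.

import random

def make_stimuli(n_items=20, seed=0):
    """Four-digit random numbers used as the recall-test vocabulary."""
    rng = random.Random(seed)
    return [f"{rng.randint(0, 9999):04d}" for _ in range(n_items)]

def intelligibility(stimuli, responses):
    """Fraction of items the listener recalled exactly."""
    correct = sum(s == r for s, r in zip(stimuli, responses))
    return correct / len(stimuli)

stimuli = make_stimuli()
responses = stimuli[:15] + ["----"] * 5     # made-up responses: last five missed
print(f"intelligibility = {intelligibility(stimuli, responses):.2f}")   # 0.75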

(6)
Analysis of Prosody in Finger Braille Using Electromyography

Manabi Miyagi, Masafumi Nishida, Yasuo Horiuchi, and Akira Ichikawa (Chiba Univ., Japan)

Abstract
Finger braille is a communication method for deafblind people in which an interpreter types braille codes on the fingers of the deafblind person. Finger braille appears to be the medium best suited for real-time communication because of the quickness and accuracy with which it transmits characters. We hypothesize that prosodic information exists in the time structure and strength of finger braille typing. Prosody is the non-linguistic information that conveys sentence structure, prominence, emotions, and other forms of information in real-time communication. In this study, we measured the surface electromyography (sEMG) of finger movement to analyze the strength of finger braille typing. We found that the strength of finger movement increases at the beginning of a phrase and at a prominent phrase. This result shows the possibility that the prosody carried in the strength of finger braille typing can be applied to create an interpreter system for the deafblind.
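For readers unfamiliar with sEMG processing, the following is a generic Python sketch of how typing strength might be quantified from a surface EMG recording: a moving RMS envelope is computed and averaged within each phrase interval. The window length, phrase boundaries, and synthetic example data are our assumptions; the paper's exact pipeline is not given in the abstract.

import numpy as np

def rms_envelope(semg, fs, win_ms=100):
    """Moving RMS of a surface EMG signal, a common proxy for muscle activation."""
    win = max(1, int(fs * win_ms / 1000))
    squared = np.asarray(semg, dtype=float) ** 2
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(squared, kernel, mode="same"))

def phrase_strength(semg, fs, phrase_bounds):
    """Mean RMS amplitude within each (start_s, end_s) phrase interval."""
    env = rms_envelope(semg, fs)
    return [float(env[int(s * fs):int(e * fs)].mean()) for s, e in phrase_bounds]

# Synthetic example: a stronger burst near the first phrase onset.
fs = 1000
semg = 0.1 * np.random.randn(3 * fs)
semg[:200] += 0.5 * np.random.randn(200)
print(phrase_strength(semg, fs, [(0.0, 0.5), (0.5, 1.5), (1.5, 3.0)]))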
