Auditory Displays and Assistive Technologies: the use of head movements by visually impaired individuals and their implementation in binaural interfaces

dc.contributor.author: Feakes, Christopher
dc.date.accessioned: 2019-06-27T07:16:23Z
dc.date.available: 2019-06-27T07:16:23Z
dc.date.issued: 2018-11
dc.description.abstract: Visually impaired people rely upon audition for a variety of purposes, among these the use of sound to identify the position of objects in their surrounding environment. This is not limited to localising sound-emitting objects: by extracting information from reverberation and sound reflections, they can also locate obstacles and environmental boundaries. All of this contributes to effective and safe navigation and, with the advent of binaural auditory virtual reality, can also serve a function in certain assistive technologies. It is known that head movements in the presence of sound elicit changes in the acoustical signals arriving at each ear, and these changes can mitigate common auditory localisation problems in headphone-based auditory virtual reality, such as front-to-back reversals. The goal of the work presented here is to investigate whether visually impaired people naturally engage head movement to facilitate auditory perception, and to what extent this may be applicable to the design of virtual auditory assistive technology. Three novel experiments are presented: a field study of head movement behaviour during navigation, a questionnaire assessing the self-reported use of head movement in auditory perception by visually impaired individuals (each comparing visually impaired and sighted participants), and an acoustical analysis of interaural differences and cross-correlations as a function of head angle and sound source distance. It is found that visually impaired people self-report using head movement for auditory distance perception. This is supported by head movements observed during the field study, whilst the acoustical analysis showed that interaural correlations for sound sources within 5 m of the listener were reduced as head angle or distance to the sound source increased, and that interaural differences and correlations in reflected sound were generally lower than those of direct sound. Subsequently, relevant guidelines for designers of assistive auditory virtual reality are proposed.
dc.identifier.uri: https://www.dora.dmu.ac.uk/handle/2086/18149
dc.language.iso: en
dc.publisher: De Montfort University
dc.publisher.department: Faculty of Computing, Engineering and Media
dc.title: Auditory Displays and Assistive Technologies: the use of head movements by visually impaired individuals and their implementation in binaural interfaces
dc.type: Thesis or dissertation
dc.type.qualificationlevel: Doctoral
dc.type.qualificationname: PhD

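The abstract describes an acoustical analysis of interaural differences and cross-correlations as a function of head angle and source distance. As a rough illustration only, and not the analysis code used in the thesis, the following Python sketch computes a broadband interaural cross-correlation coefficient (IACC) and an interaural level difference (ILD) from a pair of ear signals; the function names, the lag window and the synthetic test signal are all assumptions made for this example.

```python
# Hypothetical sketch (not from the thesis) of interaural cross-correlation and
# level-difference measures of the kind described in the abstract.
import numpy as np

def iacc(left: np.ndarray, right: np.ndarray, fs: int, max_lag_ms: float = 1.0) -> float:
    """Peak of the normalised interaural cross-correlation within +/- max_lag_ms."""
    max_lag = int(fs * max_lag_ms / 1000.0)
    corr = np.correlate(left, right, mode="full")      # full cross-correlation
    zero = len(corr) // 2                              # index of zero lag
    corr = corr[zero - max_lag : zero + max_lag + 1]   # physically plausible lags only
    norm = np.sqrt(np.sum(left ** 2) * np.sum(right ** 2))
    return float(np.max(np.abs(corr)) / norm) if norm > 0 else 0.0

def ild_db(left: np.ndarray, right: np.ndarray) -> float:
    """Interaural level difference in dB (left ear relative to right ear)."""
    rms = lambda x: np.sqrt(np.mean(x ** 2))
    return float(20.0 * np.log10(rms(left) / rms(right)))

# Synthetic example: the right-ear signal is a delayed, attenuated copy of the left.
fs = 48000
t = np.arange(int(0.1 * fs)) / fs                      # 100 ms of signal
left = np.sin(2 * np.pi * 500 * t)
right = 0.8 * np.roll(left, int(0.0005 * fs))          # 0.5 ms delay, ~2 dB quieter
print(f"IACC: {iacc(left, right, fs):.3f}  ILD: {ild_db(left, right):+.2f} dB")
```

In this kind of measure, a lower IACC for a given head angle or source distance indicates that the signals at the two ears have become less similar, which is consistent with the trend reported in the abstract for sources within 5 m of the listener.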
Files

Original bundle
Name: e-thesis submission Chris Feakes.pdf
Size: 10.62 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 4.2 KB
Description: Item-specific license agreed upon to submission