Development and validation of analysis methods for classification of environments: classification by sound, time, and place

Hearing aids are now fully digital, advanced signal processors that monitor the type of sound coming into the hearing aid. The sound type is automatically classified into one of several categories, such as speech in a quiet place, telephone listening, or speech in a noisy car. These sound types may require different signal processing from the hearing aid. Accordingly, the hearing aid can automatically switch to different signal processing when the sound type changes. Current methods for sound type classification do not monitor the user's location. Yet, with increasing smartphone linkage, it may be possible in the future to use location to improve the accuracy of classifying the listener's environment and determining what type of signal processing might be most helpful. In this project, we will collaborate to build a prototype location-informed sound classifier, along the lines of the sketch below.
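One way a location-informed classifier could work is to combine the acoustic classifier's per-class probabilities with a prior derived from the user's location. The following is a minimal, hypothetical sketch of that idea; the class names, probabilities, and the `fuse` function are illustrative assumptions, not part of the project's actual design.

```python
# Hypothetical sketch: fuse acoustic classifier output with a location prior.
# All class names and numbers below are illustrative assumptions.
import numpy as np

CLASSES = ["speech_in_quiet", "telephone", "speech_in_car_noise"]

def fuse(acoustic_probs: np.ndarray, location_prior: np.ndarray) -> np.ndarray:
    """Weight acoustic class probabilities by a location-based prior and renormalise."""
    joint = acoustic_probs * location_prior   # element-wise weighting of each class
    return joint / joint.sum()                # renormalise to a valid distribution

# The acoustic classifier alone is unsure between telephone and car noise...
acoustic = np.array([0.15, 0.45, 0.40])
# ...but the linked smartphone reports the user is on the road, so the
# location prior favours the car-noise class.
driving_prior = np.array([0.20, 0.20, 0.60])

fused = fuse(acoustic, driving_prior)
print(dict(zip(CLASSES, fused.round(3))))    # car-noise class now dominates
```

In this toy example, location information shifts an ambiguous acoustic decision toward the car-noise class, which would in turn select the hearing aid's signal processing for that environment.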

Faculty Supervisor:

Susan Scollie

Student:

Tayyab Shah

Partner:

Unitron Hearing Ltd.

Discipline:

Journalism / Media studies and communication

Sector:

Medical devices

University:

Program:

Accelerate
