adinisom|3 months ago
The traditional solution is an FM system where you give the person speaking a microphone linked to your hearing aids. There are dedicated ones like Phonak Roger. You could probably also use your phone as a microphone if it's Bluetooth-connected to your headphones or hearing aids.
mapt|3 months ago
The tech for isolating a speaker at conversational distances exists. You use half a dozen microphone transducers at minimum (crappy transducers are cheap and quality is expensive, so just use a bunch of them), and through a combination of phase and intensity they decode relative location, amplifying signals that match the expected phase while suppressing everything that doesn't. Sound is slow, and readily susceptible to real-time triangulation. The math/processing is much easier if the parallaxes are fixed (e.g. the microphones are arranged in a line array on the top band of a rigid pair of smart glasses), but with a little latency it's not prohibitive for a deformable array to solve for its own relative positions as well.
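The fixed-array case the comment describes is essentially delay-and-sum beamforming. Here's a minimal sketch of the idea (my own illustration, not anything from a specific product): given known mic positions and a steering point, compensate each mic's propagation delay so the target's wavefronts add coherently while uncorrelated noise averages down. Array geometry, sample rate, and the synthetic "talker" are all assumptions for the demo.

```python
import numpy as np

def delay_and_sum(signals, mic_positions, source_pos, fs, c=343.0):
    """Align each mic signal to the steering point, then average.

    signals: (M, N) array, one row per microphone
    mic_positions, source_pos: 2-D coordinates in metres
    """
    dists = np.linalg.norm(mic_positions - source_pos, axis=1)
    # relative propagation delay of each mic vs the closest one, in samples
    shifts = np.round((dists - dists.min()) / c * fs).astype(int)
    n = signals.shape[1]
    out = np.zeros(n)
    for sig, s in zip(signals, shifts):
        out[: n - s] += sig[s:]   # advance by s samples so wavefronts line up
    return out / len(signals)     # target adds coherently; noise averages down

# Tiny simulation: six cheap mics in a 2 cm line array, one off-axis talker.
fs, c = 16000, 343.0
t = np.arange(fs) / fs
clean = np.sin(2 * np.pi * 440 * t)                   # stand-in for the talker
mics = np.array([[0.02 * i, 0.0] for i in range(6)])  # line array, e.g. a glasses frame
src = np.array([1.0, 0.5])                            # talker ~1 m away, off axis
dists = np.linalg.norm(mics - src, axis=1)
shifts = np.round((dists - dists.min()) / c * fs).astype(int)
rng = np.random.default_rng(0)
# each mic hears a delayed copy of the talker plus its own independent noise
sigs = np.stack([np.roll(clean, s) + 0.5 * rng.standard_normal(fs) for s in shifts])

out = delay_and_sum(sigs, mics, src, fs)
valid = slice(0, fs - shifts.max())        # skip edge samples without full overlap
err_beam = np.std(out[valid] - clean[valid])
err_single = np.std(sigs[-1][valid] - clean[valid])
```

With six mics the residual noise should drop by roughly sqrt(6); real arrays use fractional delays and frequency-domain weights, but the triangulation intuition is the same.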
unknown|3 months ago
[deleted]
adinisom|3 months ago
Often people who lose their hearing want to be able to hear in social situations such as restaurants and family gatherings. In this context, the signal and noise have similar properties and are coming from the same direction. Directionality helps but can only do so much. Noise reduction can make hearing aids more comfortable to wear but doesn't necessarily improve comprehension in challenging situations. Progress here is fantastic -- at the same time it helps to have realistic expectations.
Putting the mic on the person speaking sidesteps the problem -- it's like the rest of the room isn't there.
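A rough back-of-the-envelope for why the remote mic wins so decisively (my free-field sketch, assuming speech falls ~6 dB per doubling of distance while diffuse room babble stays roughly constant -- both simplifications):

```python
import math

def snr_gain_db(far_m, near_m):
    """Approximate SNR improvement from moving the mic closer to the talker,
    under the free-field inverse-distance assumption."""
    return 20 * math.log10(far_m / near_m)

# across-the-table pickup (~2 m) vs a lapel mic on the talker (~15 cm)
gain = snr_gain_db(2.0, 0.15)   # roughly 22-23 dB better SNR
```

Twenty-plus dB is far more than any directional processing delivers in a reverberant restaurant, which is why it feels like the rest of the room isn't there.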
Semaphor|3 months ago