How to measure frequency response of a speaker at home

57 points | philip-b | 1 year ago | crabman.me

57 comments

cluckindan|1 year ago

You need a calibrated measurement microphone to do this for real. Otherwise the microphone’s frequency response will skew the results.

jampekka|1 year ago

You can check whether responses differ without a calibrated microphone.

But regardless here the methodology is very weak, just playing a sine sweep with a spectrum recorder open and eyeballing the frequency magnitudes.
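
Done properly, a swept-sine measurement deconvolves the recording with an inverse sweep to get an impulse response, rather than reading magnitudes off a live spectrum display. A minimal sketch of that idea (the band-pass filter is a stand-in for a speaker you would actually record, and all parameters are illustrative):

```python
import numpy as np
from scipy import signal

fs = 48_000
T = 2.0                       # sweep length in seconds
f1, f2 = 20.0, 20_000.0       # sweep range in Hz

# Exponential (log) sine sweep
t = np.arange(int(T * fs)) / fs
R = np.log(f2 / f1)
sweep = np.sin(2 * np.pi * f1 * T / R * (np.exp(t * R / T) - 1))

# Inverse filter: time-reversed sweep with a decaying envelope that
# compensates the sweep's 1/f energy distribution
inv = sweep[::-1] * np.exp(-t * R / T)

# Stand-in "speaker": a 2nd-order band-pass. In a real measurement you
# would play `sweep` through the speaker and record it with a mic here.
b, a = signal.butter(2, [100 / (fs / 2), 10_000 / (fs / 2)], "bandpass")
recorded = signal.lfilter(b, a, sweep)

# Deconvolution: convolving with the inverse filter collapses the sweep
# into the system's impulse response
ir = signal.fftconvolve(recorded, inv)
peak = int(np.argmax(np.abs(ir)))
seg = ir[peak - 256 : peak + 3840]          # window the impulse response
H = np.fft.rfft(seg)
freqs = np.fft.rfftfreq(seg.size, 1 / fs)
mag_db = 20 * np.log10(np.abs(H) + 1e-12)
```

The recovered curve is only relative (no absolute calibration), but it shows the band-pass shape clearly, which is already more than eyeballing peaks on a spectrogram gives you.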

mikewarot|1 year ago

For actual speakers, you could use a second speaker of the same type as a microphone, assume the losses are similar, do a slow sine sweep through the pair, and just divide all the dB values in half.

Obviously this will only work if there's a passive crossover network, and nothing active in the speakers.

garyfirestorm|1 year ago

Not just that, but the FRF calculation is slightly complex. You need to take into account the windowing function, amplitude correction factor, sampling rate, block size of each measurement, and % overlap… leakage is a very important thing in signal processing and spectral analysis. You also need an anechoic environment to capture this, because otherwise you would also be capturing room reflections and the characteristics of the room's acoustics.
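
The standard way to fold those knobs in is an averaged H1 estimator: cross-spectrum over input auto-spectrum, computed from windowed, overlapping blocks. A sketch using scipy (the filter-plus-noise is a stand-in for a real speaker-and-room recording; the window, block size, and overlap are exactly the parameters the comment mentions):

```python
import numpy as np
from scipy import signal

fs = 48_000
rng = np.random.default_rng(0)
x = rng.standard_normal(fs * 10)              # excitation: 10 s of white noise

# Stand-in "speaker + room": a filter plus measurement noise
b, a = signal.butter(4, [200 / (fs / 2), 8_000 / (fs / 2)], "bandpass")
y = signal.lfilter(b, a, x) + 0.01 * rng.standard_normal(x.size)

# H1 estimator: averaged cross-spectrum / averaged input auto-spectrum,
# with a Hann window and 50% overlap to control leakage
nperseg, noverlap = 4096, 2048
f, Pxy = signal.csd(x, y, fs=fs, window="hann", nperseg=nperseg, noverlap=noverlap)
_, Pxx = signal.welch(x, fs=fs, window="hann", nperseg=nperseg, noverlap=noverlap)
H1 = Pxy / Pxx

# Coherence tells you which bins to trust (near 1 = clean estimate)
_, Cxy = signal.coherence(x, y, fs=fs, nperseg=nperseg, noverlap=noverlap)
```

Checking the estimate against the known filter in the passband, and the coherence against 1, is the usual sanity test before trusting any bin.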

mattclarkdotnet|1 year ago

iPhone mics are consistent enough to be used for most purposes. Apple provides the correction curves and they are used by apps like Audiotools

crazygringo|1 year ago

I've always wondered... is there some way to mathematically "solve" for this with multiple microphones and multiple speakers?

Like with 2 of each, or 3 of each, where you play the same waveform through every possible pair of speaker and microphone, you can solve some kind of system of matrix equations to determine the only possible combination of responsiveness at each device at each frequency?

Or do you just need a reference microphone with known characteristics, period, end of story, because math can't do it?

(Obviously from a practical perspective you want the reference microphone... I'm just curious about in theory.)
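
In theory the answer is no: every speaker→mic measurement factors as (speaker response) × (mic response), so in log magnitude each pair gives log|M_ij| = log|S_i| + log|R_j|. Written as a linear system, that is always rank-deficient: boosting every speaker by k dB while cutting every mic by k dB leaves every measurement unchanged, so only the ratios between speakers (and between mics) are recoverable, never the absolute curves. A quick check of the 2x2 case:

```python
import numpy as np

# Unknowns (per frequency bin, in log magnitude): [s1, s2, r1, r2]
# One row per speaker->mic pair: log|M_ij| = log|S_i| + log|R_j|
A = np.array([
    [1, 0, 1, 0],   # speaker 1 -> mic 1
    [1, 0, 0, 1],   # speaker 1 -> mic 2
    [0, 1, 1, 0],   # speaker 2 -> mic 1
    [0, 1, 0, 1],   # speaker 2 -> mic 2
], dtype=float)

rank = int(np.linalg.matrix_rank(A))   # 3, not 4: one degree of freedom is free

# The unobservable direction: +k dB on every speaker, -k dB on every mic
null = np.array([1.0, 1.0, -1.0, -1.0])
residual = A @ null                    # all zeros: no measurement can see it
```

Adding more speakers and mics never closes the gap, since the same null direction survives. So you need some external anchor: a calibrated reference mic, an acoustic calibrator at a known level, or (the classical metrology answer) a reciprocity calibration using reciprocal transducers.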

dietr1ch|1 year ago

And how do you calibrate a microphone? With a calibrated speaker? :P

I'm guessing there must be a way to cancel out the skews and work around this, but I guess it's just easier to start with a calibrated microphone

niobe|1 year ago

True, but just as important: the speaker cannot be measured independently of the room in a home setup. You need an anechoic chamber, or at least a large and very quiet space. Even a decent-size living room will alter the measurements considerably, and of course materials matter too.

I have spent a lot of time measuring rooms and you can't overstate how much they can mess with frequencies, but it's also true that good speakers sound pretty good in almost any room, and bad ones will sound bad.

radicality|1 year ago

I have a Lyngdorf amplifier at home, and it came with a calibration microphone. You move the mic to different locations while the speakers play test tones; the amp measures how the sound interacts with your environment and creates a listening profile from that.

wl|1 year ago

Looking at those spectrograms, the uncorrected frequency response of the phone's microphone is unlikely to be responsible for most of what you can see.

TacticalCoder|1 year ago

> when Motion+ is connected to my phone via bluetooth, and when it's connected to my laptop via aux-in

Why not, it's always interesting to experiment, but one is lossy (all Bluetooth codecs are lossy AFAIK) and the other is analog, probably on top of an already lossy source too. They “sound” the same the same way a pixelized color print of the Mona Lisa next to the real Mona Lisa looks identical if you're far enough away. You may or may not be able to tell the difference, but lossy Bluetooth and analog aux-in don't sound the same.

I decided to go for a simple setup: a Yamaha fully integrated amp that does it all, including a network streamer. And I stream from Qobuz (lossless streaming) and from my own collection of CDs I ripped to FLAC (lossless and bit-perfect rips, even from an audio CD, verified against an online DB of hashes from other people who ripped the same CDs).

So I know that up to amp, it's all lossless. Then the amp does its magic.

It's simple really: even though I can't tell the difference between a pixelized color print of Mona Lisa and the real thing from far enough, I'd still prefer to know I'm actually looking at the real thing.

A Qobuz (lossless) subscription doesn't cost more than a Spotify one (lossy although they announced they'd move to lossless IIRC).

It's 2024: FLAC files are tiny compared to, say, even just a 1080p movie. Bandwidth is plentiful for streaming lossless.

Why even bother with lossy? Lossy audio is tech from a quarter of a century ago.

Fripplebubby|1 year ago

> Why even bother with lossy? Lossy audio is tech from a quarter of a century ago.

I know I'm rehashing the same argument that has been had around and around for decades, but: because it often doesn't matter. In 2024, a 320Kbps (or whatever) high-quality lossy source over a recent Bluetooth codec into a Chinesium amp + DAC into a mid-range $500 pair of speakers sounds _awesome_, and the time and money spent going above that may be a fun hobby, but it's really not worth the effort for most people.

mandmandam|1 year ago

> I eyeballed the two (red) plots and I think they look more or less identical.

I overlaid those two images [0], and they seem significantly (though not hugely) different to me.

Wouldn't speculate as to why that is though, without checking the consistency of passes with the same setup.

0 - https://imgur.com/a/7GUSmPW (with hue shift for comparison)

philip-b|1 year ago

This is useful, thanks. I should've done it on my own. And checking the consistency of passes would be a good idea too. If you don't mind, I'll update my blog post to include your overlaid image.

lanthade|1 year ago

Oh my. I mean, if this guy is happy with his results then more power to him. However, this write-up is a textbook case of the blind leading the blind. I don’t even know where to begin. The classic smiley-faced EQ, equating a spectrogram with a transfer function, no discussion of phase or any other time-domain considerations. It reminds me of when I was 17 and thought I knew everything about computers. Then I went to college and learned how little it was that I actually knew. The author’s knowledge is me when I was 17. I however own thousands of dollars of measurement mics, audio interfaces, and software, and have experience built on decades of pro audio work using those tools and honing my craft.

It’s not my desire to slam the author, but it’s also important that people don’t take this as correct methodology.

aljgz|1 year ago

`When disagreeing, please reply to the argument instead of calling names. "That is idiotic; 1 + 1 is 2, not 3" can be shortened to "1 + 1 is 2, not 3".`

`Please don't fulminate. Please don't sneer, including at the rest of the community.`

As a curious person with a degree in Electrical Engineering, and one in Computer Science, very interested in quality audio, etc, I know what you mean.

But please, don't be the person who kills curiosity. This person was curious, did some experiments, and posted the result. If you have time, write up some of the major ways such a measurement is inadequate. If not, link to some resources.

`Please don't post shallow dismissals, especially of other people's work. A good critical comment teaches us something.`

softgrow|1 year ago

Years ago at uni, one group chose as their control systems term project to take a known bad speaker (2 inch from transistor radio), measure its response, then build an inverse function to make it perfect using an analog computer. Don't know what the result was but they did have fun.

mitthrowaway2|1 year ago

That will be tough because any attenuation will have to be cancelled by a gain, but the gain will amplify both noise and signal. So the end result might have the right spectral balance but be noisy in the frequency bands where the original signal was weak.
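
A toy frequency-domain version of that argument (all numbers hypothetical): give the "bad speaker" a 20 dB dip below 200 Hz, add a small noise floor after it, then apply the exact inverse filter and compare per-band SNR.

```python
import numpy as np

rng = np.random.default_rng(1)
n, fs = 1 << 16, 48_000
freqs = np.fft.rfftfreq(n, 1 / fs)

# Hypothetical bad speaker: 20 dB down below 200 Hz, flat above
H = np.where(freqs < 200, 0.1, 1.0)

sig = np.fft.rfft(rng.standard_normal(n))           # broadband program material
noise = np.fft.rfft(0.01 * rng.standard_normal(n))  # noise added after the speaker

out = sig * H + noise        # what actually comes out
corrected = out / H          # inverse EQ: magnitude is flat again

def snr_db(s, nse):
    return 10 * np.log10(np.sum(np.abs(s) ** 2) / np.sum(np.abs(nse) ** 2))

band = freqs < 200
snr_flat = snr_db(sig[~band], (noise / H)[~band])   # untouched band
snr_dip = snr_db(sig[band], (noise / H)[band])      # boosted band: ~20 dB worse
```

The spectral balance of `corrected` is right, but the 10x inverse gain in the dip has boosted the noise there by about 20 dB, exactly the trade-off described above.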

fatnoah|1 year ago

Probably even more years ago at uni, my senior project group built an automated analysis system for a large (literally large: club- to stadium-sized) speaker company.

It was a very cool project that spanned multiple disciplines as we built a phased microphone array, a system to tilt and rotate the speaker, and programmable logic and software to generate and analyze signals.

It was proof-of-concept quality, but was later made real and reduced analysis time from dozens of hours to about 45 minutes. Two of the project members were even hired by the company.

zkd43|1 year ago

I just want to point out that it's a pretty big leap to go from "I observed the same frequency response curve with two different inputs" to "there is no audible difference between the two inputs". There are many other measurements you would need to take to prove or disprove the hypothesis, such as signal to noise ratio and dynamic range. And even then you couldn't really prove it definitively due to the complexity of how humans interpret the sound.

When you use Bluetooth, your speaker is functioning as the DAC (digital to analog converter), but when you use the aux, your computer is functioning as the DAC and also amplifying the signal, so it's reasonable to expect them to sound different.

parpfish|1 year ago

are there any established speaker metrics that would get at something like "reconstruction error"? i've seen total harmonic distortion (THD) published in some spec sheets, but i'm not clear if there's a standard method for how that gets calculated or if it's just marketing hype.

moreover, i'm not sure if/how something like THD relates to accuracy of reconstructing more complex naturalistic (i.e., non-sine wave) signals.
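
The basic THD calculation is standard: drive the device with a single sine, take a spectrum, and compare the energy in the harmonics against the fundamental (standards such as IEC 60268-5 pin down the conditions; spec sheets differ mainly in drive level, test frequency and bandwidth, which is where the marketing wiggle room lives). A sketch with a soft-clipping nonlinearity standing in for a driver (the tanh stage and drive level are made up for illustration):

```python
import numpy as np

fs, f0, n = 48_000, 1_000, 48_000          # 1 kHz tone, exactly 1 s: 1 Hz bins
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)

y = np.tanh(2 * x) / np.tanh(2)            # stand-in nonlinear "speaker"

spec = 2 * np.abs(np.fft.rfft(y)) / n      # amplitude spectrum
fund = spec[f0]                            # fundamental sits exactly in bin 1000
harmonics = [spec[k * f0] for k in range(2, 10)]

thd = np.sqrt(sum(h ** 2 for h in harmonics)) / fund   # classic THD ratio
```

On the second question: THD from a single sine indeed says little about complex program material; multitone and intermodulation (IMD) tests are the usual attempts at something closer to a "reconstruction error", but there is no single standard figure for it.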

philip-b|1 year ago

Do different DACs frequently sound different? I would expect that's an audiophiles' fairy tale.

IAmGraydon|1 year ago

>I eyeballed the two (red) plots and I think they look more or less identical. So I guess there is actually no difference in sound and I just imagined it.

LOL. Why is this even on the front page? He measured it with a phone's mic, then he goes on to "eyeball" the results, which are clearly different even to the eye. He then declares them the same. From top to bottom, everything is wrong with this.

plussed_reader|1 year ago

Sounds like they described a sub-par preamp on the wired input, something comparable in quality to BT compression.

You can still tell that from the coloring of a local reference mic.

harrall|1 year ago

I own a Soundcore Motion+ and can attest that it sounds terrible, even for a Bluetooth speaker.

amelius|1 year ago

This assumes your amp and speaker are linear systems.

wl|1 year ago

Linearity is an excellent assumption for virtually any non-overdriven modern amplifier. THD measurements make sense for speakers, but that's not getting in the way of what the author is attempting to do here.

I don't even know where to begin addressing everything that's wrong with this article, but assuming linearity isn't one of those errors.

tetnis|1 year ago

'I tested different EQ presets on "Levitating" by Dua Lipa (Youtube Music, Music video on Youtube). The song is fantastic, and I've been listening to it on repeat these last few days'

Stopped reading

citruscomputing|1 year ago

can't forget

> The music video is fantastic too and so, so beautiful except for the rap part which I don't like.

jareklupinski|1 year ago

personally I use Korn's "Freak on a Leash", lets me quickly know how low the driver can go

sound1|1 year ago

same here :D