cpayne|9 years ago
She says you can ALWAYS tell the difference. But she also says, "better" depends on the music. If you are playing a song on X-Factor or Australian Idol or even Christmas carols, then you'll never notice. If you are playing a solo for a classical piece (wedding or other ceremony), then it certainly sounds "different".
Better is debatable, but different, yes
iheartmemcache|9 years ago
I'd imagine that if 5 of the best sawyers sat down with 5 of the best luthiers and sound engineers in the world, all with the goal of replicating the Strad's sound with 100% accuracy, the 'unique' sound of the Strad could be replicated. It might take a lot of time and maybe a few hundred k in equipment (a dozen condenser mics placed strategically around the room and near the instrument itself, some vibration analysis equipment on the instrument, etc.), but eventually the sawyer would choose the right wood, and the engineer would run the acoustic analysis and tell the luthier, "OK, that 43 micron chisel shave you just took did __; 22 more and we match the F# perfectly on the G string."
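Just to make that concrete, here's roughly the kind of acoustic comparison I have in mind: record both instruments playing the same sustained note and diff their spectra. A minimal sketch in Python (file names are made up, and it assumes mono recordings at the same sample rate):

    import numpy as np
    from scipy.io import wavfile

    def spectrum(path, n=2**18):
        # Load a mono recording, take a fixed-length window, and return its
        # normalized magnitude spectrum so overall loudness drops out.
        rate, samples = wavfile.read(path)
        samples = samples[:n].astype(np.float64)
        mags = np.abs(np.fft.rfft(samples, n=n))
        freqs = np.fft.rfftfreq(n, d=1.0 / rate)
        return freqs, mags / mags.max()

    # Hypothetical recordings of the same note on each instrument.
    freqs, strad = spectrum("strad_g_string.wav")
    _, replica = spectrum("replica_g_string.wav")

    # Per-bin deviation in dB; this is what you'd read back to the luthier.
    db_diff = 20 * np.log10((replica + 1e-12) / (strad + 1e-12))
    worst = np.argmax(np.abs(db_diff))
    print(f"Largest deviation: {db_diff[worst]:+.1f} dB at {freqs[worst]:.0f} Hz")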
I think it's very similar to the whole audiophile thing, where at least some of it is placebo. Those $30 Monster cables aren't transmitting with any more signal fidelity than your $5 ones (assuming equal gauge copper, proper ohmic termination, blah blah), and I could sit down and prove it to anyone with $400 worth of spectrum analyzers from the 1980s.
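The bench version of that proof is basically a null test: play the same test signal through each cable, capture the output, subtract, and see whether the residual sits at the noise floor. A toy sketch (assuming two time-aligned, equal-length mono captures; file names made up):

    import numpy as np
    from scipy.io import wavfile

    # Hypothetical loopback recordings of the same test sweep through each cable.
    rate, cheap = wavfile.read("loopback_5_dollar.wav")
    _, expensive = wavfile.read("loopback_30_dollar.wav")
    cheap = cheap.astype(np.float64)
    expensive = expensive.astype(np.float64)

    # Null test: subtract one capture from the other and compare residual
    # energy to signal energy, in dB.
    residual = expensive - cheap
    null_depth = 10 * np.log10((np.sum(residual**2) + 1e-12) / np.sum(cheap**2))
    print(f"Null depth: {null_depth:.1f} dB")  # at the noise floor => indistinguishable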
Now, "different" vs "better" -- I love my record player, but when audiophiles say it "sounds better" than the raw mixed-down digital masters, in terms of the amount of audible audio content, that objectively isn't the case. (Edit: see [4].) It sounds different from what you're used to. You like the "warmth" that a belt-drive turntable through a 1950s speaker produces. You like the hisses and crackles and pops. It's an emotional connection you have. When I was playing guitar with my crappy band, I'd run my digital content through a Tascam quarter-inch tape deck just to get that hiss in the tracks before we sent them off for mastering (we didn't have those fancy plug-ins to do it for us back then).
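Those plug-ins aren't doing anything magical, either; the core of it is mixing a little filtered noise under the track. A crude stand-in for tape hiss (illustrative numbers, not any real plug-in's algorithm; assumes a 16-bit WAV):

    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import lfilter

    rate, dry = wavfile.read("mixdown.wav")  # hypothetical input track
    dry = dry.astype(np.float64)

    # Tape hiss is roughly white noise with the top end rolled off;
    # a one-pole low-pass over white noise is a crude approximation.
    noise = np.random.randn(*dry.shape)
    hiss = lfilter([0.25], [1.0, -0.75], noise, axis=0)

    # Mix the hiss in at about -50 dB relative to the track's peak.
    level = 10 ** (-50 / 20) * np.max(np.abs(dry))
    wet = dry + level * hiss / np.max(np.abs(hiss))

    wet = np.clip(wet, -32768, 32767)  # keep it in 16-bit range
    wavfile.write("mixdown_with_hiss.wav", rate, wet.astype(np.int16))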
The same thing happened with film.[2] TV is generally shot at 30fps, film at 24fps.[3] When frame rates increased with the advent of more modern technology and higher sampling rates, people complained that cinema no longer looked "cinematic". The motion blur of the analog film experience we grew up with (well, those of us over 25) is something we mentally associate with going to the cinema. You saw a bunch of people comment on this when The Hobbit was distributed and shown in theatres at 48 fps: "It just doesn't look right."
--
[1] http://www.thestrad.com/blind-tested-soloists-unable-to-tell...
[2] I'm sorry I don't have a source on this study, but the phenomenon was pretty widespread. Cinema isn't something I've delved too far into, but the cinematographer forums have (literally) thousands of threads on FPS discussion.
[3] That's just historically how things broke down -- 24 was a convenient number to match up with actual seconds due to its easy factorability, so you could get half a second at 12 frames, a quarter at 6, etc. This made things easy back in the day, when post-processing actually involved cutting nitrate-based film, and syncing sound to the video meant physically matching the snap of the clapper on your audio track to the frame where the hinge visibly closed.
RE: 30 fps, I'm not sure, but I'd guess it had to do with the fact that we're at 60 Hz in the US, so you could interlace half a frame per cycle (i.e., trigger on the peak of every AC cycle to update the set of horizontal lines modulo two, so the electron gun presumably only had to hit half the lines within that time constraint). Other parts of the world operate at 50 Hz, which is why I'd imagine you have the PAL standard and 25 fps in those locales. (Pure speculation though; someone who knows for sure, jump in.)
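The arithmetic behind both guesses is easy to check -- 24's divisors give you the clean fractions of a second, and one interlaced field per AC cycle (two fields per frame) gives you the regional frame rates:

    # Why 24 was convenient for cutting film: lots of clean fractions of a second.
    divisors = [d for d in range(1, 25) if 24 % d == 0]
    print(divisors)  # [1, 2, 3, 4, 6, 8, 12, 24] -> 1/2 s = 12 frames, 1/4 s = 6 ...

    # Interlaced TV: one field per mains cycle, two fields per full frame.
    for region, mains_hz in [("NTSC-land (60 Hz)", 60), ("PAL-land (50 Hz)", 50)]:
        print(f"{region}: {mains_hz} fields/s / 2 = {mains_hz // 2} frames/s")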
[4] Operating under the assumption that they were pressed from the same master source. I'm limited to the records I bought, which were from bands with too small a budget to afford two separate masters. (I.e., that mid-'90s emo 200-run EP by Knapsack didn't have two masters; hell, on tour they were lucky to have a floor to crash on.) Still, the poster beneath me makes a very valid point that should be taken into consideration. (Though I'm sure you're already aware of whether the band had a separate master DAT/master plate made if you like the band enough to care about the nuanced differences between two masters.)
cc439|9 years ago
There is a case to be made that CERTAIN albums will sound objectively better in several respects on vinyl when compared to the CD/digital version of the master. This is an effect of the "Loudness Wars" that began in the late '80s and still continue, albeit at a lower intensity, today.
The analog nature of the vinyl format places a hard limit on how "hot" one can make the sound by compressing the dynamic range. Push the mastering too far and it exceeds the physical ability of the needle to track the groove, which is obviously untenable for everyone involved. CDs, on the other hand, can be pushed as loud as digitally possible, and people who want their new rock album to really ROCK are going to be more impressed by a master that somehow sounds louder than any other album at a given volume setting. Thus the Loudness Wars began, and they have progressed to the point where masters are compressed into the realm of guaranteed clipping, because it still sounds louder and the loss of fidelity won't matter when the song is blasted from an iPhone speaker.
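You can watch the mechanism happen in a few lines: crank the gain and hard-clip the peaks, and the RMS loudness goes up while the crest factor (peak-to-RMS ratio, a rough proxy for dynamic range) collapses. A toy illustration, not any real mastering chain:

    import numpy as np

    def loudness_stats(x):
        # RMS level (dBFS) and crest factor (peak/RMS, in dB).
        rms = np.sqrt(np.mean(x**2))
        peak = np.max(np.abs(x))
        return 20 * np.log10(rms), 20 * np.log10(peak / rms)

    # A toy "song": a quiet tone with occasional louder transients.
    t = np.linspace(0, 1, 44100)
    song = 0.3 * np.sin(2 * np.pi * 220 * t)
    song[::2000] = 0.9  # drum-hit-ish peaks

    # Loudness-war "master": crank the gain, hard-clip anything over full scale.
    hot = np.clip(song * 3.0, -1.0, 1.0)

    for name, x in [("original", song), ("hot master", hot)]:
        rms_db, crest_db = loudness_stats(x)
        print(f"{name}: RMS {rms_db:+.1f} dBFS, crest {crest_db:.1f} dB")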
Obviously, this is an effect limited to specific albums. Even if you buy a pressing of one of these albums, you aren't guaranteed better sound: the vinyl boom has led to some really half-assed remasters, some as lazy as passing the borked digital master through conversion software just to make sure it plays, without any further effort.
Aside from this one outlier scenario, a song/album/whatever mastered attentively for the CD format will always deliver objectively better sound quality.
Source: http://www.soundmattersblog.com/vinyl-vs-cd-in-the-loudness-...
analog31|9 years ago
Indeed, the behavior of violins has been the subject of intensive study for decades. Every new technique for measuring sound or vibration is applied to violins. There's an article every few years about some new secret discovered in the great fiddles. Good acoustic measurement gear is now more sensitive than human hearing. It was only a matter of time before somebody cracked the code.
An amusing rumor is that the tone of a violin changes over time, due to age and playing, and that the Strads are in decline.
cpayne|9 years ago
lol! That is a brave statement. As @dcsommer said, sample size is tiny.
Auditions for an opera (well, her auditions anyway...) were always done blind -- i.e., behind a curtain or similar.
When you practice and play at that level, you notice the little things. Same way I can look at source code and tell whether it's C++, Java, C#, Objective-C, or Node.js: I'd have to stop and think to explain how I know it's C#. I just know.
Again, I didn't say one is better than another. Just different...
geebee|9 years ago
There is, of course, some correlation between more expensive and better; I'm talking more about what happens once you've gotten away from the cheap mass-produced instruments, and even the decent but unremarkable student violins. At the higher levels, instruments get expensive because you need a highly skilled craftsperson to bring out the particular qualities that match what a musician needs. That sort of thing can't really be mass produced, and because it requires a lot of attention and construction time from a skilled person, it's expensive too.
This holds true for a lot of things. For instance, what's actually "better": dinner at the French Laundry, or a damn good burrito? The burrito isn't cheaper because it tastes worse; it's cheaper because it isn't as expensive to produce.
More expensive isn't always "better", but interestingly, just because something isn't quantifiably better doesn't mean you aren't getting anything for your money when you spend more. Sometimes more expensive means high quality, specifically tailored to a particular set of criteria.