My point was that a null test may not be the end-all, be-all test for sound. Bill Whitlock reiterated the tests done by Dean Jensen way back in '88, but he is thinking about revisiting the subject for things like digital audio workstations and software. I hope he looks closely at computer interactions and CPU speed and how these might also change the sounds we hear in an individual application.

mhschmieder wrote:
Waitaminnit, I think I misread the original post, based on BobK's response and then re-reading some things here.
I thought you were referring to Audio Mastering, but I guess you meant CD Mastering? And in that case, DDP is of prime importance, n'est-ce pas?
If Jensen was right that the spectrum of sound isn't fully captured in the measured tone of one device versus another, then possibly other types of test gear can get to the bottom of why my ears (or my highly exaggerated imagination) make things sound different. I can say this, though: I have three pairs of Sennheiser 280 Pro cans that are NOT knockoffs, and they all sound different to me; some sound better than others, and I have a favorite. So it's the analog side we need to investigate too, because when I plug them into my Tascam DA-3000, monitoring the same sample rate as my RME, the sound from its headphone output versus the RME's is different as night and day, yet both AD/DAs did the work of presenting me with the same analog feed. Maybe it's impedance, yes, but both headphone outputs are as close to low output impedance as one would expect.
There are so many reasons sound can change, as mentioned above, including buffer settings, but I'm sure there are better answers for why mastering software could differ sonically.
I'm tech-wise but not all-knowing by any means, and I seek the experience of super techs like Whitlock and others to tell me how they measure and what's going on with a "null". I hope some of my posts from techs help, and thanks for chiming in TO ALL, from whatever perspective you're coming from, even those who ridicule the possibility that there might actually be other measurements needed to suss out the tone of systems, subtle as they may be.
Bill's follow-up after reading one of these posts, with a few interesting observations:
I had no idea folks were doing a digital version of “null testing”.
In the digital domain, of course, proving that what came out is identical to what went in is ridiculously easy – even to find a single non-identical bit in billions. But maybe that's not what's being discussed. The reply does ramble a lot and makes one error: to change the volume of a digital file, you don't just add a number to each one in the music file, you must multiply each number in the file … and then comes round-off or truncation errors, etc.

Manipulating numbers in a digital file, a.k.a. DSP, is just as prone to audible problems as the analog counterpart. The unquestioned benefit of digital is that files can be copied over and over without error (provided, of course, that some sort of on-the-fly concealment or interpolation isn't at work … as it is in CD players, for example, when the redundancy built into the format can't fix read errors).

All that being said, I'm not sure I even know what "mastering software" does these days. However, I do remember from my years at Capitol (1981-1988), when Capitol was evaluating hardware for CD mastering, we found some truly awful sounding sample-rate converters. The digital hardware wasn't the problem … it was the math programming that was riddled with overflow and truncation errors … along the lines of forgetting that when you divide by zero, the result is certain to be beyond full-scale … and what does the software do then?

Philosophically, there's always the problem of folks (especially the young) with a superficial knowledge of technology thinking that certain things are either simple or perfect. After you work in engineering long enough, you learn that virtually everything is a tradeoff or, put another way, there's no free lunch.
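Bill's two points here – that bit-exact comparison of digital files is trivial, and that a volume change is a multiply that introduces round-off/truncation error – can be sketched in a few lines. This is a hypothetical illustration in Python with NumPy (not anyone's actual mastering code), using random 16-bit samples in place of real audio:

```python
# Sketch of a "digital null test" on raw sample arrays: a bit-exact copy
# nulls perfectly, but a truncating gain change does not null even after
# the gain is mathematically undone.
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 16-bit audio: one second of noise at 44.1 kHz.
original = rng.integers(-32768, 32767, size=44100, dtype=np.int16)

# 1. A straight digital copy nulls perfectly: every sample difference is zero.
copy = original.copy()
assert np.array_equal(original, copy)

# 2. Halve the level (multiply/divide, as Whitlock notes -- not add), then
#    double it again. Truncating integer division discards the least-
#    significant bit of every odd sample, so the round trip no longer
#    nulls against the original.
halved = (original.astype(np.int32) // 2).astype(np.int16)   # truncating gain
restored = (halved.astype(np.int32) * 2).astype(np.int16)
residual = original.astype(np.int32) - restored.astype(np.int32)
print("non-null samples after gain round trip:", np.count_nonzero(residual))
```

The residual here is only the least-significant bit, which is the point: the error from a single naive gain stage is subtle, and real DSP chains stack many such operations, which is where dithering and higher-precision intermediate math come in.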