CD audio is not good enough, CD Standard is bad quality
May 1 2003, 04:29
Group: Members (Donating)
Joined: 7-April 03
From: Newark, CA
Member No.: 5871
Sound quality is a complex subject, and one that has been thrashed
out elsewhere time after time. Nonetheless I'll give it a bit of a
spin here, without getting all technical, in order to justify why I
think CD audio is not good enough.
When I complain about CD sound I am not doing so as some sort of
retrograde vinyl lover who can't change with the times. :-) I am
simply saying that the sound of the CD I am listening to has audible
problems and does not match what I expect the creators wanted. This
can be measured by how closely the CD replicates the master. Of
course in most cases I don't get to hear the master, so much is
guesswork. However, as a trained audio engineer I have *some* idea of
what is expected and can certainly compare CDs to the masters I have
worked with.
CDs improved on vinyl in many ways, notably in reduced noise floor,
phase artifacts, and crosstalk; ease of handling; and accurate
handling of low frequency stereo information. But CDs are inferior to
vinyl in frequency response and degradation characteristics. If a CD
gets a scratch you hear unlistenable white noise; if a record gets a
scratch you hear a DJ. ;-)
Going further, some may "prefer" the sound of vinyl precisely because
of the distortions it introduces. These include rounded signal peaks
and second-order harmonics, as well as the aforementioned phase
issues. All of these introduce a "warm" sound that is palatable to
many. Whether I like that sound or not, I prefer to hear what the
artist intended. If they wanted harmonics they could have used a
tube. And so on.
A bit more is in order about error correction. Most errors are
corrected by CD players, but this can produce tiny glitches of noise
that most people do not notice. I notice them. It's not that I have
better ears; once I point them out you can hear them as well. Of
course, the better the music reproduction system the more noticeable
these are. (Though contrary to this, the better the CD player error
correction, the less you'll hear.) For most people with crappy
stereos it's not an issue.
I do not think that there is anything inherently wrong with digital
sound encoding, only that the 44.1 kHz sampling rate and 16 bits per
sample are not good enough. Currently, studios use 96 kHz and 24 (or
32) bits throughout the recording chain, and must reduce this
down to consumer standards for replication. There's probably a good
reason why those people most highly trained in listening don't think
CD quality is good enough for recording. It's simply because their
ears tell them so.
You may be interested to know that the current CD standard was a
matter of much compromise between the American, European, and
Japanese manufacturers. I can remember reading some of the research
articles at the time (I was in university). The Japanese insisted
that 100 kHz and 24-bit (if memory serves on the exact numbers) were
required for accurate reproduction. But the others argued that no-one
would hear the difference and it would reduce cost and time to market
if the lower standard was adopted. And so, unfortunately, it was.
Another big problem with many CDs is the terrible job of mastering.
Back in vinyl days you really had to know what you were doing to
adjust the master tape to the deficiencies of the medium. There were
relatively few mastering engineers, but they knew their job. Today
almost anyone thinks they can master, and so they do... badly.
So the problems with CD can be summarised as: insufficient frequency
response, insufficient resolution, poor mastering, nasty error
characteristics, and cases that break all the time. ;-)
MP3s inherit all of these except the bit about the cases.
May 1 2003, 05:18
Joined: 16-June 02
Member No.: 2323
I have to agree completely with your statement about poor mastering engineers in the field. The sad part is that a lot of the time, these mastering 'flaws' are completely intentional. I have heard that Billy Corgan said, regarding his recent Zwan album "Mary, Star of the Sea", that he wanted it to be the loudest rock album ever. In the process, he has introduced unnecessary amounts of clipping to the CD. There are also very many independent artists who will master an album with untrained ears.
However, I disagree with the majority of the other points you have raised here. I believe that 44,100 Hz, 16-bit audio is more than adequate to represent music, even with very high-end equipment. Each aspect (frequency and resolution) can be deemed 'sufficient' by using these benchmarks of human hearing, and the nature of the recording studio.
Generally speaking, humans can perceive frequencies up to around 20 kHz. However, such high frequencies are drowned out by others... thus, in music, many people can only distinguish frequencies below 16 kHz, and perhaps the best ears can hear up to 18 kHz in such a 'noisy' environment. The Nyquist-Shannon sampling theorem tells us that to accurately represent a frequency f, we must sample at a rate greater than 2f; in practice a small margin is left above 2f for the anti-aliasing filter. Thus 44.1 kHz was born as a standard, as it can represent frequencies up to 22.05 kHz.
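To make that limit concrete, here is a small Python sketch (the frequencies are chosen arbitrarily for illustration): a tone above half the sample rate produces, sample for sample, the same magnitudes as a lower-frequency alias, so once sampled it is indistinguishable from it.

```python
import math

fs = 44100              # CD sample rate in Hz
f_high = 30000          # tone above fs/2 = 22050 Hz; it cannot be represented
f_alias = fs - f_high   # 14100 Hz: the frequency it folds down to

def sample_tone(f, fs, n):
    """Return n samples of a unit-amplitude sine at frequency f."""
    return [math.sin(2 * math.pi * f * i / fs) for i in range(n)]

high = sample_tone(f_high, fs, 8)
alias = sample_tone(f_alias, fs, 8)

# Up to a sign flip, the 30 kHz tone and its 14.1 kHz alias are identical:
# no player can tell which one was recorded.
for x, y in zip(high, alias):
    assert abs(abs(x) - abs(y)) < 1e-9
```

This is exactly why an anti-aliasing filter must remove everything above 22.05 kHz before sampling, and why the rate was set slightly above twice the nominal 20 kHz hearing limit.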
I have also heard that 16-bit resolution has a noise floor lower than that of a silent, empty recording studio. If this is true (sorry, I don't have hard evidence) then it suggests that 16-bit is more than enough.
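A quick back-of-envelope check on that claim: linear PCM gives roughly 6.02 dB of dynamic range per bit, so 16-bit works out to about 96 dB between full scale and the quantization floor. A sketch:

```python
import math

def dynamic_range_db(bits):
    """Theoretical dynamic range of linear PCM at the given bit depth."""
    return 20 * math.log10(2 ** bits)

print(round(dynamic_range_db(16), 1))   # 96.3 dB for CD audio
print(round(dynamic_range_db(24), 1))   # 144.5 dB for studio masters
```

96 dB is a larger span than the gap between a quiet room and the threshold of pain, which is the intuition behind the noise-floor claim above.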
So, why do recording engineers use higher frequency, higher resolution digital audio than the rest of us peons? Can they hear better than us? Although they probably have better trained ears, I must say that this is not the reason. Let me walk you through a scenario.
Start with 12 sources, each 16/44.1 ... let's say a singer, backup vocals, guitar, bass, synth, and various microphones set up around a drum set. Layer these sources atop one another to create a complete recording of a song. During the production phase, it is decided that we should raise the volume of the bass guitar recording in order to give a more 'rich and funky' sound.
What is the result? Noise, noise, distortion, and more noise. 16-bit is a great FINAL resolution for listening to audio, but the noise quickly adds up when you have to layer many tracks and mix them. 24/96 is used in the recording studio as a buffer... the noise floor of 12 tracks recorded at 24 bits is still lower than that of one track at 16 bits, so it still sounds good. Downsampling is safe, as long as the original sound itself is not of greater resolution than the result (which it shouldn't be, according to the reasoning above).
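The mixing argument can be simulated in a few lines of Python. The signals here are random low-level values standing in for real tracks (a toy model, not a real mixing chain), but the accumulation effect is the same: twelve tracks quantized to 16 bits carry more total rounding error than the same twelve quantized to 24 bits.

```python
import math
import random

random.seed(0)

def quantize(x, bits):
    """Round a sample in [-1, 1] to the nearest level of a signed grid."""
    levels = 2 ** (bits - 1)
    return round(x * levels) / levels

n_tracks, n_samples = 12, 1000
tracks = [[random.uniform(-0.05, 0.05) for _ in range(n_samples)]
          for _ in range(n_tracks)]

def mix_error(bits):
    """RMS error of a mix whose tracks were each quantized to `bits` bits."""
    total = 0.0
    for i in range(n_samples):
        exact = sum(t[i] for t in tracks)
        quantized = sum(quantize(t[i], bits) for t in tracks)
        total += (exact - quantized) ** 2
    return math.sqrt(total / n_samples)

# Extra headroom during production keeps the summed mix cleaner.
assert mix_error(24) < mix_error(16)
```

This is the "buffer" role described above: keep precision high while layering and gain-riding, and only drop to 16 bits once, at the very end.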
OK, another question... what's with all the SACD and DVD-Audio albums being released these days? I have heard many times that these media sound better than CD audio. So where's our theory now?
It has been argued many times on this forum that recording artists, such as our friend Billy Corgan, want to record louder and louder albums. What does this accomplish? Our songs are louder on the radio than the competition. Listeners tend to judge the louder of two sources as sounding better, even though the two are identical once you match volumes. Albums released today are so loud that they are clipping CD audio... reaching the limit of 16-bit. A 24-bit medium provides a greater dynamic range, and thus louder recordings. Eventually, the record companies would have exhausted these formats too... but all this is UNNECESSARY!
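Clipping itself is easy to picture in code. This sketch (arbitrary sample values) shows why gain pushed past full scale is destructive: the flattened peaks stay flat even if the listener turns the volume back down afterwards.

```python
def clip(x):
    """Hard-limit a sample to digital full scale, [-1.0, 1.0]."""
    return max(-1.0, min(1.0, x))

samples = [0.2, 0.6, 0.9, -0.8]
boosted = [clip(2.0 * s) for s in samples]
print(boosted)   # [0.4, 1.0, 1.0, -1.0]: three peaks squashed flat
```

A wider medium like 24-bit only moves the ceiling; it doesn't remove the incentive to slam the mix against it.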
And to address your thoughts on dirt and scratches on various media... if an LP is dusty, you will hear noticeable static noise coming out of your speakers. If the records are old and warped, they will sound wobbly, and will likely skip -- as will a scratched LP. Any little imperfection in the record is instantly audible. CDs are obviously digital recordings... to the CD player, that means "no matter what I read, it's supposed to be either a 1 or a 0"... introduce reading noise into a digital wave, and you still get a digital wave -- the player can still recover the original sound through all the noise, because it knows what a valid reading is supposed to look like. Granted, a heavily scratched CD will skip, or even refuse to play... but you can usually have very visible marks on your CD and still have it play perfectly. Also, imperfect error correction in CD players won't introduce 'noise' into the result... at worst, it will read a 0 instead of a 1, which will result in a tiny click confined to a single sample, about 1/44100 of a second. Your brain cannot perceive this.
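For a sense of scale, the arithmetic behind that last point (assuming the standard 44.1 kHz rate): a single sample is far shorter than anything the ear can resolve as a discrete event.

```python
fs = 44100                      # CD sample rate in Hz
sample_us = 1_000_000 / fs      # duration of one sample in microseconds
print(round(sample_us, 1))      # 22.7 microseconds
```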