CD audio is not good enough, CD Standard is bad quality
Joseph
post May 1 2003, 04:29
Post #1





Group: Members (Donating)
Posts: 108
Joined: 7-April 03
From: Newark, CA
Member No.: 5871



Sound quality is a complex subject, and one that has been thrashed
out elsewhere time after time. Nonetheless I'll give it a bit of a
spin here, without getting all technical, in order to justify why I
think CD audio is not good enough.

When I complain about CD sound I am not doing so as some sort of
retrograde vinyl lover who can't change with the times. :-) I am
simply saying that the sound of the CD I am listening to has audible
problems and does not match what I expect the creators wanted. This
can be measured by how closely the CD replicates the master. Of
course in most cases I don't get to hear the master, so much is
guesswork. However, as a trained audio engineer I have *some* idea of
what is expected and can certainly compare CDs to the masters I
produce.

CDs improved on vinyl in many ways, notably in reduced noise floor,
phase artifacts, and crosstalk; ease of handling; and accurate
handling of low frequency stereo information. But CDs are inferior to
vinyl in frequency response and degradation characteristics. If a CD
gets a scratch you hear unlistenable white noise; if a record gets a
scratch you hear a DJ. ;-)

Going further, some may "prefer" the sound of vinyl precisely because
of the distortions it introduces. These include rounded signal peaks
and second-order harmonics, as well as the aforementioned phase
issues. All of these introduce a "warm" sound that is palatable to
many. Whether I like that sound or not, I prefer to hear what the
artist intended. If they wanted harmonics they could have used a
tube. And so on.

A bit more is in order about error correction. Most errors are
corrected by CD players, but this can produce tiny glitches of noise
that most people do not notice. I notice them. It's not that I have
better ears; once I point them out you can hear them as well. Of
course, the better the music reproduction system, the more noticeable
these are. (Though contrary to this, the better the CD player error
correction, the less you'll hear.) For most people with crappy
stereos it's not an issue.

I do not think that there is anything inherently wrong with digital
sound encoding, only that the 44.1KHz sampling rate and 16 bits per
sample are not good enough. Currently, studios use 96KHz and 24 (or
32) bits throughout the recording chain process, and must reduce this
down to consumer standards for replication. There's probably a good
reason why those people most highly trained in listening don't think
CD quality is good enough for recording. It's simply because their
ears tell them so.

You may be interested to know that the current CD standard was a
matter of much compromise between the American, European, and
Japanese manufacturers. I can remember reading some of the research
articles at the time (I was in university). The Japanese insisted
that 100KHz and 24-bit (if memory serves on the exact numbers) were
required for accurate reproduction. But the others argued that no-one
would hear the difference and it would reduce cost and time to market
if the lower standard was adopted. And so, unfortunately, it was.

Another big problem with many CDs is the terrible job of mastering.
Back in vinyl days you really had to know what you were doing to
adjust the master tape to the deficiencies of the medium. There were
relatively few mastering engineers, but they knew their job. Today
almost anyone thinks they can master, and so they do... badly.

So the problems with CD can be summarised as: insufficient frequency
response, insufficient resolution, poor mastering, nasty error
characteristics, and cases that break all the time. ;-)

MP3s inherit all of these except the bit about the cases.

-- robin
bryant
post May 1 2003, 04:48
Post #2


WavPack Developer


Group: Developer (Donating)
Posts: 1297
Joined: 3-January 02
From: San Francisco CA
Member No.: 900



You're absolutely right!! Those CD cases are terrible! And who came up with those original double CD cases? You open them and both CDs immediately fall out and hit the floor!
boojum
post May 1 2003, 04:57
Post #3





Group: Members (Donating)
Posts: 819
Joined: 8-November 02
From: Astoria, OR
Member No.: 3727



Joseph, in general I agree with your post. But there have been some exceptions. MFSL had some very nice CDs; Telarc, too. There are others. Could it be that there is not enough market demand for good quality sound? MFSL went under. I suspect this could be part of the problem, just as it was with LPs. Not all LPs sounded great, but some were just magnificent. The Mercury "Living Presence" recordings of the Minneapolis Symphony, made with a single mic (later dual mics -- Telefunken, I believe) for stereo, and some other labels were quite nice.

Until there is market demand there will be few CDs with good technical quality. Kind of like digital TV, which has had its quality reduced to allow narrower bandwidth/more stations, I believe.


--------------------
Nov schmoz kapop.
mrosscook
post May 1 2003, 05:16
Post #4





Group: Members
Posts: 82
Joined: 14-December 02
From: Amherst MA
Member No.: 4077



Joseph,

Your post argues that a lot of current CDs are poorly mastered, and I think most people would agree with that; but you combine that with complaints about the current CD standard not being good enough, and much of that is just silly.

To take the most egregious point: can you give one good reason why a consumer CD should have a 100 kHz sampling rate, when almost nobody can hear a frequency much above 22 kHz, if that? Nyquist says you need 44, and people might argue 48 is better; but in a production CD, 100 kHz would be a sinful waste of bits.
paranoos
post May 1 2003, 05:18
Post #5





Group: Members
Posts: 101
Joined: 16-June 02
From: Toronto
Member No.: 2323



I have to agree completely with your statement about poor mastering engineers in the field. The sad part is that a lot of the time, these mastering 'flaws' are completely intentional. I have heard that Billy Corgan said, regarding his recent Zwan album "Mary, Star of the Sea", that he wanted it to be the loudest rock album ever. In the process, he has introduced unnecessary amounts of clipping to the CD. There are also very many independent artists who will master an album with untrained ears.

However, I disagree with the majority of the other points you have raised here. I believe that 44,100 Hz, 16 bit audio is more than adequate to represent music, even with very high-end equipment. Each aspect (frequency and resolution) can be deemed 'sufficient' by using these benchmarks of human hearing, and the nature of the recording studio.

Generally speaking, humans can perceive frequencies up to around 20kHz. However, such high frequencies are drowned out by others... thus, in music, many people can only distinguish frequencies below 16kHz, and perhaps the best ears can hear up to 18kHz in such a 'noisy' environment. Sampling theory (Nyquist, if I remember right) tells us that to accurately represent a frequency f, we must sample at more than twice that frequency... thus, 44.1kHz was born as a standard, as it can represent frequencies up to 22.05kHz.
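As a quick sanity check of the sampling-rate claim, here's a toy pure-Python sketch of my own (the 18kHz tone, 44.1kHz rate, and 0.1s length are arbitrary choices, not anything official): sample a tone near the top of the audible band and recover its amplitude from a single DFT bin.

```python
import math
import cmath

FS = 44100   # CD sampling rate (Hz)
F = 18000    # test tone near the top of the audible range (Hz)
N = 4410     # 0.1 s of samples; bin spacing = FS/N = 10 Hz, so F lands exactly on a bin

# Sample the tone at the CD rate
x = [math.sin(2 * math.pi * F * n / FS) for n in range(N)]

# Single-bin DFT at F: magnitude * 2/N recovers the tone's amplitude
k = F * N // FS
X = sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
amplitude = 2 * abs(X) / N
print(round(amplitude, 6))   # 1.0 -- the 18 kHz tone survives 44.1 kHz sampling intact
```

Nothing above 22.05kHz can be represented, of course, but within the band the samples capture the tone exactly.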

I have also heard that 16bit resolution has a noise floor lower than that of a silent, empty recording studio. If this is true (sorry, I don't have hard evidence) then it is proof that 16bit is more than enough.

So, why do recording engineers use higher frequency, higher resolution digital audio than the rest of us peons? Can they hear better than us? Although they probably have better trained ears, I must say that this is not the reason. Let me walk you through a scenario.

Start with 12 sources, each 16/44.1 ... let's say a singer, backup vocals, guitar, bass, synth, and various microphones set up around a drum set. Layer these sources atop one another to create a complete recording of a song. During the production phase, it is decided that we should raise the volume of the bass guitar recording in order to give a more 'rich and funky' sound.

What is the result? Noise, noise, distortion, and more noise. 16bit is a great FINAL resolution for listening to audio, but the noise quickly adds up when you have to layer many tracks and mix them. 24/96 is used in the recording studio as a buffer... the noise floor of 12 tracks recorded in 24bits is still lower than 1 track in 16 bits, so it still sounds good. Downsampling is safe, as long as the original sound itself is not of greater resolution than the result (which it shouldn't be, according to our proofs above).
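My buffer argument can be simulated with a little pure Python (my own toy model: a rounding quantizer over a full-scale +/-1.0 range and random program material -- real converters and real music differ): quantize each track, mix, and measure how much quantization noise the mix has picked up.

```python
import math
import random

random.seed(1)

def quantize(x, bits):
    """Round to a B-bit grid spanning [-1.0, 1.0]."""
    q = 2.0 ** (1 - bits)   # quantizer step size
    return round(x / q) * q

def noise_rms(tracks, bits, n=20000):
    """RMS quantization error after quantizing each track, then mixing."""
    err_sq = 0.0
    for _ in range(n):
        samples = [random.uniform(-0.5, 0.5) for _ in range(tracks)]
        mixed_ideal = sum(samples)
        mixed_quant = sum(quantize(s, bits) for s in samples)
        err_sq += (mixed_quant - mixed_ideal) ** 2
    return math.sqrt(err_sq / n)

def db(x):
    return 20 * math.log10(x)

one_16 = noise_rms(1, 16)
twelve_24 = noise_rms(12, 24)
print(f"1 track   @ 16-bit: {db(one_16):7.1f} dBFS noise")
print(f"12 tracks @ 24-bit: {db(twelve_24):7.1f} dBFS noise")
# The 24-bit mix's quantization noise stays roughly 37 dB below
# a single 16-bit track's, which is the whole point of the buffer.
```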

OK, another question... what's with all the SACD and DVD-Audio albums being released these days? I have heard many times that these media sound better than CD audio. So where's our theory now?

It has been argued many times on this forum that recording artists, such as our friend Billy Corgan, want to record louder and louder albums. What does this accomplish? Our songs are louder on the radio than the competition. It is a common misconception that the louder source sounds better than the quieter source, even though they are identical once you match volumes. Albums released today are so loud that they are clipping CD audio... reaching the limit of 16bit. A 24bit medium provides a greater dynamic range, and thus louder recordings. Eventually, the record companies would exhaust these formats too... but all this is UNNECESSARY!

And to address your thoughts on dirt and scratches on various media... if an LP is dusty, you will find noticeable static noise coming out of your speakers. If the records are old and warped, they will sound wobbly, and will likely skip -- as will a scratched LP. Any little imperfection in the record is instantly audible. CDs are obviously digital recordings ... to the CD player, that means "no matter what I read, it's supposed to be either a 1 or a 0" ... introduce reading noise into a digital wave, and you still get a digital wave -- the player can still read the original sound through all the noise, because it can assume what is supposed to be read. Granted, a heavily scratched CD will skip, or even refuse to play... but you can usually have very visible marks on your CD and still have them play perfectly. Also, imperfect error correction in CD players won't introduce 'noise' into the result ... at worst, it will read a 0 instead of a 1, which will result in a tiny pop that lasts about 1/(16 x 44100) of a second. Your brain cannot perceive this.
floyd
post May 1 2003, 05:43
Post #6





Group: Members
Posts: 630
Joined: 18-June 02
Member No.: 2332



QUOTE (Joseph @ Apr 30 2003 - 09:29 PM)
Currently, studios use 96KHz and 24 (or
32) bits throughout the recording chain process, and must reduce this
down to consumer standards for replication. There's probably a good
reason why those people most highly trained in listening don't think
CD quality is good enough for recording. It's simply because their
ears tell them so.

Recording studios use high sample rates and bit depths to avoid problems when mastering. This doesn't prove that CDs are poor quality in any way.
Delirium
post May 1 2003, 06:14
Post #7





Group: Members
Posts: 300
Joined: 3-January 02
From: Santa Cruz, CA
Member No.: 891



I'm pretty sure you're, frankly, incorrect about the 44.1/16-bit problem. I agree completely about poor mastering, but that's a completely unrelated issue.

To convince me (and probably many others) that 44.1/16-bit is insufficient, you should record some audio in 96 kHz/24-bit, downsample to 44.1 kHz/16-bit with a high-quality downsampling algorithm, and be able to ABX the two versions. I'm pretty sure that's, if not impossible, at least very unlikely.
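For anyone wanting to run such a test, the scoring side is just binomial statistics. A minimal sketch (my own helper, `abx_p_value` is a name I made up, not from any ABX tool): the chance of doing at least as well as the listener did by pure guessing.

```python
from math import comb

def abx_p_value(correct, trials):
    """Probability of scoring at least `correct` out of `trials`
    by random guessing (p = 0.5 per trial, one-sided binomial tail)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# A listener who gets 12 of 16 trials right:
p = abx_p_value(12, 16)
print(f"p = {p:.4f}")   # p = 0.0384 -- below the usual 0.05 threshold
```

A score that guessing would match 4% of the time is decent evidence the listener really hears a difference; 9 of 16 (p = 0.40) is not.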
sjk
post May 1 2003, 06:34
Post #8





Group: Members
Posts: 1
Joined: 30-April 03
From: Kailua, HI
Member No.: 6298



QUOTE (paranoos @ Apr 30 2003 - 08:18 PM)
I have heard that Billy Corgan said, regarding his recent Zwan album "Mary, Star of the Sea", that he wanted it to be the loudest rock album ever. In the process, he has introduced unnecessary amounts of clipping to the CD.

That reminds me of the Rip Rowan: Over the Limit article I read a few weeks ago.
AstralStorm
post May 1 2003, 07:32
Post #9





Group: Members
Posts: 745
Joined: 22-April 03
From: /dev/null
Member No.: 6130



QUOTE (Delirium @ May 1 2003 - 07:14 AM)
To convince me (and probably many others) that 44.1/16-bit is insufficient, you should record some audio in 96 kHz/24-bit, downsample to 44.1 kHz/16-bit with a high-quality downsampling algorithm, and be able to ABX the two versions.  I'm pretty sure that's, if not impossible, at least very unlikely.

I'd rather say he should upsample the 44100Hz/16bit version to 96000Hz/24bit,
so as not to lose any additional quality.

QUOTE (Joseph)
Going further, some may "prefer" the sound of vinyl precisely because
of the distortions it introduces. These include rounded signal peaks
and second-order harmonics, as well as the aforementioned phase
issues. All of these introduce a "warm" sound that is palatable to
many. Whether I like that sound or not, I prefer to hear what the
artist intended. If they wanted harmonics they could have used a
tube. And so on.

If the master is designed for LP, it will sound best on LP.
If the master is for CD, it will sound better on CD. That's all.

QUOTE (Joseph)
If a CD
gets a scratch you hear unlistenable white noise; if a record gets a
scratch you hear a DJ. ;-)
:snip:
A bit more is in order about error correction. Most errors are
corrected by CD players, but this can produce tiny glitches of noise
that most people do not notice. I notice them. It's not that I have
better ears; once I point them out you can hear them as well. Of
course, the better the music reproduction system the more noticable
these are. (Though contrary to this, the better the CD player error
correction, the less you'll hear.) For most people with crappy
stereos it's not an issue.

You must have a very poor CD player, because all modern ones
do very good error correction (C2) and interpolate up to ~8 samples when they can't read the audio properly.
That's quite hard to detect unless the CD is very, very badly scratched.
At that point an LP wouldn't be readable either.
Additionally, LPs gradually lose high frequencies with play - use the Search function and/or FAQ.
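What that interpolation looks like can be sketched in a few lines of Python (a crude toy of mine -- real players use more sophisticated concealment, and the 1kHz tone and 8-sample gap are arbitrary): bridge an unreadable run linearly between its good neighbours.

```python
import math

def conceal(samples, start, length):
    """Replace an unreadable run with a linear bridge between its
    neighbours -- a crude stand-in for a player's interpolation."""
    out = list(samples)
    left, right = out[start - 1], out[start + length]
    for i in range(length):
        t = (i + 1) / (length + 1)
        out[start + i] = left + t * (right - left)
    return out

# A 1 kHz sine at 44.1 kHz with 8 consecutive samples "scratched out"
fs, f = 44100, 1000
x = [math.sin(2 * math.pi * f * n / fs) for n in range(200)]
y = conceal(x, 100, 8)
worst = max(abs(a - b) for a, b in zip(x, y))
print(f"worst-case concealment error: {worst:.3f} of full scale")
# The bridge is imperfect, but it lasts only 8/44100 s (~0.2 ms);
# an occasional burst like this is a very different thing from
# thousands of errors in a row.
```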


--------------------
ruxvilti'a
2Bdecided
post May 1 2003, 10:51
Post #10


ReplayGain developer


Group: Developer
Posts: 5364
Joined: 5-November 01
From: Yorkshire, UK
Member No.: 409



Whilst I'm a fan of higher sample rates (in theory - I don't use them at home though! and I'm not going to start the discussion again), I don't buy your arguments about error correction.

Whilst it's easy to find poor condition (or poorly pressed) CDs with errors, and it's easy to find poor players whose transports are not working well and read errors from good discs; it's also easy to find discs and players which work together without finding a single uncorrectable error over the whole disc.

Before my time there, a project was run at my old university where hardware was set up to find these "uncorrectable errors" - i.e. the ones that result in the signal being approximated. They found that they were exceptionally rare on undamaged discs. It was not unusual to find entire discs with no such errors.

Likewise, during the mp3 decoder tests I performed, I did a lot of 1x digital copying - using a CD player to play the CD I'd just burnt, and a bit-perfect soundcard to record the result via SP-DIF. These were burnt CD-Rs, not pressed CDs, so more problems might be expected. The CD player was an old Sony (reportedly the first CD player with a digital output - it had been imported from a 110V country because a European model didn't appear until later). I admit that any 1-bit (LSB) errors would have been missed in this test - but any error correction with an accuracy of worse than 1 part in 32768 would have been picked up. It didn't happen at all during the tests.

There are some mysteries of certain CDs and certain CD players not "liking each other", and I couldn't comment on how many players or discs constantly operate near the point of failure. In these circumstances, you could well be right! But for many discs, and many players, I doubt that the "reconstructive" error correction kicks in very often at all.

Cheers,
David.
dev0
post May 1 2003, 11:01
Post #11





Group: Developer
Posts: 1679
Joined: 23-December 01
From: Germany
Member No.: 731



I agree with Paranoos here completely. If mastering is done in a halfway decent way, 44.1kHz/16bit/stereo is enough for the end user.
About the DVD issue: I would prefer a longer CD-quality DVD over this SACD shit any day.
dev0


--------------------
"To understand me, you'll have to swallow a world." Or maybe your words.
Uosdwis R. Dewoh
post May 1 2003, 12:44
Post #12





Group: Members
Posts: 148
Joined: 29-September 01
Member No.: 70



The CD standard often takes an unfair amount of criticism that in my opinion should be aimed at the manufacturers of D/A converters and CD players. The format is indeed "old", but even after over two decades it's only very rarely properly implemented. Playback equipment is generally very poorly designed (recording equipment too, for that matter) and issues like jitter have hardly been dealt with at all, until very recently. There's an awful lot of "bad digital" out there, and that's not going to change just by upping the sampling frequency or increasing the number of bits. That's just marketing. (For an idea of what "good digital" can be, cough up $800 and get yourself a Benchmark DAC1.) Mastering, as several of you brought up, is of course also a key issue. Bad mastering will make things sound like crap regardless of format. My stance is that any shortcomings of the CD standard remain the least of the problems anyone involved is facing.

I find it very typical of our times that the Red Book format is now being deemed obsolete and insufficient, before it's even fully realized. Why do it right when you can do it twice and earn billions and billions on selling both hardware and software? The music industry obviously thinks this is a great idea; it's an excellent opportunity to sell us music they've already earned billions on, for the fourth or fifth time. Soon there'll be nothing but remastered old recordings out there.

Also, I think it's best in these discussions to separate the production side of audio from the playback/distribution side. It's common for people involved in making recordings to apply their experiences to the end user, but a lot of it doesn't carry over. Recording in higher resolution may be a very good idea considering you're stacking and processing up to perhaps a hundred tracks, but that doesn't necessarily mean that Joe Consumer (or even Steve Audiophile) needs it for playback.

uosdwis
DonP
post May 1 2003, 14:32
Post #13





Group: Members (Donating)
Posts: 1477
Joined: 11-February 03
From: Vermont
Member No.: 4955



I believe the 16/44 rates are the numbers that fell out of the other constraints:

1) bit density they could get on the disc at the time
2) a car player has to fit in a DIN-size dashboard hole, which limits the disc diameter
3) number of minutes of play time (something about a favorite symphony of the Sony CEO)

The Nyquist sampling theorem tells you what is mathematically possible,
but doesn't say it's easy to get close to it.

Sony was first out with a player. Philips, their partner in the CD standard, came
out later with 4x oversampling, which sounded much better even with only
a 14-bit DAC because it didn't need as sharp a low-pass filter.

Raising the sample frequency means 1) you don't have to strain to get close to
the Nyquist limit, and 2) a couple more kHz on the frequency range would be nice.
Joseph
post May 1 2003, 16:32
Post #14





Group: Members (Donating)
Posts: 108
Joined: 7-April 03
From: Newark, CA
Member No.: 5871



paranoos wrote:

> I have to agree completely with your statement about poor mastering
> engineers in the field. The sad part is that a lot of the time, these
> mastering 'flaws' are completely intentional.

Indeed, extremely "loud" music is all the rage. Since the ear
typically responds to average levels, not peak levels, when judging
loudness, all the mastering engineer has to do is raise the average
level as high as possible in the available dynamic range. To do so
inevitably "introduces unnecessary amounts of clipping" as you say.
Thankfully this rarely happens to any music worth hearing in the
first place -- LOL!

> I believe that 44,100 Hz, 16 bit audio is more than adequate to
> represent music, even with very high-end equipment. Each aspect
> (frequency and resolution) can be deemed 'sufficient' by using these
> benchmarks of human hearing, and the nature of the recording studio.

Untrue, as I will show. This is not a matter of belief, but science.

> Generally speaking, humans can perceive frequencies around 20kHz.

The Nyquist numbers are pure theory and do not take into account
implementation. For example, Nyquist requires a perfect low-pass
filter for the digital-to-analogue conversion. Well, such a thing
does not exist! Real filters are not perfect, but rather introduce
frequency, aliasing, and phase anomalies. Though techniques like
oversampling can help, a sampling rate of 96KHz is the perfect
solution, as it puts all of these distortions above the range of
human hearing.

> I have also heard that 16bit resolution has a noise floor lower than
> that of a silent, empty recording studio. If this is true (sorry, I
> don't have hard evidence) then it is proof that 16bit is more than
> enough.

It is not true. The range of sounds from absolutely quiet (achieved
only in an anechoic chamber) to hearing damage is 150dB. A more
reasonable range to reproduce (from a recording studio to a loud
concert) is 130dB. 16-bit recordings reproduce a dynamic range of
only 96dB, whereas 24-bit recordings reproduce 120dB. 16-bit looks
rather limiting, doesn't it?

Here's another view: in music we want to listen to (not the
overcompressed crap) the peaks are much higher than the average
volume. In order to provide room for these peaks, most of the musical
information must be restricted to about half of the available bits.
It follows that to avoid compromising the signal, we need a lot more
dynamic range than 16-bit provides.

> So, why do recording engineers use higher frequency, higher
> resolution digital audio than the rest of us peons? Can they hear
> better than us?

They are trained to hear better than us, yes.

> During the production phase, it is decided that we should raise the
> volume of the bass guitar recording in order to give a more 'rich and
> funky' sound.
>
> What is the result? Noise, noise, distortion, and more noise.

I do not follow. As long as you have sufficient dynamic range to work
in, why should raising the volume result in distortion?

BTW, a good funk sound is not achieved by raising volume, but rather
by adding a goodly amount of fast-attack compression, and a nice EQ
emphasis for the finger pop to get a nice string resonance. Or at
least that's how I'd do it. ;-) (It's also got a lot to do with the
playing, of course.)

> 24/96 is used in the recording studio as a buffer... the noise floor
> of 12 tracks recorded in 24bits is still lower than 1 track in 16
> bits, so it still sounds good.

Perhaps what you are trying to say is that summing 2 signals requires
that one either drop the levels by 3dB to achieve the same perceived
loudness, or have 3dB of dynamic range available for the increased
signal level. So certainly the more bits the better during mixing.
And yes, this is more critical than in the final mix. But it hardly
follows that fewer bits are OK for listening.

> I have heard that it is a common misconception that the louder
> source sounds better than the quiet source, even though they are
> identical once you match volumes.

They are not identical once you match perceived loudness, which I
think is what you are saying. The compressed signals will have very
little information above the mean, whereas the uncompressed signal
will have lots. Plus, the compressed signal will have all sorts of
clipping distortion. Just about anyone will actually prefer the
uncompressed signal, as it will sound "livelier" and more dynamic.
People like dynamics -- it provides interest and doesn't tire the
ears.

But if you match the *peaks* instead, then the compressed signal will
sound louder. And we naturally like the louder of two signals, all
other factors being equal. The louder song will jump out of the radio
and sell more copies. This is the raison d'etre of maximum
compression.
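The peak-matching effect can be seen in a toy "loudness war" model (my own sketch, a bare hard clipper standing in for real limiting, and a plain sine standing in for program material): push gain into the clipper so peaks stay pinned at full scale while the average level climbs.

```python
import math

def rms_db(x):
    """Average level in dB relative to full scale."""
    return 20 * math.log10(math.sqrt(sum(s * s for s in x) / len(x)))

def smash(x, gain_db):
    """'Loudness war' mastering: apply gain, then hard-clip to +/-1.0."""
    g = 10 ** (gain_db / 20)
    return [max(-1.0, min(1.0, g * s)) for s in x]

# A full-scale 1 kHz sine as stand-in program material
x = [math.sin(2 * math.pi * 1000 * n / 44100) for n in range(44100)]

for gain in (0, 3, 6):
    y = smash(x, gain)
    peak = max(abs(s) for s in y)
    print(f"gain {gain} dB -> peak {peak:.2f}, RMS {rms_db(y):5.2f} dBFS")
# Peaks stay at 1.0 while the average level climbs -- the extra
# "loudness" is bought entirely with clipping distortion.
```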

The argument for extreme compression is a simplistic one that assumes
dumb listeners, an assumption I for one am not willing to make.

> Albums released today are so loud that they are clipping CD audio...
> reaching the limit of 16bit. A 24bit medium provides a greater
> dynamic range, and thus louder recordings.

This is by no means why there is a move to higher bit depth, since an
extremely compressed song at any bit depth will sound the same. All
the lower bits are essentially unused no matter how many of them
there are.

> And to address your thoughts on dirt and scratches on various
> media... if an LP is dusty, you will find noticeable static noise
> coming out of your speakers. If the records are old and warped, they
> will sound wobbly, and will likely skip -- as will a scratched LP.
> Any little imperfection in the record is instantly audible.

Any decent turntable can play any decently maintained vinyl record
with an almost complete lack of background noise. Heck, even my
mid-range Linn does a fine job. Traditional comments like these about bad
vinyl quality come from people who have never heard a decent hi-fi in
the first place.

> CDs are obviously digital recordings ... to the CD player, that means
> "no matter what I read, it's supposed to be either a 1 or a 0" ...
> introduce reading noise into a digital wave, and you still get a
> digital wave -- the player can still read the original sound through
> all the noise, because it can assume what is supposed to be read.

Wow, this is so wrong. If I am trying to read 10010011101001 and
there is a scratch and I get 10000000000000 then how exactly am I
supposed to recover the original?

True, Red Book audio adds redundancy to the signal (and players
interpolate over what they can't correct), but with a big enough
scratch all is lost.
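The real scheme is CIRC (cross-interleaved Reed-Solomon), which is too much for a forum post, but the core idea -- redundancy lets a player rebuild data it *knows* is damaged -- can be shown with a single XOR parity word (a toy analogy of mine, not the actual CIRC code): any one erased word in a block comes back exactly.

```python
from functools import reduce

def add_parity(block):
    """Append one parity word: the XOR of all data words."""
    return block + [reduce(lambda a, b: a ^ b, block)]

def recover(coded, erased_index):
    """Rebuild a single word whose position is known to be bad --
    loosely analogous to a player using erasure flags."""
    survivors = [w for i, w in enumerate(coded) if i != erased_index]
    return reduce(lambda a, b: a ^ b, survivors)

data = [0b10010011, 0b10100100, 0b01110001, 0b11001010]
coded = add_parity(data)

# "Scratch out" the third word and rebuild it from the survivors
assert recover(coded, 2) == data[2]
print("recovered:", bin(recover(coded, 2)))
```

Parity alone can't fix a word it doesn't know is bad, and it fails when two words in a block are lost; Reed-Solomon plus interleaving is what lets a real player survive a millimetre-scale gap in the data.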

> Granted, a heavily scratched CD will skip, or even refuse to play...
> but you can usually have very visible marks on your CD and still have
> them play perfectly.

As an aside, you can also have invisible "marks" and have it play
badly.

> Also, imperfect error correction in CD players won't introduce
> 'noise' into the result

Um, yes it will. It is clearly audible.

> ... at worst, it will read a 0 instead of a 1, which will result in a
> tiny pop that lasts about 1/(16 x 44100) of a second. Your brain
> cannot perceive this.

Not even if there are thousands of them in a row?

There are other problems with CD reproduction as well, like jitter
and single-bit distortion. The great thing about 96/24 recording is
that many of the challenges of 44/16 go away. There is no need for
dither, brick-wall filters, etc. Reproduction equipment can actually
be simpler and yet achieve sonic excellence.
DigitalMan
post May 1 2003, 18:06
Post #15





Group: Members
Posts: 488
Joined: 27-March 02
From: California, USA
Member No.: 1631



QUOTE (Joseph @ May 1 2003 - 07:32 AM)
paranoos wrote:
> Generally speaking, humans can perceive frequencies around 20kHz.

The Nyquist numbers are pure theory and do not take into account implementation. For example, Nyquist requires a perfect low-pass filter for the digital-to-analogue conversion. Well, such a thing does not exist! Real filters are not perfect, but rather introduce
frequency, aliasing, and phase anomalies. Though techniques like oversampling can help, a sampling rate of 96KHz is the perfect solution, as it puts all of these distortions above the range of human hearing.

> I have also heard that 16bit resolution has a noise floor lower than that of a silent, empty recording studio. If this is true (sorry, I don't have hard evidence) then it is proof that 16bit is more than enough.

It is not true. The range of sounds from absolutely quiet (achieved only in an anechoic chamber) to hearing damage is 150dB. A more reasonable range to reproduce (from a recording studio to a loud
concert) is 130dB. 16-bit recordings reproduce a dynamic range of only 96dB, whereas 24-bit recordings reproduce 120dB. 16-bit looks rather limiting, doesn't it?

Here's another view: in music we want to listen to (not the overcompressed crap) the peaks are much higher than the average volume. In order to provide room for these peaks, most of the musical information must be restricted to about half of the available bits.
It follows that to avoid compromising the signal, we need a lot more dynamic range than 16-bit provides.

Any decent turntable can play any decently maintained vinyl record with an almost complete lack of background noise. Heck, even my mid-range Linn does a fine job. Traditional comments like these about bad vinyl quality come from people who have never heard a decent hi-fi in the first place.

> CDs are obviously digital recordings ... to the
> CD player, that means "no matter what I read, it's supposed to be either a 1 or a 0" ... introduce reading noise into a digital wave, and you still get a
> digital wave -- the player can still read the original sound through all the noise, because it can assume what is supposed to be read.

Wow, this is so wrong. If I am trying to read 10010011101001 and
there is a scratch and I get 10000000000000 then how exactly am I
supposed to recover the original?

True, Red Book audio uses interpolation to add redundancy to the
signal, but a big enough scratch and all is lost.

> Also, imperfect error correction in CD players won't introduce
> 'noise' into the result

Um, yes it will. It is clearly audible.

> ... at worst, it will read a 0 instead of a 1, which will result in a tiny pop that lasts about 1/(16 x 44100) of a second. Your brain cannot
> perceive this.

Not even if there are thousands of them in a row?

There are other problems with CD reproduction as well, like jitter and single-bit distortion. The great thing about 96/24 recording is that many of the challenges of 44/16 go away. There is no need for dither, brick-wall filters, etc. Reproduction equipment can actually be simpler and yet achieve sonic excellence.

Joseph,
I need to add some facts to help clear up a lot of misunderstanding:

1) Nyquist does not require a perfect LPF. I have personally tested old, inexpensive (Technics) non-oversampling CD players with steep analog LPFs that can reproduce perfect 20kHz sinewaves with <0.01% distortion and reasonable phase shift. 96kHz is not necessary to reproduce high frequencies up to 20kHz accurately.

2) 24-bit dynamic range is 144dB, not 120dB (dynamic range is 6dB x number of bits). Of course you will not find A/D or D/A converter systems with more than 115dB dynamic range, so nobody has ever really recorded a true 24-bit signal that I am aware of. Are there microphone amplifiers with >144dB dynamic range? I suspect not.

In addition, proper mastering of a CD should include noise shaping which can increase the effective midrange dynamic range of a CD by 8 to 12dB (or more), giving an effective dynamic range of over 110dB from Red Book CD. Not only that, but a properly noise shaped and dithered digital signal can reproduce audible signals below the noise floor (because the dither noise is uncorrelated with the signal, a tone can still be perceived beneath it), effectively increasing the resolution slightly further.
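The below-the-noise-floor claim sounds like magic but is easy to demonstrate. A rough Python sketch (illustrative only, with made-up numbers): a sine of 0.4 LSB amplitude vanishes under plain rounding, yet survives on average once triangular (TPDF) dither is added.

```python
import math, random

random.seed(1)

def quantize(x):
    """Round to nearest integer -- one integer step = one LSB."""
    return math.floor(x + 0.5)

def tpdf_dither():
    """Triangular-PDF dither spanning about +/-1 LSB."""
    return random.random() - random.random()

# A sine wave only 0.4 LSB in amplitude -- below one quantization step.
signal = [0.4 * math.sin(2 * math.pi * k / 32) for k in range(32)]

# Without dither, rounding kills the tone entirely.
undithered = [quantize(s) for s in signal]

# With dither, averaging many quantized copies recovers the waveform.
dithered_avg = [
    sum(quantize(s + tpdf_dither()) for _ in range(2000)) / 2000
    for s in signal
]

print(max(undithered))              # 0: the tone vanished
print(round(max(dithered_avg), 2))  # roughly 0.4: the tone survives
```

The ear does something similar to the averaging step, which is why dithered sub-LSB detail is audible rather than simply lost.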

Assuming a well mastered CD with a dynamic range of 110dB, most recording chains and certainly most consumer playback gear could not reproduce this signal accurately enough to warrant even greater bit depth. For those that can reproduce it, a clean 110dB dynamic range should sound quite good without the need to transition to a completely different audio format.

16 bit does not look very limited for consumer distribution, does it?

3) LP playback with an almost complete lack of background noise? Yes, I've heard extremely high end vinyl reproduction and it sounds very nice, but complete lack of background noise it is not. Even extremely good phono gear can not typically achieve more than 80dB s/n ratio, and good gear in the 70dB range is not uncommon. All phono gear has a pretty steep high pass filter to remove strong low frequency resonance of the tonearm / cartridge system and turntable bearing rumble which could damage your equipment at worst and dramatically increase distortion at best. LPs are relatively noisy compared to good 16 bit digital audio.

4) There are several layers of error correction in the CD format, and uncorrectable errors are actually not very common (see posts above on the topic, they are accurate). Yes, a scratch can result in unrecoverable errors just like it does on an LP. But a reasonably well maintained CD should have no unrecoverable errors. The CD format uses eight-to-fourteen modulation and Reed-Solomon error correction, and then falls back on interpolation and finally muting as last resorts. The system is very robust, and if you commonly encounter audible errors then I suggest you replace damaged CDs or invest in a good quality CD player. You can purchase calibrated test CDs that measure the error correction capability of a CD player. My Philips CD player can fully correct (no errors, data fully recovered, no noise introduced) a 1.2mm scratch - that should be a rare level of damage, and anything less is a non-issue. In practice my experience is that error correction is not an issue for CD, so I find your claims to be unusual and suspect.
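The reason a scratch (a burst error) is survivable is interleaving: the burst gets scattered across many codewords, so each codeword sees only an error or two, which its Reed-Solomon check can fix. A toy Python illustration with a 4x6 block (the real CD uses cross-interleaved Reed-Solomon, which is far more elaborate):

```python
# Toy demo of interleaving: a 4-symbol burst on "disc" becomes one error
# per codeword after de-interleaving -- easily correctable.

ROWS, COLS = 4, 6
data = list(range(ROWS * COLS))               # 24 "symbols", one row = one codeword

# Interleave: write row-by-row, read column-by-column.
interleaved = [data[r * COLS + c] for c in range(COLS) for r in range(ROWS)]

# A scratch wipes out 4 consecutive symbols on disc.
damaged = interleaved[:]
for i in range(8, 12):
    damaged[i] = None

# De-interleave back into the original order.
deinterleaved = [None] * len(data)
for pos, value in enumerate(damaged):
    c, r = divmod(pos, ROWS)
    deinterleaved[r * COLS + c] = value

# Count errors per codeword: the burst is now spread out.
per_row = [sum(v is None for v in deinterleaved[r * COLS:(r + 1) * COLS])
           for r in range(ROWS)]
print(per_row)  # [1, 1, 1, 1] -- one error per codeword
```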

5) Jitter is a time-based error: samples converted to the correct amplitude at slightly the wrong time. While jitter during recording can not be prevented, it can be eliminated during playback with a good buffer or PLL system. Jitter has been shown to be a non-issue (creating distortion well below the 16 bit dynamic range) in well designed playback systems. As a matter of fact, timing errors become more significant as the sampling rate and bit depth increase, so a 96kHz 24 bit system requires jitter orders of magnitude below that required for a 44/16 system to avoid compromising the 24 bit dynamic range. And designing equipment for 144dB dynamic range is not simpler than designing for 110dB. I'm not sure what you mean by single-bit distortion, but if that refers to errors at the LSB level, then see item #4.
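For the curious, the worst-case arithmetic is simple: the steepest slope of a full-scale sine at frequency f is 2*pi*f, so a clock error dt produces an amplitude error of roughly 2*pi*f*dt of full scale. A back-of-the-envelope Python check of the jitter needed to stay below 1 LSB at 20kHz:

```python
import math

# Keep jitter-induced error under 1 LSB of an N-bit system at frequency f:
#   2*pi*f*dt < 2**-N   =>   dt < 1 / (2*pi*f*2**N)
def max_jitter_seconds(f_hz, bits):
    return 1.0 / (2 * math.pi * f_hz * 2 ** bits)

j16 = max_jitter_seconds(20_000, 16)   # ~121 ps
j24 = max_jitter_seconds(20_000, 24)   # ~0.47 ps -- 256x tighter
print(f"16-bit: {j16 * 1e12:.0f} ps, 24-bit: {j24 * 1e12:.2f} ps")
```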

Bottom line - 96/24 may be a good tool for the studio but I do not see any evidence that it is beneficial to the consumer.


--------------------
Was that a 1 or a 0?
KikeG
post May 1 2003, 20:28
Post #16


WinABX developer


Group: Developer
Posts: 1578
Joined: 1-October 01
Member No.: 137



Well, all these issues have been discussed here for some time, and most of them pretty thoroughly.

I strongly believe 44.1 KHz 16-bit audio is more than enough for transparent reproduction under real-world listening conditions.

About read errors in CDs: as 2Bdecided says, this is a non-issue as long as the CD you are playing is not badly scratched. CDs have redundant information recorded, so physical errors can be recovered 100% with no data lost at all, most of the time. Only when there are too many errors, or the error is too big, can the data not be recovered, and it has to be interpolated to reduce the audibility of the remaining error. Even then this interpolation, if not constant, is I believe quite benign, or in other words, inaudible.

About using 96 KHz in all studios, well, I'd say this is not generalized: some use 96 KHz and some still use 44.1 or 48 KHz. I'd also say that 24 bits is widespread, but just useful for mixing and such, not as a release format.

About 44.1 KHz sampling rate not being enough today or filter limitations, any decent 44.1 KHz DAC is free of aliasing problems, frequency response problems, phase problems, ripple problems, etc, up to around 21 KHz or more. How many people can hear up to 21 KHz?

Also, this is the first time I hear that a sampling rate of 100 KHz was proposed for the CD. I don't think that happened, but I could be wrong. What I do know is that 14 bits were going to be used, because they were considered enough, but in the end 16 bits were chosen for convenience, being exactly 2 bytes.

About dynamic range of CD audio... Well, using flat dither, it is about 94 dB. Using noise shaped dither, it can be the equivalent of around 110 dB. Back here in the real world, even 94 dB is more than enough. If you take into account the ambient noise in a quiet listening room (around 30 dB) and realistic listening levels (110 dB peak at most?), 94 dB is more than enough to cover dynamic range requirements. Not to mention that you will hardly find any recording that makes full use of those 94 dB of dynamic range.
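The arithmetic, spelled out (using the rough figures above, not measurements):

```python
# Where does 16-bit quantization noise land at a realistic playback level?

peak_spl = 110        # dB SPL, very loud playback peaks
cd_range = 94         # dB, 16-bit with flat dither
room_noise = 30       # dB SPL, quiet listening room

noise_floor_spl = peak_spl - cd_range
print(noise_floor_spl)                  # 16 dB SPL
print(noise_floor_spl < room_noise)     # True: buried under the room itself
```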

About mastering engineers saying that 96 KHz sounds better, I have read about some experienced ones saying that it sounds the same to their ears. Anyway, ever heard of expectation effects or placebo effect? I won't believe that 96 KHz sampled music sounds any different than 44.1 KHz sampled music to anyone until I see a rigorously performed blind listening test that proves it, up to this moment I have not seen any.

Edit: jitter is a non-issue in good (and most average) players, there are plenty of measurements that support this. Take a look at http://www.pcavtech.com

Edit: if today's cds are badly mastered, it is not a fault of the format.

This post has been edited by KikeG: May 2 2003, 00:10
Pio2001
post May 1 2003, 21:22
Post #17


Moderator


Group: Super Moderator
Posts: 3936
Joined: 29-September 01
Member No.: 73



Joseph wrote :
QUOTE
CDs are inferior to
vinyl in frequency response

I don't agree. That can be true of several records, and I know that some cartridges have a frequency response up to 40 kHz, but you can see the compared spectra of a CD versus an LP of the same album, at the beginning and end of a 33 rpm side, here: http://www.hydrogenaudio.org/forums/index....2896#entry28616
At the beginning all is right, the vinyl even has a bit more high treble than the CD, but near the end the LP shows a 9 dB loss at 12 kHz compared to the CD! The LP was bought brand new, and only played on a Rega Planar 3 with a Denon DL-110 cartridge, except for the recording of these samples.

Joseph wrote
QUOTE
Most errors are
corrected by CD players, but this can produce tiny glitches of noise
that most people do not notice.

This has been discussed above. I just want to add that if such a thing occurs (if one single error occurs on the audio data), the CD is considered outside Red Book specifications, and should in theory be discarded at the output of the manufacturing process.

Paranoos wrote
QUOTE
recording artists, such as our friend Billy Corgan, want to record louder and louder albums. What does this accomplish? Our songs are louder on the radio than the competition.

Radio stations have strong dynamics compressors in order to ensure that all CDs play equally loud, however mastered they are. Recording louder accomplishes nothing!

2bdecided wrote
QUOTE
I admit that any 1-bit (LSB) errors would have been missed in this test

Why? If your input is bit-exact, any LSB error must be detected.

DonP wrote
QUOTE
a couple more kHz on the frequency range would be nice.

Actually, the 44100 Hz frequency was chosen because it allowed storing exactly 3 samples of digital audio per line on a monochrome video recorder: http://www.hydrogenaudio.org/forums/index....4949#entry50336
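The arithmetic works out for both TV standards (the line counts here are the commonly cited usable-lines-per-field figures for the PCM adaptors):

```python
# 44100 Hz falls out of storing 3 samples per usable video line:

ntsc = 3 * 245 * 60   # 3 samples x 245 usable lines/field x 60 fields/s
pal  = 3 * 294 * 50   # 3 samples x 294 usable lines/field x 50 fields/s
print(ntsc, pal)      # 44100 44100
```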

KikeG wrote
QUOTE
I won't believe that 96 KHz sampled music sounds any different than 44.1 KHz sampled music to anyone until I see a rigorously performed blind listening test that proves it, up to this moment I have not seen any.


What's wrong with this one? http://www.hydrogenaudio.org/forums/index....t=ST&f=1&t=6150
The 44100 Hz 16 bit recording "was picked out very clearly from the majority".
And among the 48 kHz 24 bit ones, there was one that "could not be distinguished from original source".
buzzy
post May 1 2003, 22:33
Post #18





Group: Members
Posts: 203
Joined: 28-July 02
Member No.: 2836



One of the great things about HA is that discussion is usually accurate and on-topic ...

A bigger-picture question: can anyone name an audiophile format that survived in the marketplace? In the dustbin of history are MFSL, quadraphonic sound ... etc.

I don't see how DVD-A and SACD avoid the same fate, especially with two formats splitting the market. How often are people going to have the right multi-channel setup around them to take advantage of the format? How often will an album be mastered properly for it, as opposed to using it as a gimmick to sell yet another copy of Who's Next ...?

And then there's the really big picture questions: the music on the discs, and the competition for dollars from video (including music video).

This post has been edited by buzzy: May 1 2003, 22:33
Uosdwis R. Dewoh
post May 1 2003, 22:41
Post #19





Group: Members
Posts: 148
Joined: 29-September 01
Member No.: 70



QUOTE (Joseph @ May 1 2003 - 04:32 PM)
> Albums released today are so loud that they are clipping CD
> audio... reaching the limit of 16bit. A 24bit medium provides a
> greater dynamic range, and thus louder recordings.

This is by no means why there is a move to higher bit depth, since an extremely compressed song at any bit depth will sound the same. All the lower bits are essentially unused no matter how many of them there are.

This paragraph tells me you need to read up on the theory quite a bit (pun intended!), and you may even be contradicting yourself. Digital full scale (0 dBFS) is absolute; it's not relative, and you can't move it around by adding bits. Increasing bit depth only adds bits downwards, not the other way around. So you are adding nothing but lower bits (17-24 in this case). So, no, using 24 bits does not make for louder recordings; the "ceiling" is not raised.

Depending on how I interpret you, you may also be incorrect: the lower bits are practically always used. In fact, modern "stupid loud" records pretty much use the lowest 14 or 15 bits all the time. But, if you meant that they are not used "dynamically", that's correct, since music material with quiet parts extending more than 50 dB down from peaks is very rare. Hence, you have effectively provided a very strong argument against going further than 16 bits: we don't need more dynamic range! Perhaps not what you intended, but true nevertheless.
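A two-line Python illustration of the "bits are added downwards" point: widening a full-scale 16 bit sample to 24 bits just pads zero LSBs below it, and it is still at (essentially) full scale, no louder than before.

```python
# Going from 16 to 24 bits adds resolution *below* the old LSB;
# full scale stays full scale, so nothing gets "louder".

full_scale_16 = 2**15 - 1            # 32767, loudest positive 16-bit sample
as_24bit = full_scale_16 << 8        # same value in 24-bit: pad 8 zero LSBs

print(full_scale_16 / (2**15 - 1))   # 1.0  -> 0 dBFS in 16-bit
print(as_24bit / (2**23 - 1))        # ~1.0 -> still ~0 dBFS in 24-bit
```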

Thanks,

uosdwis

Edit: clarity

This post has been edited by Uosdwis R. Dewoh: May 1 2003, 23:39
KikeG
post May 2 2003, 00:21
Post #20


WinABX developer


Group: Developer
Posts: 1578
Joined: 1-October 01
Member No.: 137



QUOTE (Pio2001 @ May 1 2003 - 09:22 PM)
What's wrong in this one ? http://www.hydrogenaudio.org/forums/index....t=ST&f=1&t=6150
The 44100 Hz 16 bit recording "was picked out very clearly from the majority".
And among the 48 kHz 24 bit ones, there was one that "could not be distinguished from original source".

From what I remember when I translated that article, they don't say whether the tests were blind, or what listening procedure they used. Also, it seems that the 44.1/16 source was not transparent because it added a distinctive, slight but audible, background noise. That suggests that either they were listening at insane levels, or the 44.1/16 device they used had noise problems. Some technical measurements of the devices used would have been useful to complete the test.

This post has been edited by KikeG: May 2 2003, 00:22
Pio2001
post May 2 2003, 01:12
Post #21


Moderator


Group: Super Moderator
Posts: 3936
Joined: 29-September 01
Member No.: 73



About audible clicks, I must say that many recordings have clicks in them, and they seem easier to hear on high end tweeters/headphones.
I often have to check whether they are actually recorded on the CD, by playing the CD in the hifi player instead of the MPC file on the computer. They are: the clicks don't come from CD read errors.

Graduel D'Aliénor de Bretagne - Ensemble Organum, Marcel Pérès, is an example of a CD with a lot of clicks everywhere in the recording.
DonP
post May 2 2003, 01:17
Post #22





Group: Members (Donating)
Posts: 1477
Joined: 11-February 03
From: Vermont
Member No.: 4955



QUOTE (Pio2001 @ May 1 2003 - 03:22 PM)
Actually, the 44100 Hz frequency was chosen because it allows to store exactly 3 samples of digital audio per line on a monochrome video recorder :

Like the Sony PCM adaptors. The radio station at the college I went to had one in those pre-CD times.
Did they also need to keep the bits stored per line at a simple (and redundant enough) submultiple of the line resolution? That is, could they have gone to 4 samples/line = 58800 samples/second?
Joseph
post May 2 2003, 06:54
Post #23





Group: Members (Donating)
Posts: 108
Joined: 7-April 03
From: Newark, CA
Member No.: 5871



DigitalMan wrote

> 1) Nyquist does not require a perfect LPF.

The Nyquist theorem states that you must digitally sample at a rate
at least twice the highest original frequency you wish to reproduce.
But to merely sample at 2x requires a perfect low-pass filter, since
any signals over this frequency will result in aliasing -- they must
be removed. Since such perfect filters do not exist, one needs to
sample higher still. How high depends on how good a filter you can
build. In the real world all suffer from phase and other distortions.

Nyquist called for a *minimum* of 2x the frequency. Many papers have
shown that the bare minimum is not sufficient. Some are even
available on the web, if you choose to go looking.
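Aliasing is easy to demonstrate numerically: sampled at 44.1 kHz, a 25 kHz tone yields exactly the same samples as a phase-inverted 19.1 kHz tone, which is why everything above half the sampling rate must be filtered out before the converter ever sees it. A small Python check:

```python
import math

# At fs = 44100, a tone at f > fs/2 aliases to fs - f.  So 25 kHz and
# 19.1 kHz (phase-inverted) are indistinguishable after sampling.
fs = 44100
n = range(64)
tone_25k  = [math.sin(2 * math.pi * 25000 * k / fs) for k in n]
tone_19k1 = [-math.sin(2 * math.pi * 19100 * k / fs) for k in n]

max_diff = max(abs(a - b) for a, b in zip(tone_25k, tone_19k1))
print(max_diff < 1e-9)   # True: the sample streams are identical
```

The practical dispute is only over how steep the filter removing such content must be, not whether it is needed.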

But never mind theory. In practice there are many people who can hear
the deficiencies caused by low sampling rates, usually in terms of
loss of clarity in the high end.

> 2) 24 bit dynamic range is 144dB, not 120dB (dynamic range is 6dB x
> # of bits).

Yes, my bad. I understated my claim. ;-)

> Of course you will not find A/D or D/A converter systems with more
> than 115dB dynamic range, so nobody has ever really recorded a true
> 24 bit signal that I am aware of. Are there microphone amplifiers
> with >144dB dynamic range? I suspect not.

I suspect not too. But it's nice to remove one source of loss or
distortion from the signal chain, because certainly signals have been
recorded and manipulated that exceed 16-bit. I'd rather overkill than
underkill (if there is such a word).

> In addition, proper mastering of a CD should include noise shaping
> which can increase the effective midrange dynamic range of a CD by
> 8 to 12dB (or more), giving an effective dynamic range of over 110dB
> from Redbook CD.

Sure, there are many ways to get the most out of CD. I am familiar
with POW-R dithering, etc. I'd rather a format that does not require
all of this finagling but rather delivers bit-for-bit accurate
signals.

> For those that can reproduce it, a clean 110dB dynamic range should
> sound quite good without the need to transition to a completely
> different audio format.

Well, there's "quite good" and then there's "indistinguishable from
the master". Personally, I'd go for the latter.

> Yes, I've heard extremely high end vinyl reproduction and it sounds
> very nice, but complete lack of background noise it is not.

And this I never claimed. I was rebutting a claim that over-
exaggerated the noise on vinyl. I have played a record with the
volume turned up between tracks, to unsuspecting listeners. When the
signal hit, they jumped out of their seats with the change in volume --
and all assumed it was a CD, to give such an effect. There was not
sufficient noise to give away the source. For many people the dynamic
range of vinyl is "good enough", just like 16-bit digital is "good
enough". So be it; I have no interest in debating their musical
enjoyment.

> All phono gear has a pretty steep high pass filter to remove strong
> low frequency resonance of the tonearm / cartridge system and
> turntable bearing rumble which could damage your equipment at worst
> and dramatically increase distortion at best.

Low-frequency reproduction is one of the weaknesses of vinyl, but
before the dawn of hyped dance music it was not often a big problem,
except for experimental music.

> The system is very robust, and if you commonly encounter audible
> errors then I suggest you replace damaged CDs or invest in a good
> quality CD player.

You talk of using a low-end Technics and then say *I* need a better
player. Ha!

> In practice my experience is that error correction is not an issue
> for CD, so I find your claims to be unusual and suspect.

It depends on the CD medium. Metal-stamped CDs are less of a problem
than dye-based CD-Rs. I have examples of these that regularly
demonstrate painful noise that even the most casual listeners can
hear. I am not too sure why you doubt my veracity. I have degrees I
could wave around, but I'd rather not appeal to authority.

> Bottom line - 96/24 may be a good tool for the studio but I do not
> see any evidence that it is beneficial to the consumer.

That is the problem. You want to *see* the evidence. Open your ears
and you can *hear* it plain enough.

But others tire of this thread, so I will not comment further. Except
that I must wonder why you cannot accept that others are not happy
with the status quo. I don't see how it can possibly hinder your
enjoyment of music if others choose to listen to a different digital
standard.

-- robin
Miles
post May 2 2003, 08:21
Post #24





Group: Banned
Posts: 86
Joined: 29-September 01
Member No.: 72



QUOTE (Joseph @ May 1 2003 - 05:29 AM)
... you hear unlistenable white noise;

Bravo!
2Bdecided
post May 2 2003, 10:02
Post #25


ReplayGain developer


Group: Developer
Posts: 5364
Joined: 5-November 01
From: Yorkshire, UK
Member No.: 409



QUOTE (Pio2001 @ May 1 2003 - 08:22 PM)
2bdecided wrote
QUOTE
I admit that any 1-bit (LSB) errors would have been missed in this test

Why ? If your input is bit-exact, any LSB error must be detected.

Detected, yes - but in the context of the mp3 decoder tests, I was usually ignoring LSB errors, so, if any were present due to CD reading or writing problems, I would have missed them.


This is probably more relevant. I have done the following process, but only once, so I didn't mention it before:

1. Rip a track from a commercial CD, call this A.wav
2. Record the same track from the commercial CD via CD player digital out > sound card digital in, call this B.wav
3. Take A.wav, and burn it to CD-R, audio format.
4. Play the CD-R in the CD player, and record it (as in step 2); call the result C.wav

I verified that (after time synchronisation) A.wav, B.wav and C.wav were identical including the LSB.

This is a very useful test. It verifies that (at least once) ripping, burning, playing, and digital recording are all bit-perfect. Back in 1997 (when I did this test) this wasn't a trivial thing to achieve - mainly because much of the information about resampling soundcards and accurate ripping was less common on the web than it is today. However, I managed it back then, so I'm sure it's quite easy now.
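For anyone repeating this, the comparison step can be sketched as below. The alignment search is deliberately naive and hypothetical (first exact match wins); real captures may need a smarter correlator, but the principle of comparing down to the LSB is the same.

```python
# Sketch: align two sample streams, then verify they match bit-for-bit.

def find_offset(a, b, max_shift=1000):
    """Return the shift that exactly aligns b to a, or None."""
    for shift in range(max_shift):
        n = min(len(a), len(b) - shift)
        if n > 0 and a[:n] == b[shift:shift + n]:
            return shift
    return None

def bit_exact(a, b):
    return find_offset(a, b) is not None

ripped   = [10, -3, 7, 7, 0, 12, -5]
recorded = [0, 0, 10, -3, 7, 7, 0, 12, -5]   # same data, 2-sample delay

print(find_offset(ripped, recorded))  # 2
print(bit_exact(ripped, recorded))    # True
```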

Cheers,
David.
