
post Dec 16 2012, 01:50
Post #1

Group: Members
Posts: 143
Joined: 14-December 12
Member No.: 105171

How come u-law was not used with high sample rates in the old days of computing? A quick search here says u-law is not good for sampling above 16 kHz because of high-frequency distortion, but when I encoded a 320 kbps 44.1 kHz MP3 into an 8-bit u-law 44.1 kHz format it actually sounded quite OK, especially compared to 8-bit linear PCM and even 4-bit ADPCM. So how come it was not used in games and multimedia at decent sample rates before MP3 came along? I understand that it distorts the sound slightly, but on my samples created with Audacity this distortion is much less audible than in the same sample encoded to either linear 8-bit PCM or 4-bit ADPCM. And the encoded music was a 1980s OMD track with high dynamic range, not a modern-day overcompressed/clipped sample. I'm interested in old formats and computing, so that's why I'm asking. I would insert short sound samples, but I don't know how to add them on this forum.
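For anyone wanting to reproduce the comparison without Audacity, here is a minimal sketch of why 8-bit u-law can beat 8-bit linear PCM on quiet, wide-dynamic-range material. The codec follows the standard G.711 mu-law segment layout (bias 0x84); the test tone, its level, and the truncating 8-bit "linear" quantiser are my own illustrative choices, not anything from the thread.

```python
import math

BIAS, CLIP = 0x84, 32635  # G.711 mu-law bias and clip level

def ulaw_encode(sample):
    """Compress a 16-bit linear sample to one mu-law byte."""
    sign = 0x80 if sample < 0 else 0
    mag = min(abs(sample), CLIP) + BIAS
    exponent, mask = 7, 0x4000
    while exponent > 0 and not (mag & mask):
        mask >>= 1
        exponent -= 1
    mantissa = (mag >> (exponent + 3)) & 0x0F
    return ~(sign | (exponent << 4) | mantissa) & 0xFF

def ulaw_decode(byte):
    """Expand a mu-law byte back to a linear sample."""
    byte = ~byte & 0xFF
    mag = (((byte & 0x0F) << 3) + BIAS) << ((byte >> 4) & 7)
    mag -= BIAS
    return -mag if byte & 0x80 else mag

def linear8(sample):
    """Quantise a 16-bit sample to 8 bits and back (simple truncation)."""
    return (sample >> 8) << 8

# A quietish 1 kHz tone at 44.1 kHz -- low-level detail is exactly
# where 8-bit linear PCM falls apart and mu-law holds up.
tone = [int(2000 * math.sin(2 * math.pi * 1000 * n / 44100))
        for n in range(4410)]

def rms_err(codec):
    return math.sqrt(sum((codec(s) - s) ** 2 for s in tone) / len(tone))

err_ulaw = rms_err(lambda s: ulaw_decode(ulaw_encode(s)))
err_lin = rms_err(linear8)
print(f"mu-law RMS error: {err_ulaw:.1f}, 8-bit linear RMS error: {err_lin:.1f}")
```

At this level mu-law's small near-zero step sizes give a clearly lower error than the fixed 256-unit step of 8-bit linear, which matches what the OMD sample demonstrates by ear.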

EDIT - I uploaded some sample files here http://www.hydrogenaudio.org/forums/index....showtopic=98358 .

This post has been edited by Neuron: Dec 16 2012, 02:09
post Dec 18 2012, 09:37
Post #2

Group: Members
Posts: 38
Joined: 4-January 08
Member No.: 50127

A sine wave sweep is the worst possible sample you could put through ADPCM, especially the high frequencies (I didn't even listen to the samples, I know how bad sine sweeps get). It will be dead obvious how bad ADPCM performs on those. The purer the tone the more noticeable the ADPCM prediction error.
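To make the failure mode concrete, here is a toy coder sketch, not real IMA ADPCM: a first-order "predict the previous sample" coder with a fixed 4-bit step (real IMA adapts the step from an 89-entry table, which softens but doesn't eliminate this). The sample rate, tone frequencies, amplitude, and step size are all made up for illustration. A pure high-frequency tone changes faster per sample than the coder's maximum delta can follow, so the reconstruction slews behind the signal (slope overload) and the error becomes huge.

```python
import math

FS = 44100
STEP = 128  # fixed quantiser step (toy choice, not IMA's adaptive table)

def delta_roundtrip(signal):
    """Encode/decode with a 4-bit fixed-step delta coder (predictor = last sample)."""
    pred, out = 0, []
    for s in signal:
        code = max(-8, min(7, round((s - pred) / STEP)))  # clamp to 4-bit signed range
        pred += code * STEP                               # decoder's reconstruction
        out.append(pred)
    return out

def rms_err(freq):
    sig = [int(10000 * math.sin(2 * math.pi * freq * n / FS))
           for n in range(FS // 10)]
    dec = delta_roundtrip(sig)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(sig, dec)) / len(sig))

print(f"500 Hz tone error: {rms_err(500):.0f}")   # per-sample deltas fit -> small error
print(f"8 kHz tone error:  {rms_err(8000):.0f}")  # slope overload -> large error
```

On broadband material like drums or speech the prediction residual is noisy anyway, so this error hides; on a pure sweep there is nothing to mask it.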

This is what I meant by listening too critically ... video games (and most music) aren't filled with sine sweeps. Put some rock music and crunchy sound effects through it and see how it does. Back in the day most sound designers compressed the hell out of everything with L2 plugins (probably a holdover from 8-bit samples, where it made a big difference). Then everything got nearest-neighbour (actually not even that good) resampling, with lots of clipping on the final downmix on top of all that. There was crunchy audio all around, so the ADPCM noise itself did not stand out as much as you would think. The funny thing is that when games finally started mixing at 16-bit 44.1 kHz they were marketed as having "CD QUALITY SOUND!" ... which is pretty laughable in retrospect, but there is some very remote truth in that statement.

I'm not trying to defend ADPCM as some great thing, just giving a context to what it was like back then and why u-law was not common. I was very happy when we moved on to MP3 and the like.

PS - u-law wasn't free of CPU usage (barring hardware decode support, which no computer had). There is a table lookup required to decode each sample, and IMA ADPCM wasn't much more complicated in comparison. An Amiga 1000 is still going to chew up quite a bit of CPU decoding u-law (now I'm curious how much). Plus the Amiga only outputs 8-bit audio, so the 14-bit u-law result isn't immediately useful ... unless you gang two channels together to create 14-bit output, but then there's extra CPU needed to shuffle the data around for that. Hmmm, got an Amiga 1000 sitting behind me, maybe I'll try that one day ...
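The "table lookup" cost is cheap to sketch: since a u-law sample is one byte, a player would precompute all 256 linear expansions once and then decoding is a single indexed load per sample. The expansion below follows the standard G.711 segment math; everything else (names, the print) is illustrative.

```python
BIAS = 0x84  # G.711 mu-law bias

def ulaw_expand(byte):
    """Expand one mu-law byte to a linear sample (G.711 segment layout)."""
    byte = ~byte & 0xFF
    mag = (((byte & 0x0F) << 3) + BIAS) << ((byte >> 4) & 7)
    mag -= BIAS
    return -mag if byte & 0x80 else mag

# Precompute all 256 expansions once; playback then costs one table
# index per sample -- the lookup being discussed above.
ULAW_LUT = [ulaw_expand(b) for b in range(256)]

print(max(ULAW_LUT))  # 32124, mu-law's full-scale linear value
```

Even so, on a 68000 that per-sample indexed load (plus any shifting to fit an 8-bit DAC) is real work at 44.1 kHz, which is the point about the Amiga.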

