
Bit-perfect AAC/MP3/etc decoding?, Is it an issue?
post Jan 5 2013, 00:43
Post #1

Group: Members
Posts: 12
Joined: 30-May 05
Member No.: 22401

Sorry if this is easily answered elsewhere.

My question: in a comparison of lossy-format decoders between, let's say, iTunes, Foobar, and Winamp, will the output (edit: of a single source file) vary between decoding implementations?

Or, once a decoder conforms to the specified format, is all output identical, assuming there are no bugs?

This post has been edited by mavere: Jan 5 2013, 00:51
post Jan 5 2013, 22:56
Post #2

Group: Super Moderator
Posts: 5275
Joined: 23-June 06
Member No.: 32180

Sorry, I misread it.

RMS level [of the then-latest version of MAD] is over a hundred times the one produced by Apollo 37zm and the maximum difference is four times the one by Apollo (interestingly the results for MAD 0.11.4b seem to be somewhat better than for the latest one but they are still worse than Apollo's). Actually, the maximum difference of Apollo's output is the smallest possible deviation in 24-bit data, the only smaller possible value would be zero.

So, you’re right; this suggests that MAD’s RMS error level is roughly log2(100) ≈ 6.6 bits (between 6 and 7 bits) larger in magnitude than Apollo’s, and (I guess) its maximum difference, at four times Apollo’s, is log2(4) = 2 bits larger in magnitude.
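The comparison above can be sketched numerically. Assuming the two decoders' outputs are available as integer PCM sample arrays (the arrays and the `compare_decodes` helper below are made up for illustration, not from any actual decoder test):

```python
import numpy as np

def compare_decodes(a, b):
    """Compare two decoders' PCM output for the same source file.

    Returns the RMS level of the difference signal and the
    maximum absolute per-sample difference.
    """
    diff = a.astype(np.float64) - b.astype(np.float64)
    rms = np.sqrt(np.mean(diff ** 2))
    max_diff = np.max(np.abs(diff))
    return rms, max_diff

# Made-up 24-bit integer samples standing in for two decoders' output.
out_a = np.array([1000, -2000, 1500, -500], dtype=np.int32)
out_b = np.array([1001, -2003, 1498, -500], dtype=np.int32)

rms, max_diff = compare_decodes(out_a, out_b)

# Converting an amplitude ratio to "bits": a 100x RMS ratio is
# log2(100) ≈ 6.6 bits; a 4x maximum-difference ratio is exactly 2 bits.
rms_ratio_bits = np.log2(100)
max_ratio_bits = np.log2(4)
```

Note that a maximum difference of 1 in 24-bit integer data, as reported for Apollo, is the smallest nonzero deviation representable at that bit depth.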

I apologise for talking nonsense. :/