encoders vibrant, decoders stagnant?
krabapple
post Mar 19 2012, 08:23
Post #1





Group: Members
Posts: 2271
Joined: 18-December 03
Member No.: 10538



Care to fact-check this post for me, HA? (Though I'd understand if you didn't want to go down the rabbit hole with that particular poster):

"
While encoders have been revised and worked on for a long time, decoders tend to be quite stagnant. Reason is that the standard dictates the stream that they must be able to decode and leaves little room for innovation (although as I noted, the implementation is not). Whereas the standard says nothing about how you encode so folks keep messing with that end of it.

If you have ever been in charge of shipping such decoders in an application, you go and license an implementation (and rarely, write your own) and once initial bugs are out of the way you ship it forever. I reckon the MP3 decoder in Windows Media Player is now going on ~15 years now! So few people know about such differences or care in the midst of compression artifacts that no one wants to put resources to make them better. An exception was WMA which we developed in house so we sometimes fine tuned it.

So while you do want to look at the age and revision of encoders when you look at their performance data, such worry is not usually well placed when it comes to decoders. You likely are running pretty "dusty" code whether you realize it or not."


Sparked by the posting of this chart (from circa year 2000) as proof of something.

http://www.avsforum.com/avs-vb/showthread....49#post21791249

This post has been edited by krabapple: Mar 19 2012, 08:26
Garf
post Mar 19 2012, 10:06
Post #2


Server Admin


Group: Admin
Posts: 4885
Joined: 24-September 01
Member No.: 13



The MP3 and AAC (and for that matter, H264) standards are defined in terms of their decoding process. They describe *precisely* how it must be done, and provide reference implementations and decoded reference files. A correctly working decoder is then one that decodes to within a certain accuracy of the reference one, with this accuracy of course so chosen that there'd be no audible differences.

So the post you quote is right. Once you have a working decoder, the only thing you can improve in it is the speed.

In practice, it can also be interesting to improve the handling of corrupted files, so your application doesn't crash if it encounters them, and recovers as quickly as possible. But of course how you handle corrupted files has nothing to do with the standard.
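To make the conformance idea concrete: a working decoder is judged by how close its output is to the reference decode. A minimal sketch of such a check is below; note that the 14-bit accuracy threshold and the function names are illustrative placeholders, not values or APIs taken from the ISO specification.

```python
import math

def rms_error_bits(decoded, reference):
    """RMS difference between two decodes, expressed as bits of accuracy
    below full scale (samples assumed normalized to [-1.0, 1.0])."""
    assert len(decoded) == len(reference)
    mse = sum((d - r) ** 2 for d, r in zip(decoded, reference)) / len(reference)
    if mse == 0:
        return float("inf")  # bit-exact match with the reference
    return -math.log2(math.sqrt(mse))

def is_conformant(decoded, reference, min_accuracy_bits=14.0):
    # Hypothetical pass/fail test: the real standard specifies its own
    # accuracy classes and thresholds; 14 bits is just a placeholder.
    return rms_error_bits(decoded, reference) >= min_accuracy_bits
```

A decoder whose output never strays more than a fraction of a 16-bit LSB from the reference would score far above any audibility threshold here.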
Arnold B. Kruege...
post Mar 19 2012, 13:01
Post #3





Group: Members
Posts: 3797
Joined: 29-October 08
From: USA, 48236
Member No.: 61311



QUOTE (krabapple @ Mar 19 2012, 03:23) *
Care to fact-check this post for me, HA? (Though I'd understand if you didn't want to go down the rabbit hole with that particular poster):

"
While encoders have been revised and worked on for a long time, decoders tend to be quite stagnant. Reason is that the standard dictates the stream that they must be able to decode and leaves little room for innovation (although as I noted, the implementation is not). Whereas the standard says nothing about how you encode so folks keep messing with that end of it.
"

I think that most of us know the doctrine - the encoders are where the opportunities for smart designers are, while the stability and consistency of the decoders' operation and performance is actually the foundation of the whole process.

This chart asserts that it is a summary of similarities and differences among a wide selection of decoders:



This chart was presented to me by a third party as proof that different coders provide audibly and measurably different results. I read the table author's fine print and found that the vast majority of the differences he found amounted to variations on the order of one or two bits, or occasional dropped samples. I did a little math and found that differences of one or two bit levels (of the approximately 65,000 possible) equate to about 0.003% distortion. No big deal. While adding a random sample every once in a while can result in audible clicks and pops, dropping an occasional sample is audibly moot unless you do a side-by-side comparison or enough samples are dropped to audibly change playback speed. Differences might cause an echo to be heard when you do a quick comparison. In normal listening, not so much.

IOW, the differences found are generally audibly moot unless they involve a relatively massive number of dropped samples - on the order of 1 in 1,000 to 1 in 100. OTOH, a tiny minority of decoders might be sonic bad news.

Having written code of a similar nature, I am somewhat amazed that the numerical results are actually this consistent. My analysis suggests that people are not only copying the operational principles but also the exact order and kind of operations. All things considered I have no problems with that.
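The back-of-envelope arithmetic above (one or two LSB steps out of the 65,536 levels of 16-bit PCM, as a percentage of full scale) is easy to reproduce; this is just a sketch of that calculation, not code from any decoder:

```python
# Worst-case amplitude error of a few least-significant-bit steps in
# 16-bit PCM, expressed as a percentage of full scale.
FULL_SCALE_LEVELS = 2 ** 16  # 16-bit PCM: 65,536 quantization levels

def lsb_error_percent(levels_off: int) -> float:
    """Error of `levels_off` LSB steps, in percent of full scale."""
    return levels_off / FULL_SCALE_LEVELS * 100

one_bit = lsb_error_percent(1)   # ~0.0015 %
two_bits = lsb_error_percent(2)  # ~0.003 %
```

Two LSB steps come out to about 0.003%, matching the figure quoted above.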
DonP
post Mar 19 2012, 13:23
Post #4





Group: Members (Donating)
Posts: 1471
Joined: 11-February 03
From: Vermont
Member No.: 4955



I've submitted a few bug reports where decoders didn't work correctly. The most common was not working with monophonic encodes. Some of them would split the stream of samples to both channels, resulting in chipmunking (double speed, double frequency).

For portables there could be a question of deliberate compromises for battery life or other hardware constraints. For a while iriver players couldn't play low-bitrate Vorbis files, I think because of limited RAM.
krabapple
post Mar 19 2012, 18:37
Post #5





Group: Members
Posts: 2271
Joined: 18-December 03
Member No.: 10538



The chart originates from here:

http://mp3decoders.mp3-tech.org/objective.html

re: audibility, the creator of that site had this to say circa 2000

QUOTE
What are the important results of this test?

ACM 1999, CDex, CEP FhG, Easy CD creator, lame, Media Jukebox, Quicktime, Real Jukebox (writing to CD-R), Siren, Ultra Player, Winamp 2.22, Winamp 2.7, the Winamp MAD plug-in, and the Winamp mpg123 plug-in pass this test.
l3dec comes close!
There may be only two or three successful decoding engines, duplicated in several products.
mp3 decoder was the worst decoder on test.


Do we care?!
mp3 decoder gave terrible results, often losing all frequencies below 700Hz. The result sounds like your hi-fi has broken!
Other decoders with an audible problem below 15kHz (shown in red) usually exhibit the 100Hz bug. Problems above 15kHz (shown in yellow) are probably not audible to most people, even under optimum listening conditions. Differences in the last bit (shown in light blue) are not audible to most people, and it is difficult to say that one decoder is better than another in this case.



The salient question (to me) is whether any of this is of forefront concern today, 12 years later. To me, it's only an issue when someone makes a patently dubious claim about audibility of their lossy encodes (e.g., 'I/even my spouse can hear the difference between 320 kbps and the original.') -- the codec is one thing on the list of things to be checked, but it would not be the first thing I'd check ('sightedness' is usually the prime culprit).

(The AVS forum poster has a record of using graphics grabbed off the web to make dispositive points about lossy perceptual codecs, that aren't really supported by the graphics he cites. Hence the fact-check.)

This post has been edited by krabapple: Mar 19 2012, 18:43
2Bdecided
post Mar 19 2012, 19:30
Post #6


ReplayGain developer


Group: Developer
Posts: 5135
Joined: 5-November 01
From: Yorkshire, UK
Member No.: 409



It's my chart. It dates from a time when Winamp (extremely popular at the time) had just switched to a bug-free decoder. Lots of politics and licensing issues around it at the time. Ancient history now.

FWIW the decoders that scored 7 are closer to the ISO reference decoder. Those that score 6 are the ones with some minutely different scaling internally. I can't imagine how most people would measure the difference, never mind hear it. (Encoding a sine wave at a high bitrate is a reasonable way to catch it, but you have to look past the inevitably huge changes due to lossy encoding to spot this tiny change).
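The sine-wave probe mentioned above needs nothing more than a test tone to feed the encoder. A minimal sketch of generating one with the Python standard library follows; the filename and the tone parameters (1 kHz, 2 seconds, 80% of full scale) are arbitrary choices for illustration, and the MP3 encode/decode steps themselves would use whatever external tools you are testing:

```python
import math
import struct
import wave

# Write a 1 kHz sine test tone as 16-bit mono WAV. Encode it to MP3 at a
# high bitrate with any encoder, decode with each decoder under test, and
# difference the outputs to expose small internal scaling differences.
RATE, FREQ, SECONDS, AMP = 44100, 1000, 2, 0.8

with wave.open("sine_test.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)   # 16-bit samples
    w.setframerate(RATE)
    samples = (int(AMP * 32767 * math.sin(2 * math.pi * FREQ * n / RATE))
               for n in range(RATE * SECONDS))
    w.writeframes(b"".join(struct.pack("<h", s) for s in samples))
```

As noted above, you have to look past the (much larger) lossy-encoding error itself to spot the tiny decoder-to-decoder differences.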

I'm not repeating this test to see how things have progressed.

The 24-bit test from the same set of tests is more interesting...
http://mp3decoders.mp3-tech.org/24bit.html *
...a few mp3 decoders are measurably more linear / distortion-free down to a ridiculous number of bits. You can hear the result here:
http://mp3decoders.mp3-tech.org/24bit2.html

You can't hear the difference at normal listening levels. Of course you can hear these differences with the right test signal, while listening to a near-silent part, if you crank the volume control up by 60dB above (what would normally be) deafening levels.

Hope this helps.

Cheers,
David.

EDIT: * - I apologise for the statement that 24-bit audio "sounds more realistic" on that page! It was written in 2000, before I'd (failed to) ABX any. However, the statements about distortion (rather than noise) are true, since most of the decoders didn't dither, and not all the encoders preserved dither.

EDIT2: The original discussion linked at the top of this thread verges on nonsense - given that l3dec (the original mp3 decoder from 1994!) was bug-free, and shown to be accurate down to more than 26 bits, I'm not entirely sure what Amir would like to see improved.

This post has been edited by 2Bdecided: Mar 19 2012, 19:43
Garf
post Mar 19 2012, 20:12
Post #7


Server Admin


Group: Admin
Posts: 4885
Joined: 24-September 01
Member No.: 13



QUOTE (Arnold B. Krueger @ Mar 19 2012, 13:01) *
This chart was presented to me by a third party as proof that different coders provide audibly and measurably different results.


It should be noted that anything that didn't score 6 or more in that table cannot be called an MP3 decoder.
Porcus
post Mar 19 2012, 22:28
Post #8





Group: Members
Posts: 1842
Joined: 30-November 06
Member No.: 38207



QUOTE (Garf @ Mar 19 2012, 20:12) *
QUOTE (Arnold B. Krueger @ Mar 19 2012, 13:01) *
This chart was presented to me by a third party as proof that different coders provide audibly and measurably different results.


It should be noted that anything that didn't score 6 or more in that table cannot be called an MP3 decoder.


Assuming that the encoder in that column actually did produce proper MP3s, right?


--------------------
One day in the Year of the Fox came a time remembered well
benski
post Mar 19 2012, 23:33
Post #9


Winamp Developer


Group: Developer
Posts: 670
Joined: 17-July 05
From: Brooklyn, NY
Member No.: 23375



When I re-ran a similar test years ago (and posted results on HA) with 24-bit output, I found that the differences between Foobar and Winamp peaked at the 19th bit, with an RMS at the 20th bit. The differences are likely due to floating-point precision issues. This is from two decoders with entirely different implementations (to my knowledge). This should give you some idea about how much variation there is amongst properly implemented decoders - NONE.
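The "19th bit peak, 20th bit RMS" comparison above can be expressed as a small sketch: difference two decodes sample by sample and convert the peak and RMS differences into bits below full scale. This is an illustrative reconstruction of the measurement, not benski's actual test code:

```python
import math

def difference_in_bits(a, b):
    """Peak and RMS difference between two decodes (normalized samples),
    each expressed as bits of accuracy below full scale."""
    diffs = [abs(x - y) for x, y in zip(a, b)]
    peak = max(diffs)
    rms = math.sqrt(sum(d * d for d in diffs) / len(diffs))

    def to_bits(v):
        return float("inf") if v == 0 else -math.log2(v)

    return to_bits(peak), to_bits(rms)
```

A peak difference at the 19th bit is roughly 100 dB below full scale, far beneath anything audible at normal playback levels.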
