128kbps Extension Test - OPEN
upNorth
post Jul 25 2003, 19:19
Post #126





Group: Members
Posts: 1099
Joined: 18-March 03
From: Oslo, Norway
Member No.: 5569



QUOTE (atici @ Jul 25 2003, 07:31 PM)
I haven't read in detail how this test is performed. But in general isn't it a better idea for testing different codecs to fix a bitrate and adjust the quality level (for mpc and vorbis) so that the output is going to be exactly equal to that bitrate? This way every codec will be given the same amount of space to demonstrate their skills. Let's say 128kbps lame, q4.3 MPC, q4.5 vorbis for a specific sample, but 128kbps lame, q5 mpc, q5.5 vorbis for another sample...

What would this prove?
The beauty of a quality setting is that it's the bitrate that changes and not the quality. I still think that what matters is the kind of bitrate a certain codec at a certain setting averages to in the long run.

At least when I encode music I settle for a specific quality setting (currently MPC -q5) and not a bitrate. Do you really test every track to find the setting that is closest to your desired bitrate, and thereby end up with an album where "all" the tracks are encoded with different settings, just so that all of them have about the same average bitrate? It's fine by me if this gives others the warm fuzzy feeling everyone here talks about, but I don't get it. smile.gif

As I see it, a test like this has to settle for a limited set of interesting samples, because you have to keep the amount manageable. I would expect that the parts of a track where problems are easiest to spot are also the places where a good VBR codec truly shines. If you force it not to be smart and not to use bits where it thinks they should be used, your test IMHO won't be worth much.

I'm not saying this is easy at all, but at least done this way it applies to real-life usage. I don't really care what these Slashdot people say; it's only fun to read. tongue.gif
verloren
post Jul 25 2003, 20:17
Post #127





Group: Members
Posts: 156
Joined: 28-March 02
From: Hants, UK
Member No.: 1637



I have almost the opposite question to most people. I understand and totally agree with the reasoning that roberto has used for setting the various quality levels. But from the accounts listed here it seems like there are many samples that are way above the 128kbps nominal value, but few if any that are significantly below (the examples I've seen have been around 122kbps for example).

So I wonder if it would be useful to give some really easy to encode samples, to make sure that when the encoder decides it only needs say 50kbps it is making as good a decision as when it picks 190kbps for a hard passage.

And no, I haven't downloaded the samples as I lack the facilities, so perhaps this is already in there! If so I claim "official mirror's" right to ask one stupid question smile.gif

Cheers, Paul
ff123
post Jul 25 2003, 20:39
Post #128


ABC/HR developer, ff123.net admin


Group: Developer (Donating)
Posts: 1396
Joined: 24-September 01
Member No.: 12



QUOTE (verloren @ Jul 25 2003, 11:17 AM)
I have almost the opposite question to most people.  I understand and totally agree with the reasoning that roberto has used for setting the various quality levels.  But from the accounts listed here it seems like there are many samples that are way above the 128kbps nominal value, but few if any that are significantly below (the examples I've seen have been around 122kbps for example).

So I wonder if it would be useful to give some really easy to encode samples, to make sure that when the encoder decides it only needs say 50kbps it is making as good a decision as when it picks 190kbps for a hard passage.

And no, I haven't downloaded the samples as I lack the facilities, so perhaps this is already in there!  If so I claim "official mirror's" right to ask one stupid question smile.gif

Cheers, Paul

I would guess that the high and low bitrates are not distributed the same way. For example, if a codec spends 90 percent of its time at 124 kbit/s, then 10 percent of the time it could grow to 165 kbit/s while still averaging 128 kbit/s overall. It could be that the VBR codecs never let the bitrates dip down to the extent that they're allowed to increase.
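
As a quick sanity check of that arithmetic, a minimal sketch in Python (the 90/10 split and the 124/165 kbit/s figures are just the illustrative numbers above, not measured data):

    # Illustration only: a codec sitting at 124 kbit/s for 90% of the time
    # can burst to ~165 kbit/s for the remaining 10% and still average
    # roughly 128 kbit/s overall.
    fractions = [0.9, 0.1]   # share of playing time in each regime
    bitrates = [124, 165]    # kbit/s in each regime

    average = sum(f * b for f, b in zip(fractions, bitrates))
    print(f"overall average: {average:.1f} kbit/s")  # -> 128.1 kbit/s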

If that's the case (probably a reasonable assumption), and a test suite were composed entirely of random samples of music (not chosen at all for degree of difficulty), then most of the time it might be completely transparent to the listeners, and basically useless for trying to discriminate between codecs, because of the large number of samples that would be required to simulate a real-world music collection.

One type of music which seems to produce lower bitrates is solo piano. Roberto mentioned in his first test that he removed this from the test suite because it didn't discriminate well in the 64 kbit/s test. This implies that the codecs would indeed be transparent at lower bitrates. Still, maybe one or two "very easy" to encode samples, which produce lower VBR bitrates, might be a good thing to include in a future test, just to make sure none of the VBR codecs is failing badly at those bitrates.

ff123
askoff
post Jul 25 2003, 21:00
Post #129





Group: Members
Posts: 445
Joined: 23-December 02
Member No.: 4214



QUOTE (JohnV @ Jul 25 2003, 05:18 AM)
How do you know that it's 50% extra of what the original bitrate was targeted to be? You are testing just short, hard-to-encode clips. The quality setting being tested produces a 128kbps average, and we are testing the quality of a specific quality setting of a VBR codec which gives this average bitrate. We are not testing the qualities of 12 different quality settings of one VBR codec in one test.

This is quite odd. Even the name of this topic clearly says "128kbps Extension Test" and nothing about "128kbps average album quality extension test clips". Well, why the heck am I still whining about this subject? I guess nothing will be changed in this test anymore, so I'll just have to do it the way it was started and try to test it my own way by myself later. And why not set up my own public test? After all, this is only Roberto's test, and it's as official as anyone else's public test. Not the only official test.
rjamorim
post Jul 25 2003, 21:42
Post #130


Rarewares admin


Group: Members
Posts: 7515
Joined: 30-September 01
From: Brazil
Member No.: 81



If anyone is willing to create his own listening test, I would be very happy to help him set it up. I'm pretty sure ff123 would also be happy.


And no, this test won't be changed. It works the way it is, and lots of people have already taken it; I won't ask them to retake it (especially since there's no reason, really).


--------------------
Get up-to-date binaries of Lame, AAC, Vorbis and much more at RareWares:
http://www.rarewares.org
rpop
post Jul 25 2003, 21:54
Post #131





Group: Super Moderator
Posts: 332
Joined: 20-May 03
From: Pittsburgh, USA
Member No.: 6718



To all the people whining: where were you during the pre-test discussion? Don't you think these comments would've been more appropriate and helpful then?


--------------------
[url=http://noveo.net/ph34r.htm]Happiness[/url] - The agreeable sensation of contemplating the misery of others.
verloren
post Jul 25 2003, 21:59
Post #132





Group: Members
Posts: 156
Joined: 28-March 02
From: Hants, UK
Member No.: 1637



QUOTE (ff123 @ Jul 25 2003, 01:39 PM)
I would guess that the high and low bitrates are not distributed the same way.  For example, if a codec spends 90 percent of its time at 124 kbit/s, then 10 percent of the time it could grow to 165 kbit/s while still averaging 128 kbit/s overall.  It could be that the VBR codecs never let the bitrates dip down to the extent that they're allowed to increase.

Thanks for the response ff123, that sounds very plausible.

Cheers, Paul
westgroveg
post Jul 26 2003, 02:03
Post #133





Group: Members
Posts: 1236
Joined: 5-October 01
Member No.: 220



QUOTE (spoon @ Jul 25 2003, 10:07 AM)
Still have my privilege smile.gif

QUOTE
Now, tell me what can you say overall about the average 128kbps quality (certain quality setting) based on results from those? Or actually any useful result.. I'd like to know...


The reason VBR exists is that a codec can be advanced enough to lower and raise its bitrate, but for this to be a fair test - especially with different sample types - for all we know, on the harpsichord Ogg will go to an average of 200Kbps whilst WMA stays at 128Kbps. Now you could say, tough luck WMA for not matching Ogg when it goes to 200Kbps, but you could also say that WMA is better programmed because it stays within its quality range and does not vary wildly. *** these codecs and numbers are totally made up ***

I am thinking, then, that if a codec has ABR available it should be used in preference to VBR in this type of test.

The way I see it, WMA would have failed to adapt and keep the selected QUALITY level.
loophole
post Jul 26 2003, 08:28
Post #134





Group: Members
Posts: 273
Joined: 18-June 03
Member No.: 7254



QUOTE (ff123 @ Jul 25 2003, 05:28 AM)
QUOTE
Also, Compressor which comes with Final Cut Pro 4 seems to have VBR options for AAC, not just ABR.

(for those who think QuickTime Pro is CBR, it isn't - it's ABR)


The purpose of the last test was to find the AAC codec to be used for this test. Final Cut Pro 4 wasn't tested.

Final Cut Pro (by Apple) actually leverages QuickTime (also by Apple); it just presents a different interface, which seems to allow VBR modes.
ezra2323
post Jul 26 2003, 19:22
Post #135





Group: Members
Posts: 586
Joined: 17-July 02
Member No.: 2631



Roberto - just curious, why was WMA 9 not included, only the Pro version? Very few people here use WMA to begin with, and those that do are most likely using it because it has excellent hardware support. WMA Pro does not. I have tried to load these files onto my WMA-compliant portables and they are not recognized.

However, I would be very interested to see how 2-pass VBR 128 WMA (not the Professional version) stacks up against the competition, since this is a very popular format with the new legitimate music sites popping up. Yes, I know they are likely using WMA CBR 128, but they could probably be convinced to switch to 2-pass 128 VBR if the quality gain was sufficient.

It would be interesting to see how the WMA offerings stack up against Apple's AAC offering.
ezra2323
post Jul 26 2003, 19:24
Post #136





Group: Members
Posts: 586
Joined: 17-July 02
Member No.: 2631



BTW - I'm not questioning the test, I think it's great! Just requesting that WMA (regular, not Pro) be added.
guruboolez
post Jul 26 2003, 21:06
Post #137





Group: Members (Donating)
Posts: 3474
Joined: 7-November 01
From: Strasbourg (France)
Member No.: 420



A public blind test can't include too many encodings; it would make rating (and ranking) them harder...
I did a few tests comparing WMA standard and WMA Pro (one, on the 12 samples, is available in the OTHER AUDIO FORMATS section). In my opinion, the gap between the standard and Pro encoders is considerable. Too considerable, maybe, to keep any hope for the standard codec's quality against stronger formats. On the other hand, WMA Pro is more mysterious. No tests are available. No mention of its quality on Hydrogenaudio. How good is it? Can this new format, created and supported by a giant, compete in quality terms with Goliath MP4 or David Vorbis? We had to give this format a chance, and to test it against the best challengers of the moment.

This test includes the best formats available and, for each of them, the best codec at the best setting. The only exception (easy to understand): mp3. It would be interesting to include the WMA standard format, but then, why not ATRAC3? Fraunhofer FastEnc VBR ~128? VQF 2.0? As I said before, a public test can't include too many challengers. Choices were made, with discussion. IMO, Roberto made the right ones. Some people will be disappointed. That's life...

Nevertheless, if you're interested in WMA standard performance, you can easily include some encodings yourself in each downloaded package.


About hardware support: I give WMA Pro a better chance than Vorbis of being widely supported on DVD players and portables in the next two years. I hope I'm wrong...
phong
post Jul 27 2003, 21:30
Post #138





Group: Members
Posts: 346
Joined: 7-July 03
From: 15 & Ryan
Member No.: 7619



QUOTE (rjamorim)
Added your comments (actually, did some copy-pasting; hope you don't mind).

Honored. :)

Quick question... On a couple of the samples I'm not having too much trouble ABXing most or all of the codecs, but others are much harder for me (no surprise, obviously). If time is a limiting factor (I may only be able to set aside a small amount of time this week), which of the following would you prefer people to do:
a) A very careful analysis of a few of the samples, making every reasonable effort to distinguish as many from the originals as is possible with their equipment and ears.
B) Try to do all 12 samples, at the expense of a few of the best encodings getting rated as "perfect" where more careful analysis would reveal some minor audible defects in some of them.
c) If you can't do your most careful analysis of all 12 samples, don't bother submitting results at all.
d) Whatever floats one's boat. Have fun and don't stress too much.

From my interpretation of the readme, I doubt c) is the case. :) If a) is preferred, do you have a preferred method of choosing the samples? Go down the list in order? Pick randomly? Do the easiest ones, thereby providing the most discriminating data possible?

This post has been edited by phong: Jul 27 2003, 21:49


--------------------
I am *expanding!* It is so much *squishy* to *smell* you! *Campers* are the best! I have *anticipation* and then what? Better parties in *the middle* for sure.
http://www.phong.org/
rjamorim
post Jul 27 2003, 22:19
Post #139


Rarewares admin


Group: Members
Posts: 7515
Joined: 30-September 01
From: Brazil
Member No.: 81



QUOTE (phong @ Jul 27 2003, 05:30 PM)
From my interpretation of the readme, I doubt c) is the case.  smile.gif  If a) is preferred, do you have a preferred method of choosing the samples?  Go down the list in order?  Pick randomly?  Do the easiest ones, thereby providing the most discriminating data possible?

Yes, c) definitely isn't the case. smile.gif

It's really a matter of whatever floats your boat. Both a) and B) suit me well. And if you decide to go with a), I suggest picking files randomly. If people just go through them in order, I'll have too many results for 41_30sec and maybe too few for the others, as happened in the AAC test. smile.gif
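
For anyone who wants to automate that random pick, a throwaway sketch (the sample names are placeholders guessed from the SampleXX.zip packages, so treat them as an assumption):

    # Pick a handful of samples at random, without replacement, instead of
    # always starting from the top of the list.
    import random

    samples = [f"Sample{n:02d}" for n in range(1, 13)]  # placeholder names
    print(random.sample(samples, k=4))  # e.g. evaluate 4 random samples this session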

Thanks for participating.

Regards;

Roberto.


--------------------
Get up-to-date binaries of Lame, AAC, Vorbis and much more at RareWares:
http://www.rarewares.org
rjamorim
post Jul 27 2003, 22:20
Post #140


Rarewares admin


Group: Members
Posts: 7515
Joined: 30-September 01
From: Brazil
Member No.: 81



Damned be that stupid B) smilie.

Couldn't some admin please replace it with :cool: or something?


--------------------
Get up-to-date binaries of Lame, AAC, Vorbis and much more at RareWares:
http://www.rarewares.org
rjamorim
post Jul 28 2003, 17:16
Post #141


Rarewares admin


Group: Members
Posts: 7515
Joined: 30-September 01
From: Brazil
Member No.: 81



Hello.

Tonight I'll have to pull the files from Paul's mirror because it's reaching its bandwidth consumption limit of 5GB.

So, I would like to ask if someone with a reasonably fast server could spare me some 20MB of space and a few GB of bandwidth, so that I can keep the files there until Sunday. The mirror consumed 5GB in the first 5 days, so I expect it to consume about as much again by the end of the test.

If you can, please PM or mail me. You don't even need to give me a login/password; just upload the packages to your server and send me the addresses.

Thank-you very much;

Roberto.


--------------------
Get up-to-date binaries of Lame, AAC, Vorbis and much more at RareWares:
http://www.rarewares.org
ff123
post Jul 28 2003, 17:30
Post #142


ABC/HR developer, ff123.net admin


Group: Developer (Donating)
Posts: 1396
Joined: 24-September 01
Member No.: 12



I'm uploading the samples now to

http://ff123.net/128exten/Samplexx.zip

I have about 12 GB bandwidth to spare.

I have to leave for work, but they should be uploaded within the next half hour or so. I'll verify that sample12.zip uploaded properly from work.

ff123

Edit: changed the path

This post has been edited by ff123: Jul 28 2003, 17:53
verloren
post Jul 28 2003, 17:31
Post #143





Group: Members
Posts: 156
Joined: 28-March 02
From: Hants, UK
Member No.: 1637



I've also made some more space available to Roberto - who knew it would be quite this popular! I'm sure he'll let you know the details if he decides to use the space.

Cheers, Paul
rjamorim
post Jul 28 2003, 17:37
Post #144


Rarewares admin


Group: Members
Posts: 7515
Joined: 30-September 01
From: Brazil
Member No.: 81



Wow. Thanks a lot, both of you smile.gif


--------------------
Get up-to-date binaries of Lame, AAC, Vorbis and much more at RareWares:
http://www.rarewares.org
puntloos
post Jul 28 2003, 17:43
Post #145





Group: Members
Posts: 149
Joined: 20-July 03
Member No.: 7881



I'm happy to set up some space on my own server as a backup mirror. I have, well, virtually unlimited bandwidth to spare on my gigabit uplink wink.gif

If needed just contact me.

Also, one question: I've read most of the comments here but didn't see this addressed (I think).

Why not penalize 'naughty' codecs (say, Ogg using -q4 and generating a 190kbit file) by just multiplying their end score for a particular file by 128/190? Sounds reasonable to me, but maybe I'm overlooking something big here. In the end I 'respect' that a codec rightly thinks it needs to use 190kbit for a particular piece to keep 'quality level 4' there, but when comparing it to a codec that is perhaps just as advanced yet less frivolous with bitrate allocation, that isn't fair.
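
A minimal sketch of that scaling (the function name and the example scores are hypothetical; 128 is the nominal target of this test):

    # Hypothetical sketch of the proposed penalty: scale a sample's score
    # by target_bitrate / actual_bitrate, so a codec that spent 190 kbit/s
    # is credited as if it had to make do with the nominal 128 kbit/s.
    def penalized_score(score, actual_kbps, target_kbps=128):
        return score * (target_kbps / actual_kbps)

    print(penalized_score(4.5, 190))  # e.g. 4.5 * 128/190 ≈ 3.03
    print(penalized_score(4.0, 128))  # a codec that stayed on target keeps its 4.0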

"Yay, vorbis wins all tests by allocating 256kbit continuous for all samples)

P.S. Just using Vorbis as an example; feel free to replace that word with <your hated codec name here>.

(Uh, for the record, since people are using the term 'bandwidth' kinda loosely here: I can easily deal with sending out 100Gb of data over a week. I cannot deal with sending out 1Gb/second smile.gif )

This post has been edited by puntloos: Jul 28 2003, 17:50
AstralStorm
post Jul 28 2003, 18:33
Post #146





Group: Members
Posts: 745
Joined: 22-April 03
From: /dev/null
Member No.: 6130



Sorry, but the test is frozen. JohnV will eat you alive! (either he or his tiger) tongue.gif

This post has been edited by AstralStorm: Jul 28 2003, 18:35


--------------------
ruxvilti'a
puntloos
post Jul 28 2003, 18:47
Post #147





Group: Members
Posts: 149
Joined: 20-July 03
Member No.: 7881



QUOTE (AstralStorm @ Jul 28 2003, 09:33 AM)
Sorry, but the test is frozen. JohnV will eat you alive! (either he or his tiger) tongue.gif

Ah, but I'm not asking to change the test smile.gif

I am just suggesting that it might be interesting to see whether the outcome of the test would differ in any meaningful way using the 'puntloos audio correction factors' biggrin.gif
guruboolez
post Jul 28 2003, 22:27
Post #148





Group: Members (Donating)
Posts: 3474
Joined: 7-November 01
From: Strasbourg (France)
Member No.: 420



QUOTE (puntloos @ Jul 28 2003, 05:43 PM)
Why not penalize 'naughty' codecs (say, Ogg using -q4 and generating a 190kbit file) by just multiplying their end score for a particular file by 128/190? Sounds reasonable to me, but maybe I'm overlooking something big here.

Why "penalize" ? A VBR codec had to put more bits on complex signals. The 12 samples are complex, difficult : it's nonsense to expect an average bitrate close to 128 kbps, and completely stupid to punish a VBR codec for doing correctly his job.

I'm happy to see (or hear) MPC putting ~700 kbps frames on complex signals like castanets. I was glad to use --preset standard, which introduces a lot of 320 kbps frames when needed and compensates for them on quiet/easy parts. Everyone here is using a great format or setting. Each is VBR: that means significant variation, but in the end an average bitrate that is about the same for most albums. Is there any reason to applaud their quality for listening purposes, yet blame or punish them for testing?
bond
post Jul 28 2003, 22:30
Post #149





Group: Members
Posts: 881
Joined: 11-October 02
Member No.: 3523



Am I right that the test still hasn't been announced on Slashdot?


--------------------
I know, that I know nothing (Socrates)
puntloos
post Jul 28 2003, 22:57
Post #150





Group: Members
Posts: 149
Joined: 20-July 03
Member No.: 7881



QUOTE (guruboolez @ Jul 28 2003, 01:27 PM)
QUOTE (puntloos @ Jul 28 2003, 05:43 PM)
Why not penalize 'naughty' codecs (say, Ogg using -q4 and generating a 190kbit file) by just multiplying their end score for a particular file by 128/190? Sounds reasonable to me, but maybe I'm overlooking something big here.

Why "penalize" ? A VBR codec had to put more bits on complex signals. The 12 samples are complex, difficult : it's nonsense to expect an average bitrate close to 128 kbps, and completely stupid to punish a VBR codec for doing correctly his job.

I'm happy to see (or hear) MPC putting ~700 kbps frames on complex signals like castanets. I was glad to use --preset standard, which introduces a lot of 320 kbps frames when needed and compensates for them on quiet/easy parts. Everyone here is using a great format or setting. Each is VBR: that means significant variation, but in the end an average bitrate that is about the same for most albums. Is there any reason to applaud their quality for listening purposes, yet blame or punish them for testing?

Oh, but as I said, I agree with a good VBR codec allocating HIGH amounts of bits to complex pieces; no problem there.

My dad taught me to always think in extremes when it comes to physics, so:

Suppose we have a piece with ONLY castanets, and MPC at (say) -q4 creates a 700kbps-average file. Would you consider it fair to compare that file to (say) Vorbis, which has encoded the same castanets file at 140kbps? Youpi! The MPC file sounds better!! dry.gif

My point, therefore, is that VBR is a very valid way to encode music, and I have no problem at all if some codec I use goes above the 'indicated bitrate' when it feels it needs to. But when comparing these results I think a certain penalty must be given to the 700kbps output file. I'm sure you agree that comparing the QUALITY of a file that averages 700kbps with one averaging 140kbps isn't fair and will give skewed results when you try to determine the 'best codec'. I'm not a mathematician, so I'm not sure if my way of unskewing these results (multiplying the 'score' of the 700kbps file by 128/700) is completely fair, but at least it would give results that more closely match 'fairness' in my mind.
