lossless codec testing, how do we know XYZ is lossless?

jcoalson (FLAC Developer), Nov 12 2006, 00:22 (Post #1)

it just occurred to me in another thread that all the comparisons of lossless codecs I've seen include only compression ratios and times. but people, even ones who are very particular about ripping, exactness, etc., seem to take for granted that their codec of choice is lossless. are there any large corpus tests to confirm this?

FLAC has a large test suite, but it is designed specifically to find problems with libFLAC. even though it has found problems with MAClib (monkey's audio) and flake I don't think it is suitable as a complete test for other codecs.

(note that the comparison should not rely on the codec's own internal test features like CRC and MD5 checking. for example, in the FLAC test suite the tests are 'round trip', i.e. they encode and decode, then compare the original and decoded files themselves with other tools.)
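
for illustration, a minimal round-trip check along those lines might look like this (just a sketch: it assumes the flac command-line tool is on the PATH, and the file names are placeholders):

CODE
import hashlib
import subprocess
import sys

def md5sum(path):
    """MD5 of a whole file, read in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# encode, then decode, with the codec under test
subprocess.run(["flac", "--silent", "--force", "-o", "test.flac", "original.wav"], check=True)
subprocess.run(["flac", "--silent", "--force", "-d", "-o", "roundtrip.wav", "test.flac"], check=True)

# ...but do the comparison with a tool independent of the codec itself
if md5sum("original.wav") != md5sum("roundtrip.wav"):
    sys.exit("round trip is NOT lossless")
print("round trip OK")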

Josh

graue, Nov 12 2006, 07:09 (Post #2)

I'm interested in what kinds of things are in the FLAC test suite. Where can I download it? I skimmed various parts of the FLAC website, but all I found was a FAQ answer saying the test suite is "published and comprehensive", with no link.

jcoalson (FLAC Developer), Nov 12 2006, 07:42 (Post #3)

it's part of the project's source code; if you download the source release or check out from CVS, it's in the test/ directory.

Josh

TBeck (TAK Developer), Nov 12 2006, 12:47 (Post #4)

We all know that it is impossible to prove that software of any complexity is error-free. So what can we do to feel a bit safer?

QUOTE (jcoalson @ Nov 12 2006, 01:22) *
FLAC has a large test suite, but it is designed specifically to find problems with libFLAC. even though it has found problems with MAClib (monkey's audio) and flake I don't think it is suitable as a complete test for other codecs.

I haven't looked at the FLAC test suite, but I assume it creates specific critical files based upon knowledge of the internal codec structure? For experienced developers it should be quite easy to define specific critical conditions for their codecs. Some of them may be useful for many other codecs too, for instance many fast changes of the signal characteristics (frequencies, amplitude), which require adaptations of the codec parameters (start a new sub-frame, calculate new predictors, calculate new bit-coder parameters, handle the transition between the states well). Most of the errors I found in TAK were caused by such transitions.

Other generally useful extreme conditions could be files with extreme amplitudes (nearly silent, or white noise at very high amplitude) or even synthetic sounds like a pure sine or square wave.
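
A sketch of how such synthetic test files could be generated (Python/NumPy, with SciPy doing the WAV writing; the file names and parameters here are only illustrative):

CODE
import numpy as np
from scipy.io import wavfile

RATE = 44100
rng = np.random.default_rng(1)
t = np.arange(RATE)  # one second of samples

signals = {
    # nearly silent: tiny values around zero
    "near_silence.wav": rng.integers(-2, 3, RATE).astype(np.int16),
    # white noise at very high amplitude, covering the full 16-bit range
    "loud_noise.wav": rng.integers(-32768, 32768, RATE).astype(np.int16),
    # pure 1 kHz sine near full scale
    "pure_sine.wav": (32000 * np.sin(2 * np.pi * 1000 * t / RATE)).astype(np.int16),
    # square wave: hard transitions every 50 samples
    "square.wav": np.where((t // 50) % 2 == 0, 32000, -32000).astype(np.int16),
}
# abrupt change of signal characteristics: half a second of sine, then noise,
# forcing the codec to adapt its parameters at the transition
signals["transition.wav"] = np.concatenate(
    [signals["pure_sine.wav"][: RATE // 2], signals["loud_noise.wav"][: RATE // 2]]
)

for name, data in signals.items():
    wavfile.write(name, RATE, data)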

I am quite sure that it would be possible to collect some generally critical files, which are likely to produce errors in different codecs.

On the other hand, there will be critical conditions specific to a particular codec which only its developer may know.

We could ask the codec developers for critical files and use them to create a test corpus.


QUOTE (jcoalson @ Nov 12 2006, 01:22) *
seem to take for granted that their codec of choice is lossless. are there any large corpus tests to confirm this?


Obviously a bigger test corpus is more likely to expose errors.

But to repeat myself: even the best and biggest test corpus cannot prove that a codec is error-free.

QUOTE (jcoalson @ Nov 12 2006, 01:22) *
(note that the comparison should not rely on the codec's own internal test features like CRC and MD5 checking. for example in the FLAC test suite, the tests are 'round trip', i.e. they encode and decode, then compare orignal and decoded files themselves with other tools.)


I agree that it is always better to use independent tests which do not have to rely on the test object itself. But if you want to know whether your specific files can be decoded correctly and you don't want to perform a full-blown external test (encode, decode, binary-compare with external tools), using the codec's own verify function seems to be the second-best option.

My experience: So far TAK's internal verify option (immediately decode each frame and compare it to the original wave data) has confirmed every codec error that had previously been detected by external comparisons. Here I see an advantage for asymmetric codecs: because of their usually very high decoding speed, such a verify function will not significantly affect encoding time.

And to speak against my own interests:

Use a codec with a big user base. If many people have tried it before without problems, you can be a bit more confident that it works well...

But please give the newcomers a chance too... smile.gif

This post has been edited by TBeck: Nov 12 2006, 12:48

jcoalson (FLAC Developer), Nov 14 2006, 22:47 (Post #5)

wow, only 4 replies and 2 are mine?!? shock1.gif

the FLAC suite includes some general streams, like "full-scale deflection" streams that bounce between the rails, and lots of tests on noise, which is unpredictable and can violate many of the assumptions usually made about the input. I think those are generally useful. some have crashed monkey's audio.

I don't think that relying on the tool's own verification system is enough. for example, the problem I found with flake would not have been caught even if it had a self-verification system like flac's (and, I assume, TAK's, which looks like flac's), because the input samples were corrupted before they even got to the encoder.

Josh

rjamorim (Rarewares admin), Nov 14 2006, 23:23 (Post #6)

Bryant has somewhat of a test suite for WavPack here:
http://www.rarewares.org/wavpack/test_suite.zip

He can probably give you more information.


I didn't reply because I don't understand squat about lossless compression other than bitching at David for "MOER FEETURZ!". But I do believe this sort of information would be a worthwhile addition to the Wiki comparison.

This post has been edited by rjamorim: Nov 14 2006, 23:29


AndyH-ha, Nov 15 2006, 04:01 (Post #7)

I suppose this inquiry is about whether or not it is possible to break some of these encoders, not whether or not they really work at all. It is easy to verify for any given audio file by comparing the original with an encoded/decoded version. If you can't find any that are not bit-identical, it seems like nothing to worry about unless you just like playing with theoretical questions.

goodnews, Nov 15 2006, 04:20 (Post #8)

QUOTE (AndyH-ha @ Nov 14 2006, 20:01) *
I suppose this inquiry is about whether or not it is possible to break some of these encoders, not whether or not they really work at all. It is easy to verify for any given audio file by comparing the original with an encoded/decoded version. If you can't find any of these that are not bit identical, it seems not something to worry about unless you just like playing with theoretical questions.

Josh Coalson (who started this thread) is the developer of the popular FLAC lossless audio codec/file format.

He is getting ready to release version 1.1.3 (or perhaps 1.2 or 2.0) of his FLAC encoder and decoder suite with *lots of new stuff*, and I'm sure that is why he is soliciting as many "test case files" as possible to stress-test the new version of the FLAC encoder and decoder. I'm glad to see that Josh hasn't rushed this out without thorough testing, as so many programs use/depend on the FLAC support code that Josh releases freely.

The last update to FLAC, version 1.1.2, was in early February 2005, so Josh seems cautious about releases to ensure their quality/stability (unlike some others, who seem to update their programs every few months). Keep up the good work Josh, and hopefully all this testing will produce a *great* new version of FLAC. I support changing it to version 1.2 or 2.0 and NOT 1.1.3, as this is not just a "minor point release" in my opinion. So many new features (love the album art support, BTW). Whatever you name it, it is likely to be a hit!

spoon (dBpowerAMP developer), Nov 15 2006, 09:43 (Post #9)


Wouldn't the best test be to feed the encoder random data at various bit depths and channel counts, and repeat in an automated fashion?


Acid8000, Nov 15 2006, 10:22 (Post #10)

QUOTE (spoon @ Nov 15 2006, 18:43) *
Wouldnt the best test be to feed the encoder with random data, bit depths / channels and repeat in an automated fashion?


Do you mean trying to encode white noise? Sure, the compression ratio would be very poor, but at least it could be a somewhat useful test, I think.


Synthetic Soul (Super Moderator), Nov 15 2006, 11:34 (Post #11)


QUOTE (jcoalson @ Nov 11 2006, 23:22) *
it just occurred to me in another thread that all the comparisons of lossless codecs I've seen include only compression ratios and times.
...
(note that the comparison should not rely on the codec's own internal test features like CRC and MD5 checking. for example in the FLAC test suite, the tests are 'round trip', i.e. they encode and decode, then compare orignal and decoded files themselves with other tools.)
For my TAK testing, my scripts compare the decoded wave with the source wave using FSUM to compare MD5 hashes. I realise that this will not work in all situations, e.g. when codecs remove RIFF chunks, but thankfully for those that retain them this is a quick and easy way to check correctness. Possibly not foolproof, but as good as I can do.

QUOTE (jcoalson @ Nov 14 2006, 21:47) *
wow, only 4 replies and 2 are mine?!? shock1.gif
I don't really know what to say. When I encode using WavPack I immediately verify using WVUNPACK -vm to check whether WavPack's internal routines can see a problem.

As far as more intensive and extensive tests go, I just wouldn't know where to start - this is where I must rely on the technical knowledge and responsibility of the developer, and my peers.

Perhaps all the lossless developers who frequent this board should submit samples that they know to be troublesome to a shared corpus, and then other users can test all samples on all participating codecs?

Aside from that, you are talking about mass or large-scale testing, which I personally just don't have the time or inclination for. There's enough paranoia on this board already without this!


smok3 (A/V Moderator), Nov 15 2006, 14:09 (Post #12)


actually, it would be nice if every lossless codec would report what happened to all the RIFF chunks (after encoding).


Klyith, Nov 17 2006, 00:20 (Post #13)



It should be possible to add a second mode to a lossless encoder such that after every block (or every n blocks) of audio is encoded, it immediately decodes the chunk back to test against the original data. The encoder would need to keep the original data buffered to be compatible with piped input, but that would also help the speed. Call it secure mode.

It would be slower to encode, but I think a lot faster than doing a verify or checksum after the fact. And I'm sure there are plenty of people on the HA board who wouldn't mind the speed tradeoff. The only obstacle is programmer time, and a new feature this big would need a lot of work.
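
In pseudocode the idea is simply this (hypothetical per-block encoder/decoder interfaces; a real implementation would hook in at the codec's frame level):

CODE
def encode_with_verify(blocks, encode_block, decode_block):
    """Encode each block and immediately decode it back to prove losslessness.

    encode_block and decode_block stand in for the real codec's per-block
    routines; blocks is an iterable of raw PCM buffers.
    """
    for original in blocks:
        compressed = encode_block(original)
        if decode_block(compressed) != original:
            raise RuntimeError("secure mode: round trip failed on a block")
        yield compressed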

jcoalson (FLAC Developer), Nov 17 2006, 01:42 (Post #14)


the -V option to flac does exactly that.

saratoga, Nov 17 2006, 02:44 (Post #15)


QUOTE (Acid8000 @ Nov 15 2006, 02:22) *
QUOTE (spoon @ Nov 15 2006, 18:43) *

Wouldnt the best test be to feed the encoder with random data, bit depths / channels and repeat in an automated fashion?


Do you mean trying to encode white noise? Sure, the compression ratio would be very poor, but at least it could be a somewhat useful test, I think.


This would actually make the most sense IMO, since you could test the encoder with 10s or even 100s of GB worth of data but still have the test program be only a few hundred KB, which would let everyone run the test, not just people with very fast internet. You could even make the "random" data deterministic so that everyone runs the same sequence of data every time, thus making the test perfectly repeatable.
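
For instance, something along these lines (a sketch only: the seed, block sizes, and raw output format are arbitrary choices, and you would pipe the result into the encoder under test):

CODE
import numpy as np

rng = np.random.default_rng(12345)  # fixed seed: every tester gets the same "random" data

with open("testdata.raw", "wb") as out:
    for channels in (1, 2):
        for bits in (8, 16, 24):
            # full-range random samples for this bit depth
            lo, hi = -(1 << (bits - 1)), 1 << (bits - 1)
            block = rng.integers(lo, hi, size=(65536, channels))
            # store as little-endian int32 frames for the encoder under test
            out.write(block.astype("<i4").tobytes())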

cabbagerat, Nov 17 2006, 07:05 (Post #16)


QUOTE (Acid8000 @ Nov 15 2006, 01:22) *
QUOTE (spoon @ Nov 15 2006, 18:43) *

Wouldnt the best test be to feed the encoder with random data, bit depths / channels and repeat in an automated fashion?


Do you mean trying to encode white noise? Sure, the compression ratio would be very poor, but at least it could be a somewhat useful test, I think.
I think a number of test samples consisting of random data would be useful. Tests with noise biased to both high and low frequencies would be interesting, especially high-frequency noise, as (I guess) this would defeat the prediction fairly effectively. Something like this (in MATLAB or Octave):
CODE
x = rand(1,1000)*2 - 1;                       % uniform white noise in [-1, 1]
b = fir2(32, [0 0.5 0.7 1], [0.2 0.2 0.9 1]); % FIR filter that attenuates low frequencies somewhat
y = filtfilt(b, 1, x);                        % zero-phase filtering of the noise

A full set of tests on random data would not prove that FLAC is correct, but it would be useful evidence. Tests could include white, pink and blue noise, as well as noise with an unusual amplitude distribution, like a Rayleigh distribution.
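
For the unusual-distribution case, for example (a Python/NumPy sketch this time; the scale and length are arbitrary):

CODE
import numpy as np

rng = np.random.default_rng(0)
mag = rng.rayleigh(scale=0.3, size=44100)      # Rayleigh-distributed magnitudes
sign = rng.choice([-1.0, 1.0], size=mag.size)  # random sign to centre the noise on zero
pcm = (np.clip(mag * sign, -1.0, 1.0) * 32767).astype(np.int16)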


jcoalson (FLAC Developer), Nov 17 2006, 08:43 (Post #17)




noise has its place in the tests for sure. I think a wide selection of 'normal' audio from a big corpus would also help; imagine a multiply-accumulate path for a filter that only overflows when excited by a certain kind of signal. flac will often switch to verbatim frames and not even hit the common datapaths for noise, which is why my tests also include non-noise samples.
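
a toy illustration of that failure mode, with made-up numbers (the point is just that a 32-bit accumulator can be fine on quiet input and silently wrap on loud input):

CODE
import numpy as np

coeffs = np.full(32, 3 << 12, dtype=np.int32)   # large fixed-point filter coefficients
quiet  = np.full(32, 100,     dtype=np.int32)   # typical low-level signal
loud   = np.full(32, 32767,   dtype=np.int32)   # full-scale signal

print((coeffs * quiet).sum(dtype=np.int32))     # fits in 32 bits: correct
print((coeffs * loud).sum(dtype=np.int32))      # exceeds 2**31 - 1: silently wraps negative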

actually, what motivated the topic in the first place was my original point that the widespread use of a codec could count as anecdotal evidence that it is lossless. the catch is the puzzling fact that people (who are normally particular about this) apparently are not checking that their codec is lossless, and the same seems true of the various comparisons. so the anecdotal record would miss non-lossless problems that are not audible.

Josh

greynol (Super Moderator), Nov 17 2006, 09:10 (Post #18)



FWIW, I verify each and every Monkey's Audio file that I encode by comparing an md5 checksum of the raw decoded data against one generated from the raw data used to create it.

I have yet to have a problem <knock on wood> but will be sure to report it here if I do.

This post has been edited by greynol: Nov 17 2006, 09:11


beto, Nov 17 2006, 15:04 (Post #19)



QUOTE (greynol @ Nov 17 2006, 06:10) *
FWIW, I verify each and every Monkey's Audio file that I encode by comparing an md5 checksum of the raw decoded data against one generated from the raw data used to create it.

I have yet to have a problem <knock on wood> but will be sure to report it here if I do.


And how do you do that? Which tools and what is the process you use? I'm interested in this but for WavPack; maybe you can give me some ideas. smile.gif


greynol (Super Moderator), Nov 17 2006, 19:17 (Post #20)




QUOTE (beto @ Nov 17 2006, 06:04) *
And how do you do that? Which tools and what is the process you use? I'm interested in this but for wavpack, maybe you can give me some ideas. smile.gif

With md5sum.exe (the original one, not the one from etree!) in combination with the rarewares version of MAC. I could not get the etree build of md5sum to pipe its output properly.

In the case of transcoding from flac, I use md5sum.exe in combination with SoX and the rarewares version of MAC, but SoX does not work right if the track does not end on a frame boundary, though I think of this as a feature rather than a flaw.

It would be simpler to just use shntool with the rarewares version of MAC, which should also have no trouble with tracks that don't end on frame boundaries, but this method is slightly slower.

http://www.hydrogenaudio.org/forums/index....st&p=447547
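
In Python, the same decode-and-hash idea looks roughly like this (the hashing part is real; the decoder command line at the bottom is a placeholder, since not every decoder can write its output to stdout):

CODE
import hashlib
import subprocess

def md5_of_stdout(cmd):
    """MD5 of whatever the given decoder command writes to stdout."""
    h = hashlib.md5()
    with subprocess.Popen(cmd, stdout=subprocess.PIPE) as proc:
        for chunk in iter(lambda: proc.stdout.read(1 << 20), b""):
            h.update(chunk)
        if proc.wait() != 0:
            raise RuntimeError("decoder exited with an error")
    return h.hexdigest()

# hypothetical usage: decode to stdout and compare against the source rip's hash
# assert md5_of_stdout(["decoder", "-d", "track.ape", "-"]) == known_md5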

PS: I'm still looking for a command-line CRC32 generator that handles piping and redirection, as I cannot get fsum to do this. This would allow me to check tracks against the CRC information from EAC, which is calculated from the raw PCM data.

This post has been edited by greynol: Nov 17 2006, 19:18


beto, Nov 17 2006, 19:49 (Post #21)




Thank you. That surely gave me some ideas. smile.gif


EuMesmo, Nov 19 2006, 23:03 (Post #22)




It's funny to hear the question from you, as you are essentially asking "Is lossless really without loss?"

A few years ago I searched this forum for this question and did not find the answer. I was looking for a lossless codec for archiving, and had my doubts (as you said, people are picky about these things). I devised the following test:

1- I ripped a track to aiff.
2- Encoded it to flac.
3- Encoded the aiff to codecX (sometimes after converting from aiff to wav).
4- Transcoded from codecX with DMC to flac.
5- Decoded from codecX to wav-2.
6- Encoded the wav-2 to flac.
7- Compared the md5 of the audio data in the flac files (see the sketch below).
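
For step 7 above, the md5 that flac stores for the raw audio data can be read with metaflac; something like this (file names are placeholders):

CODE
import subprocess

def flac_audio_md5(path):
    """Return the MD5 of the unencoded audio data stored in a FLAC file."""
    out = subprocess.run(
        ["metaflac", "--show-md5sum", path],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

# the direct rip and the codecX round trip should carry identical audio MD5s
assert flac_audio_md5("direct.flac") == flac_audio_md5("via_codecX.flac")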

I assumed flac was lossless, and it has an md5 to check just the audio data, which was what I was looking for. I got the same md5 from ape, shn, wv, and rkau. The data was different from ra lossless and an old incarnation of wma lossless (it was 9.0; 9.1 got the same md5). I recently did the same test with apple lossless and ogg-flac, and once toyed with this test on my burned audio cds.

I am assuming that flac is completely lossless; the loss could be in the encoding or in the decoding, or some part of the wav header could have been stored in the file (which wouldn't suit me anyway). I know there are other problems with this test, and that it has a lot of limitations. But I found it strange that the wiki didn't mention that ra lossless was not "completely lossless". It may have changed, however.

QUOTE (greynol @ Nov 17 2006, 05:10) *
FWIW, I verify each and every Monkey's Audio file that I encode by comparing an md5 checksum of the raw decoded data against one generated from the raw data used to create it.

I have yet to have a problem <knock on wood> but will be sure to report it here if I do.


This is what I was looking to do, but didn't know how to. I understood what you said, but I am not sure I can replicate it. Could you provide a bat file or an application to do it?

And after your statement I'll assume mac IS lossless.

bhoar, Nov 20 2006, 01:02 (Post #23)


From what I have read, there are several reasons a WAV file might contain different data yet still contain bit-identical PCM audio. WAV is a container format that can hold several types of data organized in several ways.

I'd be curious if you compared the two files to see what the difference is. If it is only in the first 36 bytes, then it is just a RIFF/WAVE header issue (e.g. there are two unused bytes in the header if the data is a typical WAV, perhaps they differ?) and the audio (the PCM data itself) has not changed.

If it is later in the file, then there might be an issue with the lossless codec possibly not really being lossless, and that should be investigated. Alternatively, there might be a different way of chunking that still gives the same audio data, which would still be lossless.
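
A quick way to locate the first difference between the two files (a sketch; file names are placeholders, and a difference confined to the header region means the PCM data itself survived):

CODE
def first_difference(path_a, path_b):
    """Byte offset of the first difference between two files, or None if identical."""
    with open(path_a, "rb") as a, open(path_b, "rb") as b:
        offset = 0
        while True:
            ca, cb = a.read(4096), b.read(4096)
            if ca != cb:
                common = min(len(ca), len(cb))
                for i in range(common):
                    if ca[i] != cb[i]:
                        return offset + i
                return offset + common  # one file is longer than the other
            if not ca:  # both at EOF with no differences found
                return None
            offset += len(ca)

print(first_difference("original.wav", "decoded.wav"))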

Also, double-check the conversion chain to ensure you aren't inadvertently turning on sample size/rate conversion of any type.

-brendan


Jan S. (Admin), Nov 20 2006, 20:39 (Post #24)



I discussed some of the problems last night with Roberto and we came up with some viewpoints on this.
There seem to be two theoretical ways to go about it:

1. You mathematically analyse the algorithms and work out whether they will be lossless for all possible samples.
The problem with this, however, is that the encoder/decoder is not a closed system, so you cannot possibly account for external variables like the FPU and CPU. So even though your algorithms might be perfect, you cannot be sure your output will be. Hence this type of test is pointless if the goal is absolute perfection.

2. You run as much data through the codec as possible to establish a decent confidence level (if you run random, non-repeated blocks through the codec, couldn't this confidence actually be calculated? see the sketch below).
This should be quite an easy task if the author provides a way to do it automatically.
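
As for the calculation in point 2, it can at least be bounded with the classic "rule of three" (the block count below is illustrative):

CODE
import math

# if each independent random block fails with probability p, then N clean
# blocks occur with probability (1 - p)**N; demanding that this be under 5%
# for the true p gives the 95% confidence bound p < -ln(0.05)/N, roughly 3/N
N = 1_000_000                  # blocks tested without a single failure
p_bound = -math.log(0.05) / N
print(f"after {N} clean blocks: per-block failure rate < {p_bound:.2e} (95% confidence)")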

Then again, isn't all of this a non-issue if people just use the -V switch?

greynol (Super Moderator), Nov 20 2006, 20:57 (Post #25)



QUOTE (Jan S. @ Nov 20 2006, 11:39) *
Then again isn't all of this a non-issue if people just use the -V switch?
Certainly for flac, but what about other codecs?

