Lossless Compression Test, FLAC, WP, MA, TAKC, TTA, LA, OF, SHN, WMA
Nessuno
post Oct 5 2012, 12:32
Post #26





Group: Members
Posts: 422
Joined: 16-December 10
From: Palermo
Member No.: 86562



QUOTE (Porcus @ Oct 5 2012, 09:52) *
I suppose that in practice, users just want their music collection to fit on your old 2TB drive rather than having to buy a new 3TB drive, right?

Let's not forget the extra one for the pot... ahem, for the backup: I always count the space twice (at least) when I think about my laboriously ripped lossless collection, and cloud storage services still aren't a feasible alternative to doubling up on physical disks.

QUOTE
(By the way, I use a 2nd lossless to distinguish those CDs which came with pre-emphasis. As I don't convert them, merely tag them (and decode on-the-fly), I want to be insured against accidentally deleting tags.)

A little obsessive on the matter? I'm getting that way too... wink.gif wink.gif


--------------------
... I live by long distance.
eahm
post Oct 5 2012, 16:14
Post #27





Group: Members
Posts: 1028
Joined: 11-February 12
Member No.: 97076



QUOTE (probedb @ Oct 4 2012, 03:08) *
I'd agree with skamp. There's a saving of 10GB between FLAC -8 and LA-HIGH which is nothing. I think it comes down to whether you're using older hardware or not as encoding speed even at these levels is a fraction of ripping time too.

I'll stick with FLAC smile.gif

Actually the difference is 9 GB between LA -HIGH and FLAC -5, and 8.33 GB between LA -HIGH and FLAC -8. Considering the speed difference between -8 and -5, I'll stick with -5.

Apple did a good thing with ALAC: it didn't create confusion. They compared different algorithms, speeds and codecs and left only one for the user. I am sure whoever uses ALAC got over the headache of choosing which "compression" to use, just to save a few MBs, on the first day.
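For what it's worth, the two gaps quoted above already pin down how little FLAC -8 gains over FLAC -5. A quick sanity check (assuming both differences are measured against the same LA -HIGH result):

```python
# Differences quoted above, both measured against the same LA -HIGH size.
la_vs_flac5 = 9.00   # GB saved by LA -HIGH over FLAC -5
la_vs_flac8 = 8.33   # GB saved by LA -HIGH over FLAC -8

# FLAC -8's advantage over FLAC -5 is just the difference of the two gaps.
flac5_vs_flac8 = la_vs_flac5 - la_vs_flac8
print(f"FLAC -8 saves only {flac5_vs_flac8:.2f} GB over FLAC -5")
```

Two thirds of a gigabyte across the whole corpus, which supports sticking with -5 if -8's encode speed bothers you.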

This post has been edited by eahm: Oct 5 2012, 16:19
Porcus
post Oct 5 2012, 17:25
Post #28





Group: Members
Posts: 1842
Joined: 30-November 06
Member No.: 38207



QUOTE (Destroid @ Oct 5 2012, 13:05) *
QUOTE
I suppose that in practice, users just want their music collection to fit on your old 2TB drive rather than having to buy a new 3TB drive, right? (And all the other features of a modern lossless, like tagging capabilities.)

Yikes! A compression increase of 33% above the average of these codecs' results?


Precisely. There's a “what's your compression ratio?” thread at http://www.hydrogenaudio.org/forums/index....showtopic=97125 . My FLACs clock in at slightly above 900.
While space certainly was more of an issue back when you had to build a computer with enough room and a mobo for the drives (anyone need an IDE card or two?), it kinda still is: even if I were to buy drives now, audio compression would save me a couple of hundred dollars, plus the video part of the DVDs.

This post has been edited by Porcus: Oct 5 2012, 17:31


--------------------
One day in the Year of the Fox came a time remembered well
_mē_
post Oct 5 2012, 18:19
Post #29





Group: Members
Posts: 231
Joined: 6-April 09
Member No.: 68706



QUOTE (Dario @ Oct 4 2012, 21:50) *
QUOTE (_mē_ @ Oct 4 2012, 21:44) *
WavPack -hhx6

Out of curiosity, what encoding speed do you make of -hhx6?

Frankly, I don't know, it's so fast that I don't care.
Though it should be noted that I used strong ofr settings as my default before.

This post has been edited by _mē_: Oct 5 2012, 18:19
yourlord
post Oct 5 2012, 21:18
Post #30





Group: Members
Posts: 196
Joined: 1-March 11
Member No.: 88621



QUOTE (eahm @ Oct 5 2012, 11:14) *
Considering the speed between -8 and -5 I'll stick with -5.


I actually use flac -8 because even at that setting, encoding is still so much faster than my CD drive can read that it doesn't matter. Also, encoding is something I only do one time, so even IF it made a difference (within reason) I'd still use -8. Decoding speed and efficiency are the big issues.
A_Man_Eating_Duc...
post Oct 5 2012, 21:41
Post #31





Group: Members
Posts: 925
Joined: 21-December 01
From: New Zealand
Member No.: 705



The TAK -p4m run is done.
CODE
Codec          GB        Compression
TAKC -P4M    189.49        64.59%

Now on to OptimFrog and WavPack.
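As a sanity check on the table above (assuming the percentage is compressed size over original size), the original uncompressed corpus size can be recovered from the TAK figures:

```python
compressed_gb = 189.49   # TAK -p4m output, from the table above
ratio = 0.6459           # 64.59% compression (compressed / original, assumed)

original_gb = compressed_gb / ratio
print(f"implied original corpus: {original_gb:.1f} GB")  # roughly 293 GB
```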


--------------------
Who are you and how did you get in here ?
I'm a locksmith, I'm a locksmith.
foomark
post Oct 6 2012, 11:39
Post #32





Group: Members
Posts: 87
Joined: 12-July 11
Member No.: 92213



May I ask the correct parameters to use for the .la high conversion?
jensend
post Oct 6 2012, 15:14
Post #33





Group: Members
Posts: 143
Joined: 21-May 05
Member No.: 22191



Lots of people around here seem to be interested in turning on all the slowest brute-force options. But you're not using anywhere near enough brute force here! Clearly, since you aren't looking at encode or decode time, the right thing to do is run a program that, using an enumeration of all Turing machines, simulates more and more of them for increasing periods of time, checking whether the TM halted with the entirety of your music collection in its output. The compressed format of your music is then just a description of the shortest TM you discover that does that.*

Since a significant amount of data in your lossless files is below the instantaneous noise floor and therefore effectively random, doing this may still not give you tremendous savings over FLAC. (If FLAC compresses to 50% and 1/4 of the bits of the original are effectively random noise, it's entirely impossible for any lossless compressor, no matter how miraculous, to beat FLAC by a factor of 2; this is kinda like Amdahl's Law, with "incompressible part" taking the place of "nonparallelizable part.")
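The bound in that parenthetical is easy to make concrete. A minimal sketch of the arithmetic, assuming the stated figures (FLAC at 50%, one quarter of the bits effectively random):

```python
flac_ratio = 0.50    # FLAC output size as a fraction of the original
random_frac = 0.25   # fraction of bits that are effectively random noise

# No lossless codec can shrink the random bits at all, so the best
# conceivable output is the random part plus a fully-compressed rest:
best_possible = random_frac  # the compressible 75% shrinks toward zero

# Beating FLAC by a factor of 2 would require an output below 0.25,
# which even a miraculous codec cannot reach.
print(best_possible < flac_ratio / 2)  # False
```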

Also, of course, decode time might be rather long and encode time is likely to be many times longer than the lifespan of the universe. But hey, you saved a few gigs! Surely that's worth it!

*See wikipedia's article on Kolmogorov complexity. Note that even then you are unlikely to be able to guarantee you've come up with the shortest description, since some shorter TMs which seem like they're not going to halt could eventually output your music collection and halt. To prove you've got the shortest you would have to prove that all shorter programs either halt without reproducing your music collection or never halt, and the halting problem is in general uncomputable.
C.R.Helmrich
post Oct 6 2012, 16:09
Post #34





Group: Developer
Posts: 686
Joined: 6-December 08
From: Erlangen Germany
Member No.: 64012



QUOTE (jensend @ Oct 6 2012, 16:14) *
Lots of people around here seem to be interested in turning on all the slowest brute-force options. But you're not using anywhere near enough brute force here! Clearly, since you aren't looking at encode or decode time ...

blink.gif I see only one contributor to this thread to whom your above statements apply. I, for example, don't consider FLAC -8 a brute-force option - if it were, it would try to find e.g. the optimal LPC order via brute-force... which it doesn't. And I completely agree with yourlord.

Chris


--------------------
If I don't reply to your reply, it means I agree with you.
Porcus
post Oct 6 2012, 19:01
Post #35





Group: Members
Posts: 1842
Joined: 30-November 06
Member No.: 38207



QUOTE (jensend @ Oct 6 2012, 16:14) *
But you're not using anywhere near enough brute force here!


Obviously not anywhere near in terms of results either. If we were, we could decode a 48 kb/s lossy file to PCM, and then use the lossless codec to recompress it to 48 kb/s or less.


--------------------
One day in the Year of the Fox came a time remembered well
_mē_
post Oct 6 2012, 19:54
Post #36





Group: Members
Posts: 231
Joined: 6-April 09
Member No.: 68706



QUOTE (jensend @ Oct 6 2012, 16:14) *
Since a significant amount of data in your lossless files is below the instantaneous noise floor and therefore effectively random, doing this may still not give you tremendous savings over FLAC. (If FLAC compresses to 50% and 1/4 of the bits of the original are effectively random noise, it's entirely impossible for any lossless compressor, no matter how miraculous, to beat FLAC by a factor of 2; this is kinda like Amdahl's Law, with "incompressible part" taking the place of "nonparallelizable part.")

How do you measure the noise floor?
Because I suspect you don't - and if you don't, the quote above is rubbish.

QUOTE (jensend @ Oct 6 2012, 16:14) *
Lots of people around here seem to be interested in turning on all the slowest brute-force options.

I partially agree with this... partially, because testing codec speed is something that anybody can do easily. It's imperfect, but much faster than large-scale tests.

2 A_Man_Eating_Duck:
Could you please update the first post?
Not necessarily after each new codec tested, but I guess many people won't look further than that.

This post has been edited by _mē_: Oct 6 2012, 19:59
Porcus
post Oct 6 2012, 22:02
Post #37





Group: Members
Posts: 1842
Joined: 30-November 06
Member No.: 38207



QUOTE (_mē_ @ Oct 6 2012, 20:54) *
2 A_Man_Eating_Duck:
Could you please update the first post?


After an hour, only moderators can edit. So I guess it can wait for the other updates.


--------------------
One day in the Year of the Fox came a time remembered well
A_Man_Eating_Duc...
post Oct 7 2012, 00:10
Post #38





Group: Members
Posts: 925
Joined: 21-December 01
From: New Zealand
Member No.: 705



QUOTE (foomark @ Oct 6 2012, 22:39) *
May i ask you the correct parameters to use the .la high conversion??
i used this
CODE
LA.exe -high "%inputfile%" "%outputfile%"



QUOTE (_mē_ @ Oct 7 2012, 06:54) *
2 A_Man_Eating_Duck:
Could you please update the first post?
Not necessarily after each new codec tested, but I guess many people won't look farther than that.
I'm going to wait for all the other encodes to finish, make some new graphs and ask a mod to update the original post.

This post has been edited by A_Man_Eating_Duck: Oct 7 2012, 00:15


--------------------
Who are you and how did you get in here ?
I'm a locksmith, I'm a locksmith.
06_taro
post Oct 7 2012, 02:20
Post #39





Group: Members
Posts: 12
Joined: 22-September 10
Member No.: 84054



Personally I don't care too much if encoding time gets longer for insane settings, as long as the speed is still acceptable, like flac's -8 or takc's -p4m. But if it gets too crazy, like WavPack's -hh -x6, or if decoding speed drops to a too-costly rate, like falling from 15x to 3x, I would avoid those settings.
foomark
post Oct 7 2012, 09:36
Post #40





Group: Members
Posts: 87
Joined: 12-July 11
Member No.: 92213



QUOTE (A_Man_Eating_Duck @ Oct 7 2012, 01:10) *
QUOTE (foomark @ Oct 6 2012, 22:39) *
May i ask you the correct parameters to use the .la high conversion??
i used this
CODE
LA.exe -high "%inputfile%" "%outputfile%"



This works if I convert a file from the command line (cmd.exe); I was looking for the foobar2000 version of the "code" wink.gif



PS: I just realized I can't even play .la files in my foobar2000!!!
Where can I get the right .la input component? smile.gif

This post has been edited by foomark: Oct 7 2012, 09:38
Rollin
post Oct 7 2012, 10:14
Post #41





Group: Members
Posts: 189
Joined: 5-March 08
Member No.: 51815



QUOTE (foomark @ Oct 7 2012, 12:36) *
i was looking to the foobar2000 version of the "code" wink.gif
PS: i just realized i can't even play .la files in my foobar2000!!!
Where should i take the right input .la component? smile.gif



Foobar decoder: http://www.mediafire.com/download.php?f2bk6qitzu47qbd
But it is buggy
foomark
post Oct 7 2012, 10:30
Post #42





Group: Members
Posts: 87
Joined: 12-July 11
Member No.: 92213



QUOTE (Rollin @ Oct 7 2012, 11:14) *
QUOTE (foomark @ Oct 7 2012, 12:36) *
i was looking to the foobar2000 version of the "code" wink.gif
PS: i just realized i can't even play .la files in my foobar2000!!!
Where should i take the right input .la component? smile.gif



Foobar decoder: http://www.mediafire.com/download.php?f2bk6qitzu47qbd
But it is buggy


Thank you!! The conversion works well!!
Playing back a .la file, however, made my foobar2000 crash. Really buggy!
2012
post Oct 7 2012, 15:36
Post #43





Group: Members
Posts: 64
Joined: 7-February 12
Member No.: 96993



On the topic of insane settings, I found flake -12 (r264, SVN version) to be the best CPU-based FLAC encoder I've come across, compression-wise.

I wouldn't expect the size reduction to be relevant compared to -8. But in such a big collection, and with very small differences between various codecs, who knows?

This post has been edited by 2012: Oct 7 2012, 15:38
IgorC
post Oct 7 2012, 20:09
Post #44





Group: Members
Posts: 1532
Joined: 3-January 05
From: ARG/RUS
Member No.: 18803



Interesting comparison. smile.gif

During 2003 the prices of DVD drives and discs went down, and people moved from Monkey's Audio to FLAC. Before that, people cared about every MB, when a CD held only 700 MB.
http://imageshack.us/photo/my-images/206/ha1fg1.png/

Since 2003, FLAC has been the most popular lossless format.

2009 ripping/encoding general poll
2012 ripping/encoding general poll

Fast decoding speed is appreciated much more than 4-5% better compression.

This post has been edited by IgorC: Oct 7 2012, 20:17
A_Man_Eating_Duc...
post Oct 7 2012, 20:46
Post #45





Group: Members
Posts: 925
Joined: 21-December 01
From: New Zealand
Member No.: 705



QUOTE (2012 @ Oct 8 2012, 02:36) *
On the topic of insane settings, I found flake -12 (r264 ,SVN version) to be the best CPU-based flac encoder I came across compression wise.

I wouldn't expect the size reduction to be relevant compared to -8. But in such a big collection, and with very small differences between various codecs, who knows?
Damn you, I forgot about Flake; I'll add -9 and up to the results as well smile.gif

This post has been edited by A_Man_Eating_Duck: Oct 7 2012, 20:46


--------------------
Who are you and how did you get in here ?
I'm a locksmith, I'm a locksmith.
lvqcl
post Oct 7 2012, 20:52
Post #46





Group: Developer
Posts: 3325
Joined: 2-December 07
Member No.: 49183



http://www.cuetools.net/wiki/CUETools_FLAC...ders_comparison
QUOTE
libFlake and FlaCuda are tuned differently, so libFlake -5 might in fact compress better than libFLAC -8. They also support additional compression levels 9-11, however their use is not recommended, because those levels produce so called non-subset files, which might not be supported by certain e.g. hardware implementations.
A_Man_Eating_Duc...
post Oct 7 2012, 20:59
Post #47





Group: Members
Posts: 925
Joined: 21-December 01
From: New Zealand
Member No.: 705



Double damn, ok I'll put all of them in for flake smile.gif


--------------------
Who are you and how did you get in here ?
I'm a locksmith, I'm a locksmith.
IgorC
post Oct 7 2012, 21:27
Post #48





Group: Members
Posts: 1532
Joined: 3-January 05
From: ARG/RUS
Member No.: 18803



FLAC also has some additional parameters that help a little bit.

Like: -8 -A tukey(0.5) -A flattop
Like here http://www.synthetic-soul.co.uk/comparison...sion&Desc=0

Maybe compression vs. decoding speed is no less interesting (if not more so) than a pure compression comparison.
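For anyone wanting to try those apodization settings, a minimal sketch that assembles the command line (the input and output file names are placeholders; the flags themselves are the ones quoted above):

```python
import shlex

# flac's -A flag selects an apodization window for LPC analysis and
# may be given more than once, as in the example quoted above.
# File names here are hypothetical placeholders.
cmd = ["flac", "-8", "-A", "tukey(0.5)", "-A", "flattop",
       "-o", "output.flac", "input.wav"]
print(shlex.join(cmd))  # the command line you would actually run
```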
Destroid
post Oct 8 2012, 01:59
Post #49





Group: Members
Posts: 545
Joined: 4-June 02
Member No.: 2220



QUOTE (IgorC @ Oct 7 2012, 20:09) *
During 2003 the prices of DVD drives and discs have gone down. People have moved to FLAC from Monkey Audio.
...
Fast decoding speed is appreciated much more than 4-5% of better compression.

Just wanted to mention that in my experience with lossless files on disc, usually the max read speed of the CD/DVD drive is the limiting factor. To clarify: when converting files made with TAK -p4m from CD/DVD to LAME MP3, the CPU is not hitting 100%.

I agree that I would spend the extra time on the encode side for some savings, as long as decode speed is not terribly affected, but I doubt I would notice any decoding-speed difference between TAK -p4m and FLAC -0 with files on a disc.


--------------------
"Something bothering you, Mister Spock?"
Polar
post Oct 8 2012, 16:23
Post #50





Group: Members
Posts: 266
Joined: 12-February 04
Member No.: 11970



QUOTE (A_Man_Eating_Duck @ Oct 7 2012) *
QUOTE (foomark @ Oct 6 2012) *
May i ask you the correct parameters to use the .la high conversion??
i used this
CODE
LA.exe -high "%inputfile%" "%outputfile%"
Although if you're into purism, -high -noseek will squeeze out another few bytes by sacrificing seekability. For .la files that's worth doing, since the format, if anything, is suited to archiving, without the immediate need for playability (i.e. fast decoding). The latter is what you'd use FLAC, TAK, ALAC or WavPack for.
QUOTE (La 0.4b documentation)
Command-line
la [flags] input-filename(s) [output-filename(s)]
Flags
-high - high compression mode - slower, but better compression
(snip)
-noseek - disable seeking (improves compression slightly)


This post has been edited by Polar: Oct 8 2012, 16:33
