> Hydrogenaudio Forum Rules

What's your lossless codec of choice?
Apple Lossless [ 36 ] (5.03%)
FLAC [ 377 ] (52.73%)
La [ 4 ] (0.56%)
Monkey's Audio [ 130 ] (18.18%)
OptimFROG [ 7 ] (0.98%)
Shorten [ 0 ] (0.00%)
TTA [ 8 ] (1.12%)
WavPack [ 106 ] (14.83%)
WMA Lossless [ 14 ] (1.96%)
other (please specify)/I'm not into lossless at all [ 33 ] (4.62%)
Total Votes: 950
  
Polar
post Aug 13 2004, 15:49
Post #51





Group: Members
Posts: 266
Joined: 12-February 04
Member No.: 11970



QUOTE (jcoalson @ Aug 10 2004, 15:43 UTC)
QUOTE (Polar @ Aug 10 2004, 10:15 UTC)
With FLAC retaining the top rank and even reinforcing its lead in popularity and use over the other codecs, getting over half of the votes in both polls and some 2/3 of all posts in this particular forum, perhaps the time has come to consider creating a separate FLAC subforum, along with or next to this one...
that would help me keep up on the FLAC topics for sure! but it's cool either way, I do the occasional shotgun search to try and keep up.
*
Replied to that here.

QUOTE (jcoalson @ Aug 10 2004, 15:43 UTC)
it would be nice in the next poll to have 'other' and 'don't use lossless' split so we could normalize the percentages against actual lossless users, or better yet have no 'don't use' entry at all since that could be covered better by a different poll.
*
You're right. There hasn't been much worthy of reading that's come out of that option. Next time, it could be interesting to include WavPack hybrid and OptimFROG DualStream. But then we'd have to leave something out, since there's still that 10-answer maximum.
rjamorim
post Aug 13 2004, 16:12
Post #52


Rarewares admin


Group: Members
Posts: 7515
Joined: 30-September 01
From: Brazil
Member No.: 81



QUOTE (Polar @ Aug 13 2004, 04:18 AM)
I object. I've never hinted at that anywhere. Besides, I'm convinced that any prediction, in matters where so many factors are in play, is entirely senseless.
*


You didn't hint. But you said Apple Lossless is an "obvious winner" because it took the "No 4 spot with 5%, coming from scratch in less than 4 months".

Well, you said it's an obvious winner, and you said it's rising fast. I'd say the interpretation I made of what you wrote is quite plausible.


And I still say you are reading too much into this poll.


--------------------
Get up-to-date binaries of Lame, AAC, Vorbis and much more at RareWares:
http://www.rarewares.org
RockFan
post Aug 13 2004, 16:15
Post #53





Group: Members
Posts: 292
Joined: 20-March 04
Member No.: 12866



I use (and voted for) Monkey's, for the simple reason that it's built into Plextools as a ripping option.

I use Plextools because it allows control of drive-speed - I have no need to rip entire CDs in 2 or 3 minutes so I use a lower than max CAV (Constant Angular Velocity) setting of 7-16x.

The theory is that this lower speed is easier on the drive, and less error-prone.

Someone mentioned in an earlier post that they'd had problems with ASIO output using APE - gapless not working properly. I haven't experienced this with Foobar. However, I use kernel-streaming now anyway (by far the best for SPDIF output in my experience).

Rainer.

(edit for typos)

This post has been edited by RockFan: Aug 13 2004, 16:17
Omion
post Aug 13 2004, 19:06
Post #54





Group: Developer
Posts: 432
Joined: 22-February 04
From: San Diego, CA
Member No.: 12180



QUOTE (Polar @ Aug 13 2004, 01:13 AM)
Edit:
On second thought, the fact that, especially over a 12-run decoding average, your -8 and -2 encodings gave such deviant results (well, one's gotta argue about something, right? wink.gif), might be attributed to the limited 8-song base. Hans van der Heijden FLACed some 80 songs for his test, and each of the 8 compression levels he tested (including -8 and -2, but no --super-secret and -0) decoded at an average 51x real-time speed on his 900 MHz Athlon:

*

I don't think the -8 was really a 'deviant' result. I think the results (other than -2) indicate that decoding speed is related to the --max-lpc-order switch (also called -l). My results sort of show five decoding speed "zones": 0,1; 3; 4,5,6,7; 8; SS (again, forgetting about -2). These zones correspond directly to where the -l switch changes.
-0= -l 0 -b 1152 -r 2,2
-1= -l 0 -b 1152 -M -r 2,2
-2= -l 0 -b 1152 -m -r 3
-3= -l 6 -b 4608 -r 3,3
-4= -l 8 -b 4608 -M -r 3,3
-5= -l 8 -b 4608 -m -r 3,3
-6= -l 8 -b 4608 -m -r 4
-7= -l 8 -b 4608 -m -e -r 6
-8= -l 12 -b 4608 -m -e -r 6
-SS= --lax -P 4096 -b 4608 -m -l 32 -e -E -p -q 0 -r 0,16

I'm pretty sure Josh Coalson said something about what decoding speed depends on, but I can't find it now.

Keep in mind, though, that the difference in decoding speed from -0 to -8 is only 6X, or 10% slower for -8. I don't think there's any harm in encoding with -8, as it's still a lot faster than any other popular lossless codec (according to the chart you posted).
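As an illustration (just a sketch, with the preset table above transcribed into a Python dict), grouping the presets by their -l value reproduces those five zones. Note that this grouping puts -2 in the same zone as -0 and -1, which is exactly why its measured speed stood out:

```python
from collections import defaultdict

# FLAC preset -> max LPC order (-l), transcribed from the table above
max_lpc_order = {
    "-0": 0, "-1": 0, "-2": 0,
    "-3": 6,
    "-4": 8, "-5": 8, "-6": 8, "-7": 8,
    "-8": 12,
    "-SS": 32,
}

# Group presets that share an -l value; decode speed should track these groups
zones = defaultdict(list)
for preset, order in max_lpc_order.items():
    zones[order].append(preset)

for order in sorted(zones):
    print(order, zones[order])
```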


--------------------
"We demand rigidly defined areas of doubt and uncertainty!" - Vroomfondel, H2G2
jcoalson
post Aug 13 2004, 21:21
Post #55


FLAC Developer


Group: Developer
Posts: 1526
Joined: 27-February 02
Member No.: 1408



that is correct. all input being equal, the only significant variable in decode time is the LPC order, and when LPC was not used, the polynomial order (which is not a controllable parameter of the reference encoder).

but still this variability is small. I think these kinds of graphs should always be accompanied by a version that shows the axes at full scale, to put things in the proper perspective. you will find the points much closer together.

Josh
Omion
post Aug 13 2004, 21:38
Post #56





Group: Developer
Posts: 432
Joined: 22-February 04
From: San Diego, CA
Member No.: 12180



QUOTE (jcoalson @ Aug 13 2004, 01:21 PM)
but still this variability is small. I think these kinds of graphs should always be accompanied by a version that shows the axes at full scale, to put things in the proper perspective. you will find the points much closer together.
*

Here's a full version. The horizontal axis goes to 1 (=orig file size), and the crosshairs are on flac -0.

They're all about the same, except --ss.

Just to clarify, I'm not trying to convince anybody that the decoding speed changes significantly, only that it does change.


--------------------
"We demand rigidly defined areas of doubt and uncertainty!" - Vroomfondel, H2G2
Polar
post Aug 14 2004, 10:21
Post #57





Group: Members
Posts: 266
Joined: 12-February 04
Member No.: 11970



QUOTE (Omion @ Aug 13 2004, 18:06 UTC)
I don't think the -8 was really a 'deviant' result. I think the results (other than -2) indicate that decoding speed is related to the --max-lpc-order switch (also called -l). My results sort of show five decoding speed "zones": 0,1; 3; 4,5,6,7; 8; SS (again, forgetting about -2). These zones correspond directly to where the -l switch changes.
*
QUOTE (jcoalson @ Aug 13 2004, 20:21 UTC)
that is correct.  all input being equal, the only significant variable in decode time is the LPC order, and when LPC was not used, the polynomial order (which is not a controllable parameter of the reference encoder).
*
Very interesting, Omion and Josh. Thanks for that.
Polar
post Aug 14 2004, 23:06
Post #58





Group: Members
Posts: 266
Joined: 12-February 04
Member No.: 11970



Still curious to know, Josh: any idea as to why Omion's encodings at level -2 decoded on average slightly faster than -1 and -0?
Omion
post Aug 14 2004, 23:31
Post #59





Group: Developer
Posts: 432
Joined: 22-February 04
From: San Diego, CA
Member No.: 12180



I'd like to know, too.

BTW, I'm doing another FLAC test right now on a bunch of my audio (probably ~20 albums, or until I get bored), but I don't have enough space to do them all at once so it could take a while. We'll see if the -2 oddity remains...


--------------------
"We demand rigidly defined areas of doubt and uncertainty!" - Vroomfondel, H2G2
Polar
post Aug 14 2004, 23:41
Post #60





Group: Members
Posts: 266
Joined: 12-February 04
Member No.: 11970



Are you taking hard disk fragmentation into account, like Hans van der Heijden mentioned on his comparison site?
QUOTE (http://web.inter.nl.net/users/hvdh/lossless/lossless.htm)
To have the harddisk's performance as constant as possible, I kept it defragged before running each batchfile, and erased the generated lossless- and wav files afterwards, so the files got placed in about the same place on the harddisk.

Apart from that: w00t.gif Kewl!
Keep us in touch!
Omion
post Aug 15 2004, 02:02
Post #61





Group: Developer
Posts: 432
Joined: 22-February 04
From: San Diego, CA
Member No.: 12180



QUOTE (Polar @ Aug 14 2004, 03:41 PM)
Are you taking hard disk fragmentation into account, like Hans van der Heijden mentioned on his comparison site?
*

I actually just thought of that after I posted. I realized that I didn't do that for the last test, which could account for the discrepancy. I encoded all the samples at a particular level at one time, so all the -2 files could have been fragmented more than the others.

Fear not, however! The test that I'm doing right now (just started, really) will have a freshly defragged drive at the beginning of each session.

[edit] And I just figured out how to do confidence intervals in Mathematica! Rejoice! w00t.gif

[edit2] Hmm. Weird things are happening with the decoding. I did another 12 tests on an album (~45 minutes) and got the following results:

x is test number, y is decompression speed. The graph on the top is -0, bottom is -ss, the others are in between. All compression settings except for -0, -1, -2 seem to have two 'states.' They start off in the slower state, then eventually get to the higher one. blink.gif

[edit3] Well, that problem disappeared as quickly as it came. The test is underway...
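For anyone without Mathematica, a 95% confidence interval for the mean of 12 decode runs can be sketched in plain Python (the t critical value for 11 degrees of freedom is hardcoded here rather than pulled from a stats library; the speed numbers are hypothetical):

```python
import math

def mean_ci(samples, t_crit=2.201):
    """95% CI for the mean; t_crit is t(0.975, 11 df), i.e. for n = 12 runs."""
    n = len(samples)
    m = sum(samples) / n
    # sample variance (n - 1 in the denominator)
    var = sum((x - m) ** 2 for x in samples) / (n - 1)
    half = t_crit * math.sqrt(var / n)
    return m - half, m + half

# e.g. twelve decode-speed measurements (made-up numbers, in x-realtime)
speeds = [61.0, 60.5, 61.2, 60.8, 61.1, 60.9, 61.3, 60.7, 61.0, 60.6, 61.2, 60.9]
lo, hi = mean_ci(speeds)
```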

This post has been edited by Omion: Aug 16 2004, 07:43


--------------------
"We demand rigidly defined areas of doubt and uncertainty!" - Vroomfondel, H2G2
prankstare
post Aug 15 2004, 12:58
Post #62





Group: Members
Posts: 94
Joined: 13-July 03
From: Brazil
Member No.: 7733



A bit in doubt between FLAC and Monkey's.
FLAC for its light CPU use while playing, and Monkey's for its great enc/dec speed (though it consumes lots of processing power for playback).

But I'm currently ripping to FLAC, as it has met all my needs just neatly! smile.gif

This post has been edited by alex_wheels: Aug 15 2004, 13:04
jcoalson
post Aug 15 2004, 21:42
Post #63


FLAC Developer


Group: Developer
Posts: 1526
Joined: 27-February 02
Member No.: 1408



QUOTE (Polar @ Aug 14 2004, 05:06 PM)
Still curious to know, Josh: any idea as to why Omion's encodings at level -2 decoded on average slightly faster than -1 and -0?

the only reason I can think of is the reduced file size of -2.

Josh
Polar
post Aug 16 2004, 18:07
Post #64





Group: Members
Posts: 266
Joined: 12-February 04
Member No.: 11970



QUOTE (Omion @ Aug 14 2004, 22:31 UTC)
I'm doing another FLAC test right now on a bunch of my audio (probably ~20 albums, or until I get bored), but I don't have enough space to do them all at once so it could take a while.
*
Any progress?

Edit: fixed quote.

This post has been edited by Polar: Aug 17 2004, 07:25
Omion
post Aug 17 2004, 01:56
Post #65





Group: Developer
Posts: 432
Joined: 22-February 04
From: San Diego, CA
Member No.: 12180



I've got two albums done. That stupid "2-state" thing I talked about in my last post is still there (or came back, or whatever), but I'm just going to test so many samples that it won't matter. The current results are:
(Figure out the labels yourself tongue.gif )
A bit different from the last one, although it's still quite incomplete. You can see that -2 is still slightly faster than -1, but not as fast as -0. -8 didn't have the same drop as it did in the last test, so I might have to eat my words about -8 not being a 'deviant' result blush.gif. Oh well, still a lot more albums to go.

BTW: Encoding one album to all the different levels takes almost exactly 10 hours on my computer... blink.gif Most of that time is taken up by --super-secret.


--------------------
"We demand rigidly defined areas of doubt and uncertainty!" - Vroomfondel, H2G2
realmax
post Aug 17 2004, 05:04
Post #66





Group: Members
Posts: 5
Joined: 24-July 04
Member No.: 15728



FLAC sounds like a good lossless codec.
But in Taiwan, it is not as popular as the APE format (Monkey's Audio).
I hope FLAC will become more and more popular in Taiwan. rolleyes.gif
Polar
post Aug 17 2004, 08:00
Post #67





Group: Members
Posts: 266
Joined: 12-February 04
Member No.: 11970



QUOTE (Omion @ Aug 17 2004, 00:56 UTC)
I've got two albums done.
(...)
A bit different from the last one, although it's still quite incomplete. You can see that -2 is still slightly faster than -1, but not as fast as -0. -8 didn't have the same drop as it did in the last test, so I might have to eat my words about -8 not being a 'deviant' result blush.gif. Oh well, still a lot more albums to go.
*
Veeeery interesting to read. All of it, I mean. Are you still giving each of the 10 compression levels per album 12 decoding jobs? Great work! Eagerly awaiting more of your test results.
If you don't mind, I'll repeat my question: are you planning on putting those stats online somewhere? I'm confident that I won't be the only one who'd appreciate that.

QUOTE (Omion @ Aug 17 2004, 00:56 UTC)
BTW: Encoding one album to all the different levels takes almost exactly 10 hours on my computer... blink.gif Most of that time is taken up by --super-secret.
*
Yeah, it's no coincidence that it's called --super-secret-totally-impractical-compression-level wink.gif You've probably considered all that already, but have you thought about compressing all of the albums @ --ssticl overnight, while ripping and encoding @ -0 up to -8 during the day? What's your strategy?
Omion
post Aug 17 2004, 17:43
Post #68





Group: Developer
Posts: 432
Joined: 22-February 04
From: San Diego, CA
Member No.: 12180



Yes, I'm doing 12 decodings per level. It takes around 90 minutes to decode everything. (45 seconds each * 10 compression levels * 12 runs)

I'll probably put it up on my university-provided web page when I get done.

And yes, I do all the compressing overnight. I have everything already ripped from CD, so I just start a bunch of compression sessions before I go to bed, and most of them are done when I wake up. Then I do the test when I have 2 hours that I need to be away from the computer.

[edit] Just finished album #4. Oddly enough, the -0 file was SMALLER than the -1. Re-encoded to confirm, and indeed -0 was 100KiB smaller. -0 was 347,642,238 bytes, -1 was 347,745,628. Weird.
The only difference between -0 and -1 is that -1 uses --adaptive-mid-side. The album has a lot of stereo separation, so the overhead caused by the MS coding might outweigh the benefits. (Just a wild guess, though. Not sure what/if there is any overhead in MS files.)
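That guess can be illustrated with a toy sketch (not FLAC's actual bit accounting; the sample values are made up): a mid/side transform stores roughly the channel average and the channel difference, so for strongly separated channels the side signal stays as large as the input and buys nothing.

```python
def mid_side(left, right):
    # FLAC-style mid/side: mid is the (floored) average, side the difference
    mids = [(l + r) >> 1 for l, r in zip(left, right)]
    sides = [l - r for l, r in zip(left, right)]
    return mids, sides

# Nearly identical channels: the side signal collapses to tiny values
_, side_corr = mid_side([100, 102, 101], [100, 101, 101])

# Strongly separated channels: the side signal stays as large as the input
_, side_sep = mid_side([100, 102, 101], [-100, 5, -70])

print(side_corr, side_sep)
```

Small side values need few bits per sample; large ones need as many as the originals, so the transform's bookkeeping can end up a net loss.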

[edit Aug 19]Done with 6...

The 6th one was weird... Here's a graph of it alone:

Yes, that is right. -3 did almost as well as --ss, which did worse than -7 or -8. The -8 sample averaged 415 kbps! blink.gif

This post has been edited by Omion: Aug 20 2004, 05:49


--------------------
"We demand rigidly defined areas of doubt and uncertainty!" - Vroomfondel, H2G2
Polar
post Aug 20 2004, 09:01
Post #69





Group: Members
Posts: 266
Joined: 12-February 04
Member No.: 11970



QUOTE (Omion @ Aug 17 2004, 16:43 UTC)
I'll probably put it up on my university-provided web page when I get done.
*
Great smile.gif Looking forward to that.

QUOTE (Omion @ Aug 20 2004, 04:49 UTC)
Done with 6...
(snip)
The 6th one was weird... Here's a graph of it alone:
(snip)
Yes, that is right. -3 did almost as well as --ss, which did worse than -7 or -8. The -8 sample averaged 415 kbps! blink.gif
*
Some or other classical symphonic work, I suppose? Pretty neat smile.gif Which goes to show that lossless is especially interesting for classical music lovers: master quality at hardly double the bitrate of high-quality lossy.

So I guess it's becoming ever clearer that those 8 songs were too narrow a base to be anything of an authoritative gauge. Which still leaves us with that -2 anomaly, though. But then again, what's an anomaly if your 6-album average graph indicates that the decoding speed difference between -1 and -2 is less than 0.5%? The same could be said of -4 then, which decodes just that tad slower than -5, -6 and -7, but on par with -8. Shall we label that the poetic justice of statistics? wink.gif
Omion
post Aug 20 2004, 19:20
Post #70





Group: Developer
Posts: 432
Joined: 22-February 04
From: San Diego, CA
Member No.: 12180



QUOTE (Polar @ Aug 20 2004, 01:01 AM)
Some or other classical symphonic work, I suppose? Pretty neat smile.gif Which goes to show that lossless is especially interesting for classical music lovers: master quality at hardly double the bitrate of high-quality lossy.

So I guess it's becoming ever clearer that those 8 songs were too narrow a base to be anything of an authoritative gauge. Which still leaves us with that -2 anomaly, though. But then again, what's an anomaly if your 6-album average graph indicates that the decoding speed difference between -1 and -2 is less than 0.5%? The same could be said of -4 then, which decodes just that tad slower than -5, -6 and -7, but on par with -8. Shall we label that the poetic justice of statistics? wink.gif
*

#6 was "4 Klaviersonaten," which consists of 4 piano works by Beethoven, performed by Maurizio Pollini.

I agree that the last test I did was not extensive enough. So anybody that formed any opinions based on my previous test should re-form them based on this one, when it gets done. (I wonder if anybody else is reading this anymore? wink.gif )

Also, I'm beginning to think that the error bars were a bad idea. The problem is that most of the samples have the same relative rating, but different absolute speeds. For example:
test 1:
-4 - 50x
-7 - 55x
test 2:
-4 - 60x
-7 - 65x

In this case, the error bars would be large and probably overlapping, since the tests were different by 10x. But it would still be safe to say that -7 is faster than -4, even though the error bars overlap. I'm not sure what to do in this case.


--------------------
"We demand rigidly defined areas of doubt and uncertainty!" - Vroomfondel, H2G2
Omion
post Aug 23 2004, 22:23
Post #71





Group: Developer
Posts: 432
Joined: 22-February 04
From: San Diego, CA
Member No.: 12180



Just ran three more tests last night, and I think I figured out a way to solve the misleading error bars I mentioned in my last post.

I made a function that will normalize all the speeds based on a "pivot" encoding. So if a test was:
-6 - 40x
-7 - 50x
-8 - 60x

and I normalized to -7, the result would be:
-6 - 0.8
-7 - 1.0
-8 - 1.2

This will prevent the erroneously overlapping error bars, and will give good results for relative speeds.
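The pivot normalization described above can be sketched in a few lines (using the example numbers from this post):

```python
def normalize(speeds, pivot):
    """Express every decode speed relative to the pivot encoding's speed."""
    p = speeds[pivot]
    return {level: v / p for level, v in speeds.items()}

# The example from above, normalized to -7
relative = normalize({"-6": 40, "-7": 50, "-8": 60}, "-7")
print(relative)
```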
9 albums:

From this picture, it looks like -7 and -8 are statistically tied for decoding speed (*). However...

Normalize to -7:

You can see that the error bar of -8 is entirely below 1.0 (where the horiz. axis is), indicating that there's >95% chance that -7 will decode faster than -8.

The only problem with this picture is that it says nothing about the relative speeds of -1 vs -2, just vs. -7. Ten graphs will need to be made in order for everything to be comparable.

(*) Well, I suppose they are statistically tied. It's just that the relative speeds for a given sample are not conveyed with the graph

I've been thinking that this is pretty off-topic by now. I would not be crushed if a moderator were to split all of this off, if one were wont to do such a thing.

[edit Aug 30 2004]I finished the test. It has 14 albums, for a total of 14 hours, 28 minutes, and 13.8 seconds. Find it on this thread.

This post has been edited by Omion: Aug 31 2004, 06:50


--------------------
"We demand rigidly defined areas of doubt and uncertainty!" - Vroomfondel, H2G2
Grand Dizzy
post Oct 24 2004, 19:56
Post #72





Group: Members
Posts: 118
Joined: 3-November 03
Member No.: 9637



From the response to this poll it seems like almost everyone here uses some kind of lossless compression. But I thought everyone here was into mp3s, LAME etc. You can't use both, can you? Or do you? If so, why? I'm a bit confused.
sehested
post Oct 24 2004, 21:04
Post #73





Group: Members (Donating)
Posts: 325
Joined: 5-April 04
From: Copenhagen, Denmark
Member No.: 13246



Many of the people in this forum prefer lossless for archiving and streaming to their stereo.

Some of these people also have portables that do not have the capacity to hold enough lossless songs. Furthermore, lossy encodes like LAME -aps, or even 128 kbps AAC or LAME -V5, produce files that are up to 10 times smaller and still allow you to enjoy the music without annoying artifacts.

Also, as lossy codecs improve, there is a desire to re-encode the songs as new versions of the lossy encoders become available.

With the right software, an entire music collection stored in lossless can be re-encoded automatically, without having to re-rip the CDs.
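The lossless-to-lossy step can be sketched as a decode-and-pipe pair of commands (file names are hypothetical; only the standard flac and lame flags are assumed, and the commands are constructed here rather than executed):

```python
def transcode_commands(flac_file, mp3_file):
    # Decode the FLAC to stdout and pipe the raw audio into LAME
    decode = ["flac", "--decode", "--stdout", flac_file]
    encode = ["lame", "--preset", "extreme", "-", mp3_file]
    return decode, encode

dec, enc = transcode_commands("album/track01.flac", "portable/track01.mp3")
# Wire the two together with subprocess.Popen(dec, stdout=PIPE)
# feeding lame's stdin, then loop over the whole collection.
```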
Grand Dizzy
post Oct 24 2004, 21:58
Post #74





Group: Members
Posts: 118
Joined: 3-November 03
Member No.: 9637



Very interesting!

Hmm... I just buy CDs, encode everything at alt-preset-extreme, then give away the CD! Guess the day will come when I wish I'd been keeping a lossless copy on my hard drive!

Or maybe not. I mean, I can't tell the difference between alt-preset-extreme and the original CD, so maybe it's not such a big deal I gave those CDs away.
user
post Nov 25 2004, 01:09
Post #75





Group: Members
Posts: 873
Joined: 12-October 01
From: the great wide open
Member No.: 277



I switched to WavPack 4.1 high mode (-h -m), coming from FLAC (and voted for wv).
Ca. 3% better compression than FLAC in -5 default mode.
WavPack 4.1 in high mode still offers fast speed, even on a P3 600-800 MHz.

Somebody asked why a lot of people use both lossless and lossy.
My reason: a backup! Too many people I know have had HD crashes and loss of data. So, I went the way of storing my music on DVD+R.
My strategy:
the lossless version on 1 DVD, the transparent MPC --quality 8 --ms 15 --xlevel version on another DVD.
Both with par2 data. The safety-backup MPC version costs next to nothing, whether you count in "storage space" or "money".
Together with MAC 2.93.1 (Mpeg Audio Collection), it is quick to pull the right DVD out of the 312-disc box and copy the desired album(s) to a laptop or PC, or burn a CDDA.

This post has been edited by user: Nov 25 2004, 01:17


--------------------
www.High-Quality.ch.vu -- High Quality Audio Archiving Tutorials
