Question regarding lossless audio compression
oldpickup
post May 16 2003, 09:43
Post #1





Group: Members
Posts: 2
Joined: 16-May 03
Member No.: 6650



I am somewhat ignorant regarding lossless compression codecs, and I was wondering about y'all's opinions on the sound quality of some of them. I have tried FLAC and WavPack, and I liked the quality of both, but I couldn't really tell the difference between the two.

Also, in y'all's opinion, is it really worth using a lossless compression codec instead of using, say, Ogg Vorbis or MPC?

Thanks in advance.

This post has been edited by oldpickup: May 16 2003, 11:38
Delirium
post May 16 2003, 11:18
Post #2





Group: Members
Posts: 300
Joined: 3-January 02
From: Santa Cruz, CA
Member No.: 891



The sound quality is by definition identical. Lossless means that there is no quality loss at all; what you're hearing is exactly the same as the original WAV file. It's like asking whether PKZip or GZip gives you better quality -- neither, they both give you the exact same results. The only differences are speed, compression ratio, source code openness / patent issues, and software/hardware support.
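
To see what "lossless" means in practice, here is a minimal sketch in Python, using the standard zlib module as a stand-in for any lossless codec: compress, decompress, and the bytes come back identical.

CODE
import zlib

# Pretend these bytes are raw PCM audio; any data works the same way.
original = bytes(range(256)) * 64

compressed = zlib.compress(original, level=9)
restored = zlib.decompress(compressed)

# Lossless means the round trip is bit-identical, every time.
assert restored == original
print(len(original), "->", len(compressed), "bytes; identical:", restored == original)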
oldpickup
post May 16 2003, 11:39
Post #3





Group: Members
Posts: 2
Joined: 16-May 03
Member No.: 6650



Hmm, interesting. Thanks for the thought.
Annuka
post May 16 2003, 11:56
Post #4





Group: Members
Posts: 333
Joined: 2-February 02
Member No.: 1233



QUOTE (oldpickup @ May 16 2003 - 10:43 AM)
Also, in y'all's opinion, is it really worth using a lossless compression codec instead of using, say, Ogg Vorbis or MPC?

The cost of using a lossless format is quite easy to calculate: 1,000 hours of lossless music ≈ 360 GB primary storage + backup ≈ €1,000 for 7x 120 GB disks (RAID 5 primary and RAID 0 backup).
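
As a rough sanity check on that figure (a back-of-the-envelope sketch; the ~57% compression ratio is an assumption, roughly typical of lossless encoders on ordinary music):

CODE
# Raw CD audio: 44100 samples/s, 2 channels, 2 bytes per sample.
hours = 1000
raw_bytes = hours * 3600 * 44100 * 2 * 2   # ~635 GB uncompressed
ratio = 0.57                               # assumed typical lossless ratio
print(f"raw: {raw_bytes / 1e9:.0f} GB, compressed: {raw_bytes * ratio / 1e9:.0f} GB")
# raw: 635 GB, compressed: 362 GB -- in line with the ~360 GB estimate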

Some benefits below:

* You will never have to worry about your lossy codec failing to encode something correctly. If you have a well-paid job and prefer to work instead of re-ripping and re-encoding, this might be an important factor for you.

* It might feel good/right to know you have the original music. Some people will actually destroy a near-perfect copy of a painting (one they have owned for 20 years) after they discover it is a copy...

* You might *think* you can hear a difference or that it brings more joy to your listening experience. Whether or not this is actually the case is completely irrelevant as long as you think it does.

So to answer your question, is it really worth it? For me it is, but it might not be worth it for you.
dillee1
post May 16 2003, 14:00
Post #5





Group: Members
Posts: 27
Joined: 20-April 03
Member No.: 6075



Actually, how do you draw a sharp line between lossy and lossless?
Even re-encoding linear integer PCM to ADPCM reduces the dynamic range and can introduce artifacts, though the transform is usually considered lossless.
Codecs like DTS blur the issue even further: there the aim of the codec is not to reduce bit rate but to increase bandwidth and dynamic range. It does use a coarser quantizer at high frequencies and a finer quantizer for low-to-middle frequencies, however.
Maybe only those that use entropy encoding alone as their compression scheme are the true lossless codecs?
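
To illustrate the ADPCM point, here is a toy delta coder (not any real ADPCM spec, just 4-bit quantized sample-to-sample deltas) showing why coarse delta quantization cannot be bit-exact in general:

CODE
# Toy delta coder: quantize differences to 4-bit steps (step = 512).
samples = [0, 100, 3000, -2000, 32000, 31000]

STEP = 512
encoded, prev = [], 0
for s in samples:
    q = max(-8, min(7, round((s - prev) / STEP)))  # 4-bit quantized delta
    encoded.append(q)
    prev += q * STEP               # encoder tracks the decoder's state

decoded, prev = [], 0
for q in encoded:
    prev += q * STEP
    decoded.append(prev)

print(samples)   # [0, 100, 3000, -2000, 32000, 31000]
print(decoded)   # [0, 0, 3072, -1024, 2560, 6144] -- not bit-identical

Note how the large transients are exactly where the reconstruction goes furthest off.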
AstralStorm
post May 16 2003, 14:39
Post #6





Group: Members
Posts: 745
Joined: 22-April 03
From: /dev/null
Member No.: 6130



Lossy = it doesn't produce output bit-identical to the original.
ADPCM, DTS, AC-3, MP3, Vorbis, GSM, Speex are lossy codecs.
MLP, FLAC, Liquid Audio, Monkey's Audio, LPAC are lossless.


--------------------
ruxvilti'a
SafirXP
post May 16 2003, 15:32
Post #7





Group: Members
Posts: 71
Joined: 21-April 03
From: Dhaka
Member No.: 6104



Do any of you guys actually use ONLY lossless formats? If I had a huge CD collection I might think of backing it up using a lossless codec; for me, that would be the only reason to use a lossless format. But then I'd have to worry about space! I guess lossless formats will become more feasible when DVD±RW becomes affordable!


--------------------
everybody's a jerk. you, me, this jerk!
gdougherty
post May 16 2003, 16:39
Post #8





Group: Members
Posts: 195
Joined: 10-February 02
From: One Mile Up
Member No.: 1299



Just picked up a 200 GB hard drive to increase my storage space. As drives come down in price and go up in size, lossless is becoming more viable as a format for all my audio. By adding a second 200 GB drive I'll be able to store about 1,200 CDs, and at the point that a drive starts to make noise I'll back it up to another drive. As is, I'm on a UPS with AVR and my system remains quite stable. New drives have fluid bearings, so reliability goes up a bit. The only big gaping weakness is off-site storage in the event of an apartment fire. I'm re-ripping from my roommates' CDs, but leaving my own music in MPC for the moment to conserve space. While I have a lot of space at the moment, I don't have enough to hold everything in lossless.
wynlyndd
post May 16 2003, 17:00
Post #9





Group: Members (Donating)
Posts: 89
Joined: 25-March 03
From: Houston, TX
Member No.: 5654



QUOTE (gdougherty @ May 16 2003 - 10:39 AM)
While I have a lot of space at the moment, I don't have enough to hold everything in lossless.

I am going to solve this by using DVD±R and checking periodically to see if the media is deteriorating. Maybe I'll wait for those Blu-ray discs to come down to a reasonable price.

mmmmm 36 GB on a disc... *drool*


--------------------
"Droplets of Yes and No, in an ocean of Maybe"
Annuka
post May 16 2003, 22:56
Post #10





Group: Members
Posts: 333
Joined: 2-February 02
Member No.: 1233



QUOTE (SafirXP @ May 16 2003 - 04:32 PM)
Do any of you guys actually use ONLY lossless formats?

Almost. I have some 675 albums in FLAC, some 20 albums in Vorbis -q8 or MPC braindead, and two albums in MP3.

The MP3s are rare, out-of-print albums. I would buy them if I could.

The "some 20" other lossy albums were downloaded from allofmp3.com. I consider them preview versions until I get the real thing, either from a library or from eBay/Amazon Marketplace. I do not buy new CDs anymore. I know that the majority thinks MPC braindead and Vorbis -q8 are overkill, but what do I care...
dillee1
post May 17 2003, 10:22
Post #11





Group: Members
Posts: 27
Joined: 20-April 03
Member No.: 6075



QUOTE (AstralStorm @ May 16 2003 - 05:39 AM)
Lossy = it doesn't produce output bit-identical to the original.
ADPCM, DTS, AC-3, MP3, Vorbis, GSM, Speex are lossy codecs.
MLP, FLAC, Liquid Audio, Monkey's Audio, LPAC are lossless.

ADPCM can be converted back to bit-perfect linear integer PCM as long as the transients are not huge.
Under some circumstances DTS can reproduce the original linear PCM bit-perfectly as well.

I am not very sure about the nature of Monkey's Audio and LPAC, but only entropy encoding will produce a perfect decompressed stream 100% of the time. As long as you resample / FT to the frequency domain / change the sample format or depth, there is always a chance of introducing artifacts during reconstruction.

Pure entropy encoding does not compress well, however.
Delirium
post May 17 2003, 10:30
Post #12





Group: Members
Posts: 300
Joined: 3-January 02
From: Santa Cruz, CA
Member No.: 891



QUOTE (dillee1 @ May 17 2003 - 02:22 AM)
I am not very sure about the nature of Monkey's Audio and LPAC, but only entropy encoding will produce a perfect decompressed stream 100% of the time. As long as you resample / FT to the frequency domain / change the sample format or depth, there is always a chance of introducing artifacts during reconstruction.

Pure entropy encoding does not compress well, however.

I don't know about the others, but FLAC uses LPC and some fixed predictors to compress based on sample-to-sample correlations, and then uses entropy coding to code the LPC coefficients and the residuals.

The other methods you mention can be lossless as well, as long as you always store a residual (and use integer operations for everything, since floating-point varies on different hardware).
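
For the curious, a minimal sketch of that idea in Python: a second-order fixed predictor with integer-only arithmetic, loosely in the spirit of FLAC's fixed predictors (this is not FLAC's actual format).

CODE
# Predict each sample by linear extrapolation of the previous two,
# then store the (small) integer residuals; entropy-code those.
samples = [0, 3, 8, 14, 19, 22, 22, 19, 13, 5]

def encode(x):
    res = x[:2]  # first two samples stored verbatim (warm-up)
    res += [x[i] - (2 * x[i-1] - x[i-2]) for i in range(2, len(x))]
    return res

def decode(res):
    x = res[:2]
    for r in res[2:]:
        x.append(r + (2 * x[-1] - x[-2]))
    return x

residuals = encode(samples)
print(residuals)                      # small values -> cheap to entropy-code
assert decode(residuals) == samples   # integer ops -> bit-exact round trip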
dewey1973
post May 17 2003, 16:37
Post #13





Group: Members
Posts: 383
Joined: 31-March 03
From: Seattle, WA
Member No.: 5771



I'm puzzled by one thing. I would think that a compression format where you can't play the file without decompressing (like .zip, .sit, etc.) would be able to compress a file more than current lossless formats that have to leave the file playable. I understand that lossless formats are tuned for sound and that's why they compress a .wav better than .zip, .sit, etc. Couldn't someone create an audio tuned compression format, where you can't play the file, that would have much better compression than current lossless formats? All I want to do is archive source material that I may not have access to in the future. A format like .zip tuned for audio would work great for me if the compression was better than formats that you can play.

I'd be interested to hear whether this is possible, whether it already exists, or whether I'm just a kook!
bryant
post May 17 2003, 18:40
Post #14


WavPack Developer


Group: Developer (Donating)
Posts: 1297
Joined: 3-January 02
From: San Francisco CA
Member No.: 900



The lossless compressors that achieve the highest compression ratios (LA and OptimFROG) are probably very close to achieving the maximum compression possible with real music (even though they may not perform optimally in special cases). They are not sacrificing any significant compression to remain "playable" (in fact, they are not playable on my [old] PC).
NumLOCK
post May 17 2003, 19:06
Post #15


Neutrino G-RSA developer


Group: Developer
Posts: 852
Joined: 8-May 02
From: Geneva
Member No.: 2002



QUOTE (dewey1973 @ May 17 2003 - 04:37 PM)
I'm puzzled by one thing. I would think that a compression format where you can't play the file without decompressing (like .zip, .sit, etc.) would be able to compress a file more than current lossless formats that have to leave the file playable. I understand that lossless formats are tuned for sound and that's why they compress a .wav better than .zip, .sit, etc. Couldn't someone create an audio tuned compression format, where you can't play the file, that would have much better compression than current lossless formats? All I want to do is archive source material that I may not have access to in the future. A format like .zip tuned for audio would work great for me if the compression was better than formats that you can play.

I'd be interested to hear whether this is possible, whether it already exists, or whether I'm just a kook!

The highest possible lossless compression of any data (in theory) can be achieved using a perfect forward predictor and arithmetic coding. Since it relies on forward prediction, you can still play the stream without problems (though for faster seeking you must trade away some compression ratio).


--------------------
Try Leeloo Chat at http://leeloo.webhop.net
Pio2001
post May 17 2003, 19:21
Post #16


Moderator


Group: Super Moderator
Posts: 3936
Joined: 29-September 01
Member No.: 73



QUOTE (dillee1 @ May 17 2003 - 12:22 PM)
only entropy encoding will produce a perfect decompressed stream 100% of the time. As long as you resample / FT to the frequency domain / change the sample format or depth, there is always a chance of introducing artifacts during reconstruction.

Lossless codecs don't resample or FT, and the sample depth or format is meaningless since the data ends up compressed.
Example: "0000" can become "4x0", that is a 3:4 compression ratio.
Original sample format: one character per sample. Final format: ???
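
A toy run-length coder along those lines (a minimal sketch of the "0000" -> "4x0" idea; counts are digits, and the stored character follows the 'x'):

CODE
from itertools import groupby

def rle_encode(s):
    # Each run of identical characters becomes <count>x<char>.
    return "".join(f"{len(list(g))}x{ch}" for ch, g in groupby(s))

def rle_decode(s):
    out, i = [], 0
    while i < len(s):
        j = s.index("x", i)                 # digits run up to the separator
        count, ch = int(s[i:j]), s[j + 1]
        out.append(ch * count)
        i = j + 2
    return "".join(out)

data = "0000000000111"
packed = rle_encode(data)
print(packed)                         # 10x03x1 -- 7 characters instead of 13
assert rle_decode(packed) == data     # perfectly reversible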
Delirium
post May 17 2003, 21:13
Post #17





Group: Members
Posts: 300
Joined: 3-January 02
From: Santa Cruz, CA
Member No.: 891



QUOTE
Lossless codecs don't resample or FT, and the sample depth or format is meaningless since the data ends up compressed.
Example: "0000" can become "4x0", that is a 3:4 compression ratio.
Original sample format: one character per sample. Final format: ???


Some lossless codecs do transform; there is at least one that uses an MDCT (I have no reference at the moment, but I think one of the papers linked from the FLAC site does). It energy-compacts the signal with an MDCT, and then stores residuals for the portion of the signal the MDCT doesn't perfectly capture.

QUOTE
The highest possible lossless compression of any data (in theory) can be achieved using a perfect forward predictor and arithmetic coding. Since it relies on forward prediction, you can still play the stream without problems (though for faster seeking you must trade away some compression ratio).


I'm not an expert in this field, so I can't say that's wrong, but off the top of my head it seems there should be at least some cases in which a forward predictor (even a perfect one) is less than optimal. Other sources of redundancy could be found by other means, such as taking future samples into account (I think?).
NumLOCK
post May 17 2003, 21:28
Post #18


Neutrino G-RSA developer


Group: Developer
Posts: 852
Joined: 8-May 02
From: Geneva
Member No.: 2002



QUOTE
I'm not an expert in this field, so I can't say that's wrong, but off the top of my head it seems there should be at least some cases in which a forward predictor (even a perfect one) is less than optimal. Other sources of redundancy could be found by other means, such as taking future samples into account (I think?).


It's kind of a chicken-and-egg problem... if your goal is to predict future data, you cannot "cheat" and exploit the future data for that purpose.

In terms of compression ratio, forward predictor + arithmetic coding has no limits. If your predictor is very good, there are certain kinds of data (sequences of zeroes, etc.) where you can literally compress a kilobyte into less than a bit! I have already done it.

In practice, however, the predictor has to take a risk. You guess the next bit. If you take no risk, you gain (or lose) nothing. If you bet and guess right, you gain fractions of bits. Otherwise you lose space. The sum of all this determines the overall compression ratio.

The other way is to take the whole data as one chunk, but then it's not called "prediction"... and when you do it, you produce one "chunk" which must be decompressed as a whole.
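
To put numbers on those "fractions of bits" (a small worked example; the probabilities are made up): an arithmetic coder charges about -log2(p) bits for a symbol its model predicted with probability p.

CODE
from math import log2

cost = lambda p: -log2(p)   # bits paid for a symbol predicted with probability p

p = 0.999  # model is almost certain the next bit is 0
print(f"right guess: {cost(p):.5f} bits")      # ~0.00144 bits
print(f"wrong guess: {cost(1 - p):.2f} bits")  # ~9.97 bits

# How confident must the model be to squeeze a kilobyte of zero-bits
# into less than one bit total? 8192 guesses, each under 1/8192 bits:
print(f"needed p per bit: {2 ** (-1 / 8192):.7f}")   # ~0.9999154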


--------------------
Try Leeloo Chat at http://leeloo.webhop.net
Pio2001
post May 17 2003, 23:02
Post #19


Moderator


Group: Super Moderator
Posts: 3936
Joined: 29-September 01
Member No.: 73



QUOTE (Delirium @ May 17 2003 - 11:13 PM)
It energy-compacts the signal with an MDCT, and then stores residuals for the portion of the signal the MDCT doesn't perfectly capture.

Yes; this way you can in fact use a lossy compression and store the losses.

lossy + losses = lossless
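
A minimal sketch of that identity (the "lossy" stage here is a toy: round every sample down to a multiple of 16, then store the rounding error as the residual):

CODE
samples = [1021, -37, 500, 12345, -32000]

Q = 16
lossy = [(s // Q) * Q for s in samples]           # what a lossy stage keeps
losses = [s - l for s, l in zip(samples, lossy)]  # residuals, each in 0..15

restored = [l + r for l, r in zip(lossy, losses)]
assert restored == samples                        # lossy + losses = lossless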
jmvalin
post May 17 2003, 23:28
Post #20


Xiph.org Speex developer


Group: Developer
Posts: 487
Joined: 21-August 02
Member No.: 3134



QUOTE (NumLOCK @ May 17 2003 - 03:28 PM)
It's kind of a chicken-and-egg problem... if your goal is to predict future data, you cannot "cheat" and exploit the future data for that purpose.

Sure you can... as long as you encode the predictor too. That's what CELP codecs do (including Speex).
NumLOCK
post May 18 2003, 00:17
Post #21


Neutrino G-RSA developer


Group: Developer
Posts: 852
Joined: 8-May 02
From: Geneva
Member No.: 2002



jmvalin:

OK, that's correct: you can have a look at the future data, but whatever useful info you want to keep, you have to store in the output file (so the unpacker has access to it, too).

There are many possible strategies, from fully adaptive to package-transform approaches, but what you gain somewhere you lose elsewhere...

Most of the possible approaches (including some very simple ones) can give asymptotically optimal compression, but the faster algorithms are often hybrid constructions (i.e. never fully adaptive, never fully table-driven).

This post has been edited by NumLOCK: May 18 2003, 00:21


--------------------
Try Leeloo Chat at http://leeloo.webhop.net
