
"Dithering" over reducing the bitrates of my collection: should I reduce my 24/96 surround material to 16/44.1?
BearcatSandor
post May 2 2012, 09:27
Post #1





Group: Members
Posts: 149
Joined: 18-May 10
From: Montana, USA
Member No.: 80732



Folks,

I've got quite a bit of 24/96 material on my computer that is in 5.1 surround. Given that most of us accept that 16/44.1 vs 24/96 makes no audible difference, is there any reason not to dither these files down to 16/44.1 in 5.1 channels?

It would save me disc space, reduce my network activity, and spare some CPU clock cycles.

Thanks


--------------------
Music lover and recovering high end audiophile
2Bdecided
post May 2 2012, 12:19
Post #2


ReplayGain developer


Group: Developer
Posts: 5135
Joined: 5-November 01
From: Yorkshire, UK
Member No.: 409



If it's lossy, don't touch it.

If it's uncompressed, use lossless compression.

If it's already lossless, you could save a surprising amount of space by removing the last four bits (i.e. making it 20-bit). It's hard to imagine any scenario (measurement, analysis, compression, never mind normal listening) where this would be detectable.
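For anyone wanting to try this, a minimal sketch of the idea (not from the post, and assuming NumPy and 24-bit samples held in 32-bit integers): add TPDF dither scaled to the new quantisation step, then zero the bottom four bits so a lossless codec can pack the result efficiently.

```python
import numpy as np

rng = np.random.default_rng(0)

def dither_to_20bit(samples_24bit: np.ndarray) -> np.ndarray:
    """Reduce 24-bit integer samples to 20-bit precision with TPDF dither.

    The output still lives in 24-bit containers, but the lowest 4 bits
    are all zero, which lossless codecs exploit (e.g. FLAC's wasted bits).
    """
    step = 1 << 4  # dropping 24 - 20 = 4 bits
    # TPDF dither: sum of two uniform variables spanning +/- one new LSB
    dither = (rng.integers(0, step, samples_24bit.shape)
              + rng.integers(0, step, samples_24bit.shape) - step)
    dithered = samples_24bit.astype(np.int64) + dither
    # Quantise to the nearest multiple of the new step, then clip to range
    quantised = (dithered // step) * step
    return np.clip(quantised, -(1 << 23), (1 << 23) - step).astype(np.int32)

audio = rng.integers(-(1 << 23), 1 << 23, 1000, dtype=np.int32)
out = dither_to_20bit(audio)
```

The dither matters: plain truncation correlates the quantisation error with the signal, while TPDF dither turns it into benign, signal-independent noise.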

While 48kHz/20-bit, 48kHz/16-bit, and 44.1kHz/16-bit should all also be safe, I wouldn't do it, except for compatibility. I like having 24/96 content around to demonstrate that it sounds the same as 44.1kHz/16 ;)

With lossless compression, the difference in bitrate between 20/96 and 20/48 won't be as great as 2:1.

Cheers,
David.
pdq
post May 2 2012, 13:01
Post #3





Group: Members
Posts: 3404
Joined: 1-September 05
From: SE Pennsylvania
Member No.: 24233



QUOTE (2Bdecided @ May 2 2012, 07:19)
With lossless compression, the difference in bitrate between 20/96 and 20/48 won't be as great as 2:1.

Yes, but wouldn't you expect the savings going from 24 bit to 16 bit to be more than a third?
2Bdecided
post May 2 2012, 14:33
Post #4


ReplayGain developer


Group: Developer
Posts: 5135
Joined: 5-November 01
From: Yorkshire, UK
Member No.: 409



QUOTE (pdq @ May 2 2012, 13:01)
QUOTE (2Bdecided @ May 2 2012, 07:19)
With lossless compression, the difference in bitrate between 20/96 and 20/48 won't be as great as 2:1.

Yes, but wouldn't you expect the savings going from 24 bit to 16 bit to be more than a third?
Yes. Much more.

However, if someone releases a 24/96 5.1 mix with buckets of headroom, I wouldn't feel 100% confident saying that converting 24 bits to 16 bits would be inaudible in every possible circumstance. I'd struggle to find any content that I could ABX, but it's not impossible that something somewhere is recorded/mixed so quietly that the difference can be heard. If anyone mixes 5.1 at the final Dolby reference level (dialnorm -31 dB), that's obscenely quiet by CD standards.
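To put rough numbers on that headroom argument (a back-of-envelope calculation, not from the post): each bit of word length is worth about 6.02 dB of dynamic range, so material mixed 31 dB below full scale leaves roughly five bits of a 16-bit word unused.

```python
import math

headroom_db = 31.0
db_per_bit = 20 * math.log10(2)     # ~6.02 dB of range per bit
bits_lost = headroom_db / db_per_bit
# ~5.1 bits unused; a 16-bit word effectively carries ~11 bits of signal
```

That is why very quiet mixes are the one case where 16 bits might conceivably fall short.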

Cheers,
David.
Porcus
post May 2 2012, 17:58
Post #5





Group: Members
Posts: 1842
Joined: 30-November 06
Member No.: 38207



QUOTE (2Bdecided @ May 2 2012, 15:33)
QUOTE (pdq @ May 2 2012, 13:01)
QUOTE (2Bdecided @ May 2 2012, 07:19)
With lossless compression, the difference in bitrate between 20/96 and 20/48 won't be as great as 2:1.

Yes, but wouldn't you expect the savings going from 24 bit to 16 bit to be more than a third?
Yes. Much more.


2Bdecided is right. I have seen savings of forty percent and more when all of the last 8 bits are in use, because then they mainly carry noise, and noise is hard to compress.
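That observation (fully exercised low bits are mostly noise, and noise is nearly incompressible) is easy to demonstrate with a generic byte compressor. The sketch below is illustrative only: zlib stands in for a real lossless audio codec, and the "signal" is synthetic.

```python
import zlib
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# A highly compressible "signal" occupying the upper bits of each sample
signal = (np.arange(n, dtype=np.int64) % 1000) << 8

quiet = signal.astype(np.int32)                               # low 8 bits zero
noisy = (signal + rng.integers(0, 256, n)).astype(np.int32)   # low 8 bits random

size_quiet = len(zlib.compress(quiet.tobytes(), 9))
size_noisy = len(zlib.compress(noisy.tobytes(), 9))
# The version with random low bits compresses far worse
```

Eight random bits per sample add roughly one incompressible byte per sample, which is why hot 24-bit masters shrink so much less than quiet ones.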


--------------------
One day in the Year of the Fox came a time remembered well
