
I have "Golden Ears" Which AAC VBR Bitrate is acceptable?

So, yes, I have the "golden ears" everyone refers to...

I can actually hear the difference between FLAC and 320 kbps MP3

I can hear the difference between 1,536 Kbps DTS, 800 Kbps VBR AAC, and 384/640 Kbps AC3 when I watch films, and my equipment isn't very fancy either


I just finished transcoding my FLAC discography of Eminem to 128 Kbps VBR AAC

I am thinking of re-transcoding (again, from FLAC) to 192 Kbps or 320 Kbps VBR AAC,
but is it necessary? (This is going to be for playback on my iPod touch.)

My FLAC files look something like this...
Format/Info       : Free Lossless Audio Codec
Duration          : 6mn 42s
Bit rate mode     : Variable
Bit rate          : 2 849 Kbps
Channel(s)        : 2 channels
Sampling rate     : 96.0 KHz
Bit depth         : 24 bits
Stream size       : 137 MiB (100%)
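
A quick back-of-the-envelope check, purely for context: the figures above are internally consistent, and the same arithmetic shows roughly what this track would cost at the AAC bitrates under consideration. A minimal Python sketch, assuming the MediaInfo numbers are accurate averages (VBR sizes will of course vary per track, and container overhead is ignored):

Code
# Approximate stream size for the 6mn 42s track above at its FLAC bitrate
# and at a few candidate AAC bitrates.

DURATION_S = 6 * 60 + 42   # 6mn 42s

def size_mib(bitrate_kbps, duration_s=DURATION_S):
    """Approximate stream size in MiB for a given average bitrate."""
    return bitrate_kbps * 1000 * duration_s / 8 / (1024 ** 2)

print(f"FLAC 24/96 @ 2 849 Kbps: {size_mib(2849):6.1f} MiB")  # ~136.5 MiB, matching the 137 MiB above
for kbps in (128, 192, 320):
    print(f"AAC VBR ~{kbps} Kbps:       {size_mib(kbps):6.1f} MiB")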


Reply #1
I can hear the difference between 1,536 Kbps DTS, 800 Kbps VBR AAC, and 384/640 Kbps AC3 when I watch films, and my equipment isn't very fancy either
More often than not, the DTS and AC3 tracks on a movie disc are based on different masters, making them unsuitable for comparison.

I am thinking of re-transcoding (again, from FLAC) to 192 Kbps or 320 Kbps VBR AAC,
but is it necessary? (This is going to be for playback on my iPod touch.)
Use ABX tests to determine your threshold for AAC encoding. There is no golden setting for everybody, especially not if you are able to routinely identify artifacts at high bitrates.
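
For what "passing" an ABX test means in numbers: an ABX log reports how many trials you got right, and the significance is just a one-sided binomial test against guessing. A minimal Python sketch of that calculation (independent of any particular ABX tool):

Code
from math import comb

def abx_p_value(correct, trials):
    """Probability of getting at least `correct` answers right out of
    `trials` purely by guessing (p = 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(abx_p_value(12, 16))   # ~0.038 -- commonly accepted as a pass
print(abx_p_value(10, 16))   # ~0.227 -- consistent with guessing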

My FLAC files look something like this...
Format/Info       : Free Lossless Audio Codec
Duration          : 5mn 13s
Bit rate mode     : Variable
Bit rate          : 2 726 Kbps
Channel(s)        : 2 channels
Sampling rate     : 96.0 KHz
Bit depth         : 24 bits
Are these files based on DVD-A rips? If they are meant to be CD rips, you should seek to replace them with proper CD rips.
It's only audiophile if it's inconvenient.


Reply #2
I have two of every album; one has these insane stats ^

and the other has these kinds of stats

Format/Info       : Free Lossless Audio Codec
Duration          : 5mn 46s
Bit rate mode     : Variable
Bit rate          : 862 Kbps
Channel(s)        : 2 channels
Sampling rate     : 44.1 KHz
Bit depth         : 16 bits
Stream size       : 35.6 MiB (100%)


Reply #3
So, one of them is (likely) a CD rip and the other is based on some other source.
It's only audiophile if it's inconvenient.
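
One practical way to verify which copies are CD-resolution and which came from some hi-res source is to scan the library's sample rates and bit depths. A minimal Python sketch, assuming the mutagen library is available; the library path is a placeholder:

Code
from pathlib import Path
from mutagen.flac import FLAC

LIBRARY = Path("/path/to/music")   # placeholder

for f in sorted(LIBRARY.rglob("*.flac")):
    info = FLAC(f).info
    is_cd = (info.sample_rate, info.bits_per_sample) == (44100, 16)
    kind = "CD-resolution" if is_cd else "hi-res/other source"
    print(f"{info.sample_rate / 1000:g} kHz / {info.bits_per_sample}-bit  {kind:22s} {f.name}")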


Reply #4
Do a proper ABX test. Forget current trends; don't look at bitrates and spectrograms. Once things get difficult, stick with that setting or go *a little* higher. If 128k AAC gives good results, then encode at 128... 150... 170, etc.
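
If you walk up a ladder like that, it helps to encode the same short, difficult clip at each setting from the FLAC source and ABX each encode against the original. A minimal Python sketch using ffmpeg's built-in AAC encoder with bitrate targets (the clip name is a placeholder, and a true-VBR encoder could be substituted; that choice is an assumption on my part, not something prescribed here):

Code
import subprocess
from pathlib import Path

SOURCE = Path("killer_sample.flac")   # a short, hard-to-encode clip (placeholder)

# Encode the clip at a ladder of target bitrates, then ABX each result
# against the FLAC original to find where transparency sets in.
for kbps in (128, 150, 170, 192):
    out = SOURCE.with_name(f"{SOURCE.stem}_{kbps}k.m4a")
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(SOURCE), "-c:a", "aac", "-b:a", f"{kbps}k", str(out)],
        check=True,
    )
    print("wrote", out)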


Reply #5
So, yes, I have the "golden ears" everyone refers to...

I can actually hear the difference between FLAC and 320 kbps MP3

I can hear the difference between 1,536 Kbps DTS, 800 Kbps VBR AAC, and 384/640 Kbps AC3 when I watch films, and my equipment isn't very fancy either
If you don’t have valid objective evidence to support these fanciful claims, in line with #8 of the terms of service to which you agreed during registration, your statements are unwelcome and your thread is at risk of being binned without notice.


Reply #6
Quote
So, yes, I have the "golden ears" everyone refers to...
In that case, only YOU can determine what YOU can hear!

If your ears are really that good, or if you are "paranoid", or if you are simply not concerned with file size, just go ahead and use the best possible AAC setting and try not to worry about it.  Or if you have the storage space, go with ALAC.  (Your iPod won't play FLAC, but it should play ALAC.)

Quote
... and my equipment isn't very fancy either
You don't need high-end equipment to hear compression artifacts.  If you hear artifacts on high-end equipment, you can probably hear them on an average system.

But, program material DOES make a difference.  (Some songs are "easier" to compress than others.)
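
On the ALAC suggestion: FLAC converts to ALAC losslessly, so nothing is lost beyond a little compression efficiency. A minimal Python sketch driving ffmpeg (the library path is a placeholder; it is worth spot-checking that tags carry over as expected):

Code
import subprocess
from pathlib import Path

LIBRARY = Path("/path/to/flac")   # placeholder

# Convert each FLAC to ALAC in an .m4a container; lossless to lossless,
# so the decoded audio is bit-identical.
for src in sorted(LIBRARY.rglob("*.flac")):
    dst = src.with_suffix(".m4a")
    subprocess.run(["ffmpeg", "-n", "-i", str(src), "-c:a", "alac", str(dst)], check=True)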


Reply #7
KmanKaiser, I thought I had golden ears as well (I was able to tell the difference between 320 CBR LAME and 320 CBR Fraunhofer with a 4 hour ABX test), until I started ABXing everything and I didn't care anymore; ABX will destroy your ego.

I re-converted everything to FLAC for archival purposes and cared about size only for portable use, ending up using ~115 Kbps AAC True VBR for my iPhone and car.

Always, always ABX.


Reply #8
I was able to tell the difference between 320 CBR LAME and 320 CBR Fraunhofer with a 4 hour ABX test

You mean this topic of yours that was binned because you failed to provide evidence in accordance with TOS #8?


Reply #9
Moreover, comparing one lossy format to another (an apple to an orange) is pretty pointless. I might grant it a minimal amount of validity in cases where one absolutely cannot obtain a non-compressed version to use as a proper control—but, basically, no.



Reply #11
And if comparing two lossy encodings, it could be one or both, with a positive result indicating only that one sounds more lossy. Or something



Reply #13
A passed ABX only guarantees that at least one of them is not transparent relative to the original source.


The more I think about that statement, the more interesting it gets (although, I guess, not in practice, where I presume the codecs would share some artifacts, making ABXing them against each other harder, or at least no easier, than original-to-lossy).

But in principle, the artifacts could be disjoint, giving you “twice as many” artifacts to detect -- or even worse (though even less likely in practice), it could “double” an artifact -- say, one note getting a treble boost that is not noticeable until you compare it to the other codec, which applies a treble cut to the same note: +D vs. 0 is inaudible, -d vs. 0 is inaudible, but +D vs. -d is audible.

Conditioned on this being the situation (not too likely, I'd say, but for the sake of the argument), claiming “one of them is not transparent” is the inference of “in this signal, +D vs. 0 is audible or -d vs. 0 is audible” from the observation “in this signal, +D vs. -d is audible”. Or, simplifying to absolute-value terms, inferring “max{D,d} is audible” from “D+d is audible”. What is the type I/type II trade-off, and how does that compare to the likely low N used in ABXing each codec against the original?

(In the more realistic case where artifacts are shared, the inference would be “in this signal, max{D,d} is audible” from “in this signal, |D-d| is audible”, and that is ... not too objectionable.)
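
To put rough numbers on that type I/type II question: with the small trial counts typical of casual ABXing, the test has little power unless the difference is heard quite reliably. A minimal sketch; the 12-of-16 pass criterion and the per-trial detection probabilities are illustrative assumptions only:

Code
from math import comb

def p_at_least(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

N, PASS = 16, 12   # a common informal criterion: 12 of 16 correct
print("Type I error (pass by guessing):", round(p_at_least(PASS, N, 0.5), 3))

# Power (1 - type II error) if the listener genuinely hears the difference
# on some fraction of trials -- illustrative detection probabilities only.
for p_hear in (0.6, 0.7, 0.8, 0.9):
    print(f"Power at p = {p_hear:.1f}:", round(p_at_least(PASS, N, p_hear), 3))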


Reply #14
Moreover, comparing one lossy format to another (an apple to an orange) is pretty pointless.

Comparing a lossy encoding from WAV with the same lossy encoding from e.g. lossyWAV is quite interesting IMO (assuming lossyWAV from WAV is transparent). But strictly "comparing one lossy format to another" is rather pointless, agreed.


Reply #15


Reply #16
But in principle, the artifacts could be disjoint, giving you “twice as many” artifacts to detect -- or even worse (though even less likely in practice), it could “double” an artifact -- say, one note getting a treble boost that is not noticeable until you compare it to the other codec, which applies a treble cut to the same note: +D vs. 0 is inaudible, -d vs. 0 is inaudible, but +D vs. -d is audible.

You're absolutely right; I was wrong. Changes in EQ, level and stereo balance all fit this latter situation and are perfectly good candidates as ABX stimuli.  I'm not sure I can come up with real-world examples that would fall under the former, though someone else might.


Reply #17
Being able to hear differences between lossless and lossy means that your hearing (and/or training) differs from the model used in compression.  Your ears could be gold or tin!

Hypothetical example:  the model assumes that a cannon shot in your left ear will mask a pin drop in your right ear.  IF you are stone cold deaf in your left ear, then you will hear whether the pin drop is there, missing, or echoing.  A person with full hearing wouldn't.