
What are the differences among CUETools outputs? libFlake/libFLAC/flake/FLACCL. Moderation: derailed into a verbose discussion about compression levels
pinkfloydeffect
post Mar 24 2012, 23:41
Post #1





Group: Members
Posts: 17
Joined: 2-March 12
Member No.: 97516



I have some audio CDs ripped to FLAC in single-track format with a CUE file. I read that the best way to split up the tracks is with CUETools, but I don't know which audio output (encoder?) to use. There are four options:

libFlake, libFLAC, flake & FLACCL

Can anyone explain what the difference(s) are please?

Thanks!
Chinch
post Mar 27 2012, 11:08
Post #2





Group: Members
Posts: 98
Joined: 22-July 09
Member No.: 71664



Exactly. So, to sum up what everyone has said about your questions:

1. Compression has nothing to do with audio quality in a FLAC file, or in any lossless format, the way it does with a lossy format such as MP3/AAC (aka M4A). Whether you use level 0 or level 8, the output (whether decoded in real time as you are listening to the music, or decoded back to a WAV file, etc.) will be exactly the same. Compression ONLY affects two factors: a) the amount of hard drive space the encoded FLAC file will take up (if you would like to save as much HD space as possible, always choose level 8) and b) how much resources will be consumed. The one most often considered is playback/real-time resource usage, since the greater the compression of the FLAC file, the more CPU/memory it is likely to use when you are PLAYING that file. When you are decoding to a WAV file, it's likely your hard drive is the bottleneck rather than the CPU. The same goes for real-time decoding: if your system can't decode each second of audio before your player needs to play it, then it's time to upgrade. If your machine is fast like you said, then I personally recommend using compression level 8, the maximum in the reference or "official" FLAC encoder.

2. Once again, as others have mentioned, FLAC is a lossless format... and you're not going to do anything to make it LOSSY unless you're converting from a LOSSY SOURCE such as MP3s or AAC files. As long as you're encoding from another lossless format, such as Apple Lossless (ALAC) or WAV... which is the uncompressed form of the lossless audio, you will suffer no quality loss. Even decoding a FLAC file to a WAV file, then back to a FLAC file with greater compression, will not cause audio loss.

There is a super simple way to prove this. If you know how to create a checksum or a checksum file (such as an SFV, MD5 or SHA-1 file/file hash), then grab a WAV file, make a checksum of it, then encode the WAV to FLAC using any compression level you want. Now delete the original WAV file. Decode the FLAC file right back to WAV (thus re-creating the original WAV file, which should be identical). Check the newly decoded WAV file against the original WAV file's checksum value. It should match. I'm not going to go into super detail about anything that could cause them not to match but still be the same audio data... because those are rare cases, and they're the exception rather than the rule. Also, each FLAC file, once encoded, has a built-in MD5 "fingerprint" or checksum value. This is calculated over ONLY the decoded/original audio stream, and EXCLUDES metadata (tags, embedded art, etc.) or any other non-audio data. Therefore, you can change the file's tags, etc. and even re-encode it, and this MD5 should remain the same.
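To make the checksum step concrete, here is a minimal Python sketch of that proof. The filenames are hypothetical, and you would still use your encoder of choice for the encode/decode round trip in the middle; the hashing itself is the whole trick:

```python
import hashlib

def file_md5(path, chunk_size=1 << 20):
    """MD5 of a file's raw bytes, read in chunks so large WAVs
    don't have to fit in memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# before = file_md5("album.wav")   # 1. hash the original WAV
# ... encode to FLAC, delete the WAV, decode back to WAV ...
# after = file_md5("album.wav")    # 2. hash the round-tripped WAV
# assert before == after           # identical => lossless round trip
```

If the two hex strings match, the round-tripped file is byte-for-byte identical to the original.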

So here's my proof process. I open up foobar2000 and load up a random FLAC file. Then I go to the file's properties and take a screenshot of it. There are a few very important values to note: 1) file size, 2) total length (or, more importantly, total SAMPLES), 3) the MD5 checksum of the audio.

Before the screenshot, I had no idea what compression was used on this file... it doesn't matter. This is the file's properties, still in FLAC format, the way I found it:



Then, I decode the FLAC file to a WAV file. Next, I arbitrarily choose a compression level of "2"... because it's... well... random enough... then I re-encode the WAV file with the new compression level. Take a look at the end results in the "after" screenshot:



You will notice three important things: 1) the second (re-encoded) file is LARGER than the original one, meaning the original was compressed using greater than level 2 compression, since the new one is approximately 2MB bigger... BUT at the same time, you'll notice that it is the EXACT same length (both time and samples are identical)... AND the MD5 checksum of the audio stream is identical, even though the two have differing file sizes! This proves that the audio has not changed by a single sample, not by a thousandth of a second... Furthermore... I used foobar2000's Binary Comparator (I can't recall if it is stock or an add-on, sorry)... but this compares the exact same thing... it disregards any NON-AUDIO data, and compares ONLY the audio within both files. I compared the original FLAC file, with the re-encoded (larger) FLAC output file, with the same results... identical files, no differences found:

All tracks decoded fine, no differences found.
Comparing:
FILE1.FLAC
FILE2.FLAC
No differences in decoded data found.
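For plain WAV files you can approximate what that comparison does (ignore headers and metadata, hash only the decoded samples) with Python's standard wave module. This is a rough stand-in for the idea, not foobar2000's actual algorithm:

```python
import hashlib
import wave

def audio_fingerprint(wav_path):
    """MD5 over the decoded PCM frames only; WAV header bytes are
    ignored, so two files containing the same audio but different
    header details fingerprint identically."""
    with wave.open(wav_path, "rb") as w:
        return hashlib.md5(w.readframes(w.getnframes())).hexdigest()
```

Two WAVs decoded from differently compressed FLACs of the same rip should return the same fingerprint here.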



3. My personal recommendation, and this is only because I have a fast machine with plenty of memory, so neither ENCODING speed (the very first compression) nor DECODING speed (decompressing the samples as the audio plays, plus a little ahead of what you're listening to, which is buffered) is an issue for me. Therefore, when encoding, I always use two parameters. The first is -8 for maximum compression (I would rather the encoding take a little bit longer and save a little bit more hard drive space than vice versa), since I already run a tight ship there, haha. The second, and although some may argue against its worth, the fact of the matter is that it can't do any HARM (other than consume a few more CPU cycles), is the -V or VERIFY switch/parameter... which encodes a bit of audio, then immediately turns back around and decodes it... then compares it to the source file (either in memory still or on disk) -- if everything jives, then you know there was no type of corruption or errors during the encoding process. It's one of those "paranoid" things that OCD freaks like me do... haha...
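As a concrete illustration, this is what those two switches look like on the reference flac command line; a small Python helper that just builds the command (the filenames are hypothetical, and flac must be installed before the commented-out run would work):

```python
import subprocess

def flac_encode_cmd(src, dst, level=8, verify=True):
    """Build a reference-encoder command line: -<level> sets the
    compression level (0..8) and -V makes flac decode its own output
    on the fly and compare it to the input before declaring success."""
    cmd = ["flac", f"-{level}"]
    if verify:
        cmd.append("-V")
    cmd += ["-o", dst, src]
    return cmd

# subprocess.run(flac_encode_cmd("album.wav", "album.flac"), check=True)
```

The -V pass roughly doubles the work per block, but on any modern machine that cost is negligible next to the peace of mind.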

4. To me, another important thing... and this is, again, purely preference and subjective... but I prefer to rip my CDs in EAC using the COPY AND TEST IMAGE + CUE FILE option. Whichever option you use, by proper ripping standards you have GOT to make sure you are using the correct read offset for your specific CD drive, which should have been auto-configured for you by EAC on first use. A wrong offset can cause audio to be cut off at a track boundary; most obviously not at the very start or end of the CD, but in the MIDDLE of two seamless/gapless tracks where one runs right into the other with no pause whatsoever. This applies if you rip track-by-track, reading each track with its own offset. If you have a large offset error and it happens between two seamless tracks, you may not notice the missing audio, but it COULD reveal itself as a POP or CLICK in the transition (caused by two points of a waveform not aligning on a zero crossing, and the sudden shift between two differing dB/amplitude levels caused by the misalignment... in this case, missing samples). If you rip the disc as a SINGLE IMAGE, you won't experience this; your whole image would just be shifted <- or -> by whatever the missing offset correction was. Since there's padding before and after the first/last tracks, an offset that tiny will likely only ever matter when comparing against the AccurateRip database... MAYBE.

For instance, my drive's offset is +6 samples. A standard audio CD plays back 44,100 samples per second (per channel). In my case, that 6-sample offset error works out to just 1/7,350th of a second, and it is harmless if those missing samples fall in an already silent/null/zeroed area of the disc, such as the very first or very last samples of the data.
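The arithmetic, spelled out (the +6 figure is just this one drive; yours will differ):

```python
from fractions import Fraction

OFFSET_SAMPLES = 6       # this drive's read offset correction, in samples
SAMPLE_RATE = 44_100     # CD audio: samples per second, per channel

# Exact rational arithmetic: 6/44100 reduces to 1/7350 of a second.
offset = Fraction(OFFSET_SAMPLES, SAMPLE_RATE)
print(offset, "of a second, or about", f"{float(offset) * 1000:.3f} ms")
```

At that scale the offset only matters for sample-exact verification (AccurateRip), never for audibility.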


This is probably far more information than you were looking for, but maybe it'll be helpful to someone else... there are tons of very smart people reading these forums, and they are very curious to learn the fine details, most of them... so I don't know if there's always such a thing as "too detailed" or explaining something to greater levels than the original question may have asked for. I know, myself, I will ask a question that isn't asking for much detail... but simply because I am not AWARE of what details there are to such a subject. In other words, sometimes I don't know the correct questions to ask, if I don't know the depth of topic, right?

I can't begin to ask someone about low-level functionality of say, a video card... if I don't know how the bus handles the data to form/renders images, etc! You read as far as you can understand.. and there'll be a point where it stops making full sense usually... but you've learned a little more than you knew before... and the next thread that goes into detail, you'll understand more of it than you would have.... blah blah. It's all progressive learning, you learn this crap in tiers, (and if you learn it from me, likely in tears)... I'll end my formal write-up there. I'm sure it's more than you ever cared to know... but-- if you do want to know more... that's what these people are all here for... to learn and to teach... ass-hoppah!

NO difference between compression 1 and compression 8 for quality! As for the encoders: FlaCUDA uses the GPU on nVIDIA-based video cards to speed up encoding. CUDA is proprietary to nVIDIA, though, so that's not for ATI cards. Flake and the others are experimental re-implementations of the original FLAC encoder/decoder, done for fun or as a hobby; at least one is by the CUETools creator, maybe both. There could be bugs and non-standard problems. I would stick with the reference encoder, always. Speed is nice, but encoding glitches and errors are NOT nice... they could possibly mess up the audio, so that it is no longer identical to the original. Big problem! Hope you enjoy learning from my offerings; if not, my apologies to all. Have a fantastic day, everyone.

P.S. Two very minute bits of info I omitted...

When I rip an audio CD to a single WAV file, I then turn around and encode that single WAV file to a single FLAC file, and embed the CUE sheet into the FLAC tag. This provides all of the correct indices for track identification (as well as lead-ins, or INDEX 00s)... it includes essentially the full original disc's exact layout within that 1 file... as well as the album art, titles, etc. In essence, I have 1 file per album that contains everything embedded within it to decode it and write back a 100% perfect copy of the original. This is my preference. Most people just haphazardly rip to 1-FLAC-per-TRACK, and never save the CUE sheet or the LOG file from their rip/encode, as I've seen out in the wild. That sucks for me, because the funny part is, when I'm looking for these albums, I'm not interested in the audio files... I already have them. I often want to get a different copy of the CUE sheet or log file from someone else's rip, to address some rare anomaly or double-check something that seems odd to me... and they never include those. I may be one of the few people disappointed that they included audio files... I just want the text files to compare! Regardless, this way... TO ME... IMO... is the ideal way to package digital copies of my CDs. As said, though, it's all preference... so no fights necessary about which is "best".

Second, as far as compression ratios go... just for your reference/curiosity... most album WAV files that I compress at level 8 (max) average about a 30% reduction in file size vs. the original. So most of them end up being just 70% of the size of their original, uncompressed sources... with NO LOSS of audio whatsoever. You've gotta love lossless compression... it's one of the few scenarios where you can literally get "something for nothing"... (but no chicks for free... sorry). In fact, if chicks hear you talking about any of this stuff, you won't be getting chicks for free OR with a major credit card, even... unfortunately, discussing the dynamics of lossless audio compression algorithms does NOT get them hot and bothered. It just gets them bothered. LOL. We all know that already, though.



This post has been edited by Chinch: Mar 27 2012, 11:32
Porcus
post Mar 27 2012, 11:31
Post #3





Group: Members
Posts: 1849
Joined: 30-November 06
Member No.: 38207



QUOTE (Chinch @ Mar 27 2012, 11:08) *
the greater the compression of the FLAC file, the more CPU/memory it is likely to use when you are PLAYING that file.


Nope, FLAC decodes at practically the same speed regardless of compression level (and it is very fast). If you look at http://flac.sourceforge.net/comparison_all_procdectime.html , the differences are minuscule, and it isn't even so that better compression takes longer time.

TAK and WavPack are fairly close to the same behaviour.


As for checking lossiness, it is easier to use foobar2000's bit comparator than md5ing files (which could fail due to headers and tags and whatnot ... actually, fb2k's bit comparator might at worst give minuscule differences due to roundoff errors on mp3 files, but you will usually not use it for those cases anyway).


QUOTE
-V or VERIFY switch/parameter... which encodes a bit of audio, then immediately turns back around and decodes it... then compares it to the source file (either in memory still or on disk)

I agree on that.


--------------------
One day in the Year of the Fox came a time remembered well
Chinch
post Mar 27 2012, 17:40
Post #4





Group: Members
Posts: 98
Joined: 22-July 09
Member No.: 71664



QUOTE (Porcus @ Mar 27 2012, 06:31) *
QUOTE (Chinch @ Mar 27 2012, 11:08) *
the greater the compression of the FLAC file, the more CPU/memory it is likely to use when you are PLAYING that file.

Nope, FLAC decodes at practically the same speed regardless of compression level (and it is very fast). If you look at http://flac.sourceforge.net/comparison_all_procdectime.html , the differences are minuscule, and it isn't even so that better compression takes longer time.

I'll have to look at your charts a bit later, but to say such a thing would make one ask "if it isn't even so that better compression takes a longer time", then why would the option even exist to compress at the lower levels? If they were identical, there would be level 8 and that's it... the others would be superfluous. Not to mention it breaks some of the basic accepted laws of computer science and mathematics, such as the time-memory tradeoff, which this essentially is. You're trading encoding time for less memory usage (storage memory, specifically hard drive space; not random access memory). Or the opposite scenario: you get a quicker result, but the tradeoff is that more memory/storage space is used. I'm not making this up; here's some reading... it's one of the fundamental concepts behind the creation of rainbow tables and why they are so effective.
See: http://en.wikipedia.org/wiki/Space%E2%80%93time_tradeoff

QUOTE (Porcus @ Mar 27 2012, 06:31) *
As for checking lossiness, it is easier to use foobar2000's bit comparator than md5ing files (which could fail due to headers and tags and whatnot ... actually, fb2k's bit comparator might at worst give miniscule differences due to roundoff errors on mp3 files, but you will usually not use it for those cases anyway).

You must not have read through my post fully; here's a direct quote from it:

QUOTE
I used foobar2000's Binary Comparator (I can't recall if it is stock or an add-on, sorry)... but this compares the exact same thing... it disregards any NON-AUDIO data, and compares ONLY the audio within both files. I compared the original FLAC file, with the re-encoded (larger) FLAC output file, with the same results... identical files, no differences found:

All tracks decoded fine, no differences found.
Comparing:
FILE1.FLAC
FILE2.FLAC
No differences in decoded data found.

Also, the internal FLAC MD5 checksums are reliable for comparisons, since they are computed over the UNCOMPRESSED audio stream in the FLAC file, disregarding all "headers and tags". The calculation strictly uses the uncompressed audio. This checksum is stored in the STREAMINFO metadata block, which is required when creating a FLAC format file.

The quote directly from FLAC's developer pages: "Also included in the STREAMINFO block is the MD5 signature of the unencoded audio data. This is useful for checking an entire stream for transmission errors."
See: http://flac.sourceforge.net/documentation_...t_overview.html under the first header section titled "METADATA".

Don't get the two topics confused -- I do not mean an MD5 created of the entire FLAC file. Simply changing a tag, or some other tiny bit of information, any number of things, will ruin that kind of MD5 compare. I'm strictly referring to the *internal* MD5 checksum that is auto-generated by the encoder and stored in the STREAMINFO block of the file. You need not do a thing to create this other than encode to FLAC. It will continue to match across compression levels, if tags are changed, etc. -- because, again, it is calculated strictly from the AUDIO STREAM data (the raw audio information, uncompressed). That is why, in my screenshots, you can see an unchanged MD5 checksum across two FLAC files using the same source WAV but encoded/compressed at two different levels, even though the results are two different file sizes. Read the post again and you'll pick up a lot of this detail; I did mention it.
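If you want to see that internal checksum without foobar2000, the STREAMINFO layout is simple enough to read by hand. A minimal Python sketch, assuming a well-formed file; it just walks the metadata blocks described in the FLAC format overview:

```python
def flac_streaminfo_md5(path):
    """Return the MD5 of the unencoded audio stream, stored as the
    last 16 bytes of the 34-byte STREAMINFO metadata block."""
    with open(path, "rb") as f:
        if f.read(4) != b"fLaC":
            raise ValueError("not a FLAC file")
        while True:
            header = f.read(4)            # 1 byte flags/type + 24-bit length
            block_type = header[0] & 0x7F
            length = int.from_bytes(header[1:4], "big")
            data = f.read(length)
            if block_type == 0:           # block type 0 = STREAMINFO
                return data[18:34].hex()  # bytes 18..33 hold the MD5
            if header[0] & 0x80:          # high bit marks the last block
                raise ValueError("no STREAMINFO block found")
```

Retagging rewrites other metadata blocks but never this value, which is why it survives re-encoding at a different compression level.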

This post has been edited by db1989: Mar 27 2012, 22:19
Reason for edit: Use [quote] tags properly rather than just lumping your reply in with another user’s post.
Porcus
post Mar 27 2012, 20:31
Post #5





Group: Members
Posts: 1849
Joined: 30-November 06
Member No.: 38207



QUOTE (Chinch @ Mar 27 2012, 18:40) *
I'll have to look at your charts a bit later, but to say such a thing would make one ask "if it isn't even so that better compression takes a longer time", then why would the option even exist


Sorry. To be clearer:
- better compression (smaller file) takes longer time to encode, yes.
- ... but not to decode. Not with FLAC, and not much with TAK or WavPack either (you might very well be constrained on drive speed on those).

(For the likes of Monkey's and OptimFrog, then there is a price for better compression payable every time you decode.)


--------------------
One day in the Year of the Fox came a time remembered well
Chinch
post Mar 27 2012, 21:26
Post #6





Group: Members
Posts: 98
Joined: 22-July 09
Member No.: 71664



I see; this I can agree with you on. Encoding is definitely going to be more taxing than decoding, surely... and this is going to be subjective as well, depending on each individual's CPU/memory/hard drive speed, etc. Let me run a quick test. It's going to be a rough, estimated one. I'm curious myself...

Yow. That was so fast it was literally negligible.


Test source (the only CD I could think of immediately that actually fills the entire disc: TOOL - Lateralus).
Decode FLAC -> WAV took 16.1 seconds (Note: original FLAC compression was LEVEL 8; I know, because I did it)
Decoded WAV is 01h:18m:56s; uncompressed file size is 796MB, single WAV file



You know what? I already know this is going to be a problem and it is going to demonstrate something important.

I was encoding back to FLAC, and noticed 0 and 8 were taking about the same amount of time. Now, I have a good enough machine to know that it's not being utilized to its potential, by far.

I watched the flac.exe encoder task run... and the CPU load never went above 15%. Now, I have an i7 quad-core with HT, so it shows up as 8 cores in task manager/apps. Then I realized that this reference build of FLAC encoder is dated 09/17/2007! That's a 5 year old compile! Then I realized more... of course it's not going to be multi-threaded, let alone optimized to utilize multiple CPU cores! I ran resource manager and highlighted the FLAC process... watched it... and the final results of the process at termination tell me all I really need to know.

Threads: 1, 32-bit app (of course)... it's likely not even compiled with simple optimized instruction extension sets like MMX/SSE/SSE2, etc...

In other words... the encoder is self-limiting. It's not multi-threaded; it's going to run on what would be seen as "half a core" out of 4 physical cores... and basically, that's as good as it's gonna get. The CPU was actually the bottleneck here (which is rarely the case in semi-current applications), rather than the HD. So it doesn't matter if I were to run this on a 12-core, thread-shredding, number-crunching monster CPU... I'd end up with the same encode time as I would on, say, a piece-of-crap single-core CPU like a Pentium... lol... whatever came before the Core 2 Duo, I can't even remember the names... Pentium... Pentium Pro? That was like generation 5 of the x86 architecture chips (pent=5)... the Pro was generation 6, I think? Then the dual-core chips could have been 7th gen? I'm making stuff up as I go at this point, but the case is already solved and proven: the encode time will be no faster on this quad core vs. a dual core... or even on a dual core vs. a single core at the same clock speed.

This only backs up what I said earlier -- if your computer can't handle realtime decoding of a FLAC file at any compression level... it's time for you to upgrade your gear... cause you're hurtin'. Conclusion: not an issue.

And if this is the case, no wonder the encoding/decoding statistics gathered don't vary much from setting to setting! The encoder uses the same amount of resources no matter what you set the compression to. So crank it up to 8, having said that. Now that I'm starting to recall, one of the experimental encoders was heading in that direction... Flake, maybe? One of them had benchmarking functionality built in, or packaged with it, to compare encode/decode times and CPU demand among different compression settings, or maybe against the reference encoder.

All in all, the actual DECODE of the 800MB (nearly) CD was so fast that it finished before I hit START on my stopwatch, LOL. So ONE SECOND or less to decode a LEVEL 8 compressed, FULL CD.

I haven't dabbled in programming at a deeper level in years, so I have no idea whether there's any remedy for an old, single-threaded app... I seem to remember hearing of thread-slicing... I know I have... and while I don't know the exact meaning, if I had to infer, it would seem that it could take a single thread and distribute the workload across multiple cores (load-balance an otherwise single-CPU-focused application). Otherwise, there's recompiling the code with a newer compiler, which would include the newest CPU instruction sets, etc. (which won't happen, at least officially) -- but alas, I forgot... FLAC is open source! Duh. Now I'm interested in looking at the code. (I was a programmer for years, but then again, I haven't programmed in years... so I'm without a doubt rusty and not up to par with current technologies and practices.)

I'll come back to this... Let me do some more looking, some more research... take a look at the source code. I'm not re-inventing the wheel... it's already been recoded as a multi-threaded encoder elsewhere, and flaCUDA even employs the very powerful GPU to make the encoder super efficient.

This post has been edited by db1989: Mar 27 2012, 22:19
Reason for edit: deleting pointless full quote of above post
Porcus
post Mar 27 2012, 22:40
Post #7





Group: Members
Posts: 1849
Joined: 30-November 06
Member No.: 38207



QUOTE (Chinch @ Mar 27 2012, 22:26) *
I was encoding back to FLAC, and noticed 0 and 8 were taking about the same amount of time. Now, I have a good enough machine to know that it's not being utilized to its potential, by far.

I watched the flac.exe encoder task run... and the CPU load never went above 15%. Now, I have an i7 quad-core with HT, so it shows up as 8 cores in task manager/apps. Then I realized that this reference build of FLAC encoder is dated 09/17/2007! That's a 5 year old compile! Then I realized more... of course it's not going to be multi-threaded, let alone optimized to utilize multiple CPU cores!


Since you are a foobar2000 user, I am making the guess that you were converting from there.

If you have a look at Preferences --> Advanced --> Tools --> Converter, you will see an option to use multiple threads, and you will see an option to assign priority to conversion.

I believe the default values are as on my setup: autodetect # of threads, and set priority to 2, on a scale from 1 to 7. I guess that explains the 15 percent figure. Try to set priority to 7 and redo the experiment. (Maybe it even helps to set fb2k at high or realtime priority first.)

As far as I know, you are right that FLAC.exe itself does not support multithreading. However, fb2k will simply start one for each number of threads, although no more than the number of files to convert. That means that big jobs are about equally efficiently done as if FLAC itself had supported multithreading.


QUOTE (Chinch @ Mar 27 2012, 22:26) *
All in all, the actual DECODE of the 800MB (nearly) CD was so fast that it finished before I hit START on my stopwatch, LOL.


fb2k has a decoding speed test: http://wiki.hydrogenaudio.org/index.php?ti...(foo_benchmark)
You can choose whether to buffer the files in RAM in order not to be constrained by drive speed. My few-years-old laptop with a Turion TL-56 (dual-core at 1.8 GHz) decodes from RAM at slightly less than 200x realtime, meaning that the actual playback is already down to half a percent of max CPU load. And the differences between FLAC -8, FLAC -0, TAK and WavPack are quite uninteresting ...


--------------------
One day in the Year of the Fox came a time remembered well
Chinch
post Mar 28 2012, 04:56
Post #8





Group: Members
Posts: 98
Joined: 22-July 09
Member No.: 71664



QUOTE (Porcus @ Mar 27 2012, 22:40) *
Since you are a foobar2000 user, I am making the guess that you were converting from there.

Actually, when I encode from a single WAV to FLAC, I usually do it in batch... and I simply use the FLAC Frontend (the basic, generic one)... but even so, it still doesn't matter, as I just watched it again while I encoded from WAV to FLAC... and it took just as long as the other. Viewing the Resource Monitor, I saw FLAC.exe pop up once it started, and guess how many threads it was using? 1. The multi-thread option you're looking at will not bypass this limitation. What it *is* useful for is encoding multiple separate tracks *in tandem*, or in parallel, however you want to put it. So if I were encoding 10 WAV files to 10 FLAC files, it would likely fire off 4 to 8 copies of FLAC.exe, each core or HT handling its own track/thread. In doing this, the limitation can be surpassed, simply by executing several copies of the program at once and letting them run at the same time. That's what the threading option is for there, I would have to assume.
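That one-process-per-file approach can be sketched in a few lines of Python. The runner is injectable here purely so the sketch is testable without flac installed, and the filenames are hypothetical:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def encode_all(wav_files, run=subprocess.run, workers=4):
    """Parallelise a single-threaded encoder the way a batch converter
    does: launch one flac process per input file, running at most
    `workers` of them at a time."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(lambda w: run(["flac", "-8", "-V", w], check=True),
                      wav_files))

# encode_all(["01.wav", "02.wav", "03.wav"])  # needs flac on the PATH
```

For a big batch this is about as efficient as a truly multi-threaded encoder would be, since every core stays busy on its own file.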

QUOTE (Porcus @ Mar 27 2012, 22:40) *
If you have a look at Preferences --> Advanced --> Tools --> Converter, you will see an option to use multiple threads, and you will see an option to assign priority to conversion.

I actually already tried this earlier using the START command, which has a priority switch you can use when firing off a process... say... START /HIGH flac.exe blah.wav -8, or whatever. So I've already recreated that, and the priority doesn't affect the encode in any instance I have tried.

QUOTE (Porcus @ Mar 27 2012, 22:40) *
I believe the default values are as on my setup: autodetect # of threads, and set priority to 2, on a scale from 1 to 7. I guess that explains the 15 percent figure. Try to set priority to 7 and redo the experiment. (Maybe it even helps to set fb2k at high or realtime priority first.)

It doesn't matter what priority foobar2000.exe has, since it opens the external FLAC.exe to do the encoding job. As I mentioned above, I have actually SET the priority via command line switches when starting the actual FLAC.exe process, with no effect. You can look into the "start" command if you'd like, and its parameters... you will see the ability to set processor affinity in there, but don't get confused: that is a RESTRICTION, not an upgrade... you are telling it to run on "this" processor core ONLY.

QUOTE (Porcus @ Mar 27 2012, 22:40) *
As far as I know, you are right that FLAC.exe itself does not support multithreading. However, fb2k will simply start one for each number of threads, although no more than the number of files to convert. That means that big jobs are about equally efficiently done as if FLAC itself had supported multithreading.

I can finally agree with you here... I can empirically prove that flac.exe is not a multi-threaded process... it's an old 32-bit command prompt/console based application that, like I was saying earlier, was compiled in 2007! A lot has changed, hardware-wise and otherwise, since 2007. It's definitely not optimally compiled for today's architectures.


QUOTE (Porcus @ Mar 27 2012, 22:40) *
fb2k has a decoding speed test: http://wiki.hydrogenaudio.org/index.php?ti...(foo_benchmark)
You can choose whether to buffer the files in RAM in order not to be constrained by drive speed. My few-years-old laptop with a Turion TL-56 (dual-core at 1.8 GHz) decodes from RAM at slightly less than 200x realtime, meaning that the actual playback is already down to half a percent of max CPU load. And the differences between FLAC -8, FLAC -0, TAK and WavPack are quite uninteresting ...


Yep, I already have that one. I've been using foobar2000 for years and years... so I'm very familiar with plugins like that, and the Binary Comparator (which I use a lot). Buffering to memory isn't that big of a deal to me; I run a 4-drive RAID 10 array with Western Digital Black drives (which smoke even as a single drive)... Just for your interest, here is my result:

CODE
Total length: 1:18:56.600
Info Read time: 0:00.000
Opening time: 0:00.313
Decoding time: 0:10.564
448.367x realtime


That's still using TOOL - Lateralus... which, as you see, is 01h:18m:56s -- why can't other bands fill up a CD like TOOL always manages to? I get tired of paying so much for 25-30 minute albums; it's garbage. They used to call that an "EP"... now they call them "LP"s and then wonder why sales of their $22, half-hour CD are low. Write more than 3-5 songs... make them longer than the traditional 2-minute punk track... then re-test the waters.

But uhm... yeah, the speed difference is fractions of a second whether I buffer it to memory or not, so it's not important here... but I am running an insane number of apps right now, including THREE different browsers (this one has about 25-30 tabs open, just by itself)... plus an audio track I've been mixing in Audacity that's like 20 tracks thick, because I'm in the midst of editing and bouncing, chopping and joining samples and clips.

Yeah, task manager says 166 processes running, and it doesn't count the individual tabs in Firefox and IE9 (though it does for Chrome, which I'm using now)... and I'm currently using 9GB of memory. Running 64-bit OS, obviously...

Ah yes, and I also know all of the technicalities of ripping, and of ripping properly... if that's in question at any point, I'll be proactive:

CODE
Track  Status
01     Accurately ripped, confidence: 200.
02     Accurately ripped, confidence: 200.
03     Accurately ripped, confidence: 200.
04     Accurately ripped, confidence: 200.
05     Accurately ripped, confidence: 200.
06     Accurately ripped, confidence: 200.
07     Accurately ripped, confidence: 200.
08     Accurately ripped, confidence: 200.
09     Accurately ripped, confidence: 200.
10     Accurately ripped, confidence: 200.
11     Accurately ripped, confidence: 200.
12     Accurately ripped, confidence: 200.
13     Accurately ripped, confidence: 200.


I'd say 200 matches in the AccurateRip database is sufficient to call it "dependable". (I believe 200 is the max matches ARv1 will return without setting it to verbose mode?)
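If you ever want to machine-check a batch of these reports rather than eyeball them, a tiny parser does it. This is just a sketch: the line format is assumed from the CUETools/EAC-style output above, and the exact wording of a "bad" status line can vary, so anything that doesn't match the "Accurately ripped" pattern is simply flagged:

```python
# Flag any track that isn't "Accurately ripped" with confidence >= a cutoff.
# The report format is assumed from the AccurateRip-style log shown above.
import re

def suspect_tracks(report: str, min_confidence: int = 2) -> list[int]:
    """Return track numbers that need a second look."""
    suspects = []
    for line in report.strip().splitlines()[1:]:  # skip the 'Track  Status' header
        m = re.match(r"(\d+)\s+Accurately ripped, confidence: (\d+)\.", line)
        if not m or int(m.group(2)) < min_confidence:
            suspects.append(int(line.split()[0]))
    return suspects

report = """Track  Status
01     Accurately ripped, confidence: 200.
02     Accurately ripped, confidence: 1.
03     No match in database."""
print(suspect_tracks(report))  # [2, 3] with the default cutoff of 2
```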

And trust me, my tagging is complete and I take great care in maintaining my digital audio collection. I have personally ripped all of my discs, properly... and tagged them with nearly every detail you can imagine, as you see here...



Got the ISRCs, CATALOG #, CDDB ID (DISCID)... and this disc is actually detected as an HDCD, which is a good thing, because it *is* an HDCD :) And, as I said, I do EAC->IMAGE+CUE+LOG; the CUE is embedded and the artwork is embedded as well. Lyrics are in there too... I actually have the LYRICS tag populated, which of course means yes, I even have synchronized lyrics in the puppy.

I know it's not showing the "real" cover of the album. I put that one there, so kiss it... If you have the CD, you'll know the "cover" is clear cellophane-type material, in layers, which is a PITA to scan and then have to fix the scan colors and artifacts, etc. (my scanner is an older flatbed and sucks pretty much... it works though). It can scan things up to the resolution of my HOUSE, LOL... like, WTF is that?!? Oh... that's a pixel. Taking up my entire 1080p display. Just... 1 pixel? Damn. I think I set the resolution up a bit too much, 'cause the JPEG saved at 10% quality is 900MB. Now it's just one gigantic, house-sized, 900MB DISCOLORED pixel. Ahhhh... hyperbole. I love it.

Do me a favor though, after all of my efforts, Porcus... can you put your trust and faith in the idea that I actually do know what I'm talking about and doing, and I'm not a n00b? ;)

It seems like... though mostly you're carrying on a discussion with me about these subjects... I've gotten the feeling, ever since your first response to me, that you've been trying to prove me wrong or catch me making stuff up, haha... like every time I counter your doubts with evidence, you come right back with more things for me to prove! I ain't mad'atcha... it's all cool, I just wanted to be honest with you about how it looked from this side of the fence.

I had a good time spankin' ya on each comeback you had for me, though, LOL. It's all in good fun... but I think that's the last time I can explain myself for tonight; my brain is about to shut down... need rest! So does this forum, I think. I have laid some textual abuse down on it today. Once you get me blabbing about this stuff, it's tough for me to STFU... but I'll zip it right here. Talk to you later, my friend... :D



