Topic: AccurateRip - Future Direction

AccurateRip - Future Direction

Reply #25
How would the system know if it is a good submission?


If the submission is from either dBpoweramp or EAC and was ripped in secure mode with secure results, especially if the whole disc is ripped and secure, then it can be treated as good. This should be less of an issue with AR2, though, since it looks like we will be able to cross-check somewhat between pressings.

AccurateRip - Future Direction

Reply #26
Using EAC as an example: if the report to the AR database contained EAC's test-and-copy comparison results, and the copy and test CRCs matched while the AR CRC differs between the submissions, you could replace the original submission with the CRC-matched result.
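Something like this, just to show the idea (the struct and field names are made up, not anything from the actual AR protocol):

Code:
    #include <cstdint>

    // Hypothetical view of one user's submission for a track.
    struct Submission
    {
        uint32_t arCrc;      // AccurateRip checksum that was submitted
        uint32_t eacTestCrc; // EAC test-pass CRC reported by the ripper
        uint32_t eacCopyCrc; // EAC copy-pass CRC reported by the ripper
    };

    // Replace the stored submission only if the new rip's test and copy CRCs
    // match (a self-consistent rip) and its AR CRC differs from the stored one.
    bool ShouldReplace(const Submission &stored, const Submission &rerip)
    {
        return rerip.eacTestCrc == rerip.eacCopyCrc
            && rerip.arCrc != stored.arCrc;
    }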
Glass half full!


AccurateRip - Future Direction

Reply #28
T&C does not account for consistent errors

But it does account for inconsistent errors, whereas the original submission may not account for any errors at all.  Remember, Bilbo is talking about replacing a submission made with the same user ID.

However, I find such an implementation unnecessarily cumbersome.  I have faith that someone else will come along and either add confidence to the rip or provide a new checksum if the original one wasn't correct, and eventually someone else will bump out the bad result with a submission that matches.

AR is great because of agreeing submissions from multiple users, not submissions from single users, even if they are "secure".

AccurateRip - Future Direction

Reply #29
I am just concerned with the confidence levels. Say 52 people rip a disc and submit the results: two used secure mode (and their results match), and 50 used burst mode. All of the burst-mode rips failed the AR checksum. These 50 people then re-rip in secure mode and the resulting CRCs all match the first two. Under the current system, if another person checks a rip, he will only get a confidence level of 2, whereas it should really be 52, because the good second submissions were rejected.

Another option that would help would be to allow the submission of the first results to be blocked if the user is going to re-rip.
Glass half full!

AccurateRip - Future Direction

Reply #30
<quoting myself for the hundredth time>
A confidence of 2 is good enough in my book; a confidence of 1 is fine with me as well, so long as it wasn't my submission.
</quoting myself for the hundredth time>

FWIW, when I submit results, I do so manually.  I only send the results that I want to send; the rest are blown away prior to ripping a title I plan on submitting.  Personally, I don't mind if others aren't as picky.  Out of submissions from 50 individual users with the same pressing, at least one of them is bound to be accurate, even if it was done in burst mode.  I think this is more realistic than saying all 50 people who get bad rips in burst mode are going to get good rips by virtue of switching to a secure mode.

AccurateRip - Future Direction

Reply #31
Anyhow, as far as checksums go, I think CRC32 is fine, and if you're worried that people will be concerned that the checksums for the first and last tracks don't match the ones given by their ripper, just NOT them.
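To illustrate (a made-up helper using zlib's crc32(); nothing here is proposed AccurateRip code): AccurateRip skips a little audio at the very start and end of the disc, so a plain CRC32 of the first or last track would never equal the whole-track CRC32 a ripper shows, and storing those two values NOTed makes the mismatch deliberate rather than confusing.

Code:
    #include <zlib.h>
    #include <cstdint>
    #include <cstddef>

    // Hypothetical sketch of the "just NOT them" idea. audio/bytes cover
    // whatever range AccurateRip would actually checksum for the track.
    uint32_t StoredTrackChecksum(const unsigned char *audio, std::size_t bytes,
                                 bool isFirstOrLastTrack)
    {
        uint32_t crc = (uint32_t)crc32(0L, audio, (uInt)bytes);
        // Invert the stored value for the first and last tracks so nobody
        // expects it to match their ripper's own whole-track CRC32.
        return isFirstOrLastTrack ? ~crc : crc;
    }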

AccurateRip - Future Direction

Reply #32
I am just concerned with the confidence levels. Say 52 people rip a disc and submit the results: two used secure mode (and their results match), and 50 used burst mode. All of the burst-mode rips failed the AR checksum. These 50 people then re-rip in secure mode and the resulting CRCs all match the first two. Under the current system, if another person checks a rip, he will only get a confidence level of 2, whereas it should really be 52, because the good second submissions were rejected.

Another option that would help would be to allow the submission of the first results to be blocked if the user is going to re-rip.


I may be wrong, but I think if you re-rip and re-submit, it would be added to the database. If it matches, it will add to the match count. If it does not match any previous rips, it is added to the offline database; or, if there are no previous submissions for that disc (which in this case there are), it would be added to the main database.

What I have suggested is that if a rip is accurate but does not match any AR submissions, especially if this is the case for the entire disc, then it should be added to the Active AR Database that is used for matching, even if it is only one submission. But it seems as though this may not be an issue, as we will be able to cross-check between pressings.

AccurateRip - Future Direction

Reply #33
@Eli
I brought this up because of Spoon's statement earlier in the thread:

"Yes each submission has a record, resubmissions are dropped, not replaced."
Glass half full!

AccurateRip - Future Direction

Reply #34
@Eli
I brought this up because of Spoon's statement earlier in the thread:

"Yes each submission has a record, resubmissions are dropped, not replaced."


Thanks, I did not notice that. I agree, it's not a good idea, especially since re-rips are probably more likely to be accurate, as additional measures have probably been taken to correct bad rips.

AccurateRip - Future Direction

Reply #35
Especially since re-rips are probably more likely to be accurate as additional measures have probably been taken to correct bad rips.
That's an excellent point.

I say replace the old submission unless the old one has been verified with a submission from a different user.

I don't think the answer lies in submitting T&C data or information about the ripping configuration.

AccurateRip - Future Direction

Reply #36
MD5!  Widely used, 128-bit hash value...

Used extensively in the field of computer/digital forensics.  Certainly would benefit dBpowerAmp, IMHO. 
"He's a natural born world-shaker."  -Dragline

AccurateRip - Future Direction

Reply #37
Hi,

I already posted that on the dBpoweramp forums, but the discussion seems dead on that side...

While you are thinking of improving AccurateRip, I have a few suggestions.

I think it would be interesting to be able to tell whether an AccurateRip match is composed only of drives of the same type, or whether there is a diversity of drive types.

This would help in two ways:

1) If I re-rip the same disc on the same drive, I would know whether or not I match my previous result.

2) If you assume that drives can have repeatable firmware bugs, one would be able to gauge the confidence of a rip with more certainty, as more drive diversity is better.

For example, I can assume that Plextor drives are pretty popular for ripping. If I rip with a Plextor and match with confidence 4, and all 4 of those other matches were also done on Plextors, I get less confidence than if all 4 were on different drives. See the related discussions on reading small block sizes...

One quick way to implement that: for each AR record, you associate an offset field. For the first rip, when the AR record is created, you set the record offset to the offset of the drive that did the rip. For subsequent rips, if the offset is the same, you don't change anything; if the offset of the new rip differs from the record offset, you invalidate the offset. Then, when checking the AR record, I can compare the AR record offset with my own offset (see the sketch below).
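A rough sketch of that idea, with made-up names rather than the real AccurateRip database layout:

Code:
    #include <climits>

    // Sentinel meaning "rips with more than one drive offset have contributed".
    const int OFFSET_MIXED = INT_MIN;

    // Hypothetical per-track record carrying the suggested offset field.
    struct ARRecord
    {
        unsigned int crc;        // the AccurateRip checksum being matched
        int          confidence; // number of agreeing submissions
        int          offset;     // drive offset behind this record, or OFFSET_MIXED
    };

    // First rip: the record is created with that drive's offset.
    ARRecord CreateRecord(unsigned int crc, int ripOffset)
    {
        ARRecord r = { crc, 1, ripOffset };
        return r;
    }

    // Later matching submissions: bump confidence, and invalidate the offset
    // as soon as a second kind of drive contributes.
    void AddMatchingSubmission(ARRecord &record, int ripOffset)
    {
        record.confidence++;
        if (record.offset != OFFSET_MIXED && record.offset != ripOffset)
            record.offset = OFFSET_MIXED;
    }

    // Checking a rip: if the record's offset equals my own drive's offset,
    // every contributing rip may have come from drives like mine, so the
    // confidence deserves a little less trust.
    bool PossiblySameDriveTypeOnly(const ARRecord &record, int myOffset)
    {
        return record.offset == myOffset;
    }

Anything more elaborate (a set of offsets, or counts per offset) would work too; the single field is just the cheapest version.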

In a similar vein, you could keep in the record some idea of the diversity of ripping programs and ripping modes used. As we know, some programs in some modes may introduce repeatable errors. Again, greater diversity gives greater confidence.

Thanks again for AccurateRip, and good luck with the new version...

Jean

AccurateRip - Future Direction

Reply #38
Has this been fixed in the April 2009 rewrite? Which route did you take regarding the CRC implementation? I missed this thread back in 2008. Basically, a plain CRC, just correctly implemented, should have been fine. Even saving two CRCs, calculated with both the new and the old scheme, shouldn't be that much data nowadays.


AccurateRip - Future Direction

Reply #40
There is no reason why such a fix could not go in transparently, but it would have to be implemented in all the respective programs. If you take EAC as an example, the development time would be best spent on allowing EAC to check across pressings; that makes AR much more useful.

AccurateRip - Future Direction

Reply #41
Change the hash calculation to CRC32 or something similar and quick & easy checking against multiple pressings through calculation goes out the window (see the sketch below for why the current form allows it).
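Rough sketch of why (illustrative only, not the actual AccurateRip routine, and it ignores the samples AR excludes near the disc edges): the multiply-by-position sum can be slid across candidate pressing offsets with a couple of additions per offset, whereas a CRC32 would need a full pass per offset.

Code:
    #include <cstdint>
    #include <cstddef>

    // samples[] holds packed 32-bit stereo samples and must contain at least
    // n + maxOffset entries; out[] receives the checksum for offsets 0..maxOffset.
    void ChecksumAllOffsets(const uint32_t *samples, std::size_t n,
                            std::size_t maxOffset, uint32_t *out)
    {
        uint32_t s1 = 0; // sum of (position * sample) over the current window
        uint32_t s0 = 0; // sum of the samples over the current window
        for (std::size_t i = 0; i < n; i++)
        {
            s1 += (uint32_t)(i + 1) * samples[i];
            s0 += samples[i];
        }
        out[0] = s1;

        for (std::size_t o = 1; o <= maxOffset; o++)
        {
            uint32_t leaving  = samples[o - 1];     // first sample of the old window
            uint32_t entering = samples[o + n - 1]; // last sample of the new window
            // Shift the window by one sample: every multiplier drops by one
            // (subtracting s0 also removes the leaving sample, whose multiplier
            // was 1), then the entering sample comes in with multiplier n.
            s1 = s1 - s0 + (uint32_t)n * entering;
            s0 = s0 - leaving + entering;
            out[o] = s1;                            // checksum for pressing offset o
        }
    }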

AccurateRip - Future Direction

Reply #42
The existing algorithm can be fixed and yet maintain the ability to calculate different pressings. I will do it next week for dBpoweramp R14, so that R14 will only submit to the database with the new, fixed routine, but can check both old and new. This will be branded as AR2: the fixed CRC as well as the ability to check pressings (by using the offset-finding CRC). I will contact Andre to see if he is interested in updating EAC.


AccurateRip - Future Direction

Reply #44
No, they do not; they hammer around 4000 CRCs looking for a match. Nothing is stopping them from using the offset-finding CRC to find the offset before calculating.

AccurateRip - Future Direction

Reply #45
Here is the proposed new CRC calculation, which should still allow the fast parallel calculation:

Code:
    // Iterate over the track's audio, one packed 32-bit sample at a time.
    // Samples/NumSamples are assumed names for the track's sample buffer
    // (left and right 16-bit channels packed into each DWORD).
    DWORD AC_CRCNEW = 0;
    DWORD MulBy = 1;
    for (DWORD i = 0; i < NumSamples; i++)
    {
        DWORD Value = Samples[i]; // complete 32-bit sample comprising left and right channels

        // 64-bit multiply by the sample position, then fold the high and low
        // 32 bits of the product back into the running checksum.
        unsigned __int64 CalcCRCNEW = (unsigned __int64)Value * (unsigned __int64)MulBy;
        DWORD LOCalcCRCNEW = (DWORD)(CalcCRCNEW & 0xFFFFFFFF);
        DWORD HICalcCRCNEW = (DWORD)(CalcCRCNEW >> 32);
        AC_CRCNEW += HICalcCRCNEW;
        AC_CRCNEW += LOCalcCRCNEW;

        MulBy++;
    }


Or I could switch to CRC32, which would deal with NULL samples better, and a) encourage the use of the offset-finding CRC, b) field 1001 questions about why tracks 1 and n do not match the CRC32 calculation.

AccurateRip - Future Direction

Reply #46
I'm slightly new and have found this topic quite interesting, even if I can't understand every detail. This may sound like a dumb question, but for the sake of the newer people to the scene: what does this mean for us? For the CDs we have already ripped, if they say accurate in EAC or CUEtools under AccurateRip, for all intents and purposes... are they accurate? Based on the information I'm seeing in this topic, I'm assuming anything with a confidence of 2 or higher can be trusted, based on the sheer odds that it's almost impossible for 2 people to have the exact same errors, match in AR with bad rips, and thus taint your result when you are the third person to match them with the exact same errors. Am I right in assuming this?

Also, someone made a comment about CDs with data tracks.  I have had severe problems verifying all the tracks on any CD with a data track. Is this a common problem and is there a current workaround for verification? Or is this one of the reasons you guys are reworking the code for your many ripping programs?

I'd like to thank all of you for the hard work you put into your software, and this website. Discovering all of this has refueled my passion for music in many ways, and given new life to my old CDs.

Cheers.


AccurateRip - Future Direction

Reply #47
For the CDs we have already ripped, if they say accurate in EAC or CUEtools under AccurateRip, for all intents and purposes... are they accurate?
Yes.

Based on the information I'm seeing in this topic, I'm assuming anything with a confidence of 2 or higher can be trusted, based on the sheer odds that it's almost impossible for 2 people to have the exact same errors, match in AR with bad rips, and thus taint your result when you are the third person to match them with the exact same errors.  Am I right in assuming this?
Confidence of 1 is all that's needed provided it wasn't your submission.

Also, someone made a comment about CDs with data tracks.  I have had severe problems verifying all the tracks on any CD with a data track. Is this a common problem and is there a current workaround for verification?
Is this a problem because the data track is part of the copy protection?  If so then TOS #9 says we aren't allowed to discuss it.

AccurateRip - Future Direction

Reply #48
Is TOS #9 American Legislation? Because I live in Canada, and as far as I know there's a little thing called freedom of speech. But as far as I can tell it's not to do with copy protection; it's many CDs with data tracks, whether it be music videos or what have you.

AccurateRip - Future Direction

Reply #49
Quote
Is TOS #9 American Legislation?
No. Already discussed to death in here & here.

Summary: the TOS is a set of rules that users of this board agree to follow when signing up. It has nothing to do with the laws in your country (which you should follow). TOS #9 protects the admins/owners from any litigation in the country where the HA server resides. All in all, "HA TOS are the "laws" of our little society which keeps it all together without all going down the drain." (quoting myself).


About the data track verification problems: you do have a proper cuesheet/logfile, right? I'm assuming that you do your verifications with CUETools. Other than not having a (proper) cuesheet/logfile (or the CD being copy protected), I haven't heard of any verification problems with data-track CDs that don't fall under the normal verification problems that affect all CDs.

EDIT: fixed info.