AccurateRip - Future Direction
Eli
post Mar 10 2008, 16:16
Post #26

QUOTE (spoon @ Mar 10 2008, 10:08) *
How would the system know if it is a good submission?


If the submission is from either dBpoweramp or EAC and was ripped in secure mode with secure results, especially if the whole disc was ripped securely. This should be less of an issue with AR2, though, since it looks like we will be able to cross-check somewhat between pressings.


--------------------
http://forum.dbpoweramp.com/showthread.php?t=21072
bilbo
post Mar 10 2008, 16:24
Post #27

Using EAC as an example: suppose the report to the AR database contained the EAC test-and-copy comparison results. If the test and copy CRCs matched, and the AR CRC differs between submissions, you could replace the original submission with the CRC-matched result.


--------------------
Glass half full!
Eli
post Mar 10 2008, 17:31
Post #28

T&C does not account for consistent errors


greynol
post Mar 10 2008, 19:41
Post #29

QUOTE (Eli @ Mar 10 2008, 09:31) *
T&C does not account for consistent errors

But it does account for inconsistent errors, whereas the original submission may not take into account the possibility for any errors. Remember, Bilbo is talking about replacing a submission made with the same user ID.

However, I find such an implementation unnecessarily cumbersome. I have faith that someone will either come along and add confidence to the rip or provide a new checksum if the original one wasn't correct, and eventually someone else will bump out the bad result with a submission that matches.

AR is great because of agreeing submissions by multiple users, not by submissions by single users even if they are "secure".


--------------------
I should publish a list of forum idiots.
bilbo
post Mar 10 2008, 22:50
Post #30

I am just concerned about the confidence levels. Say 52 people rip a disc and submit the results: two used secure mode (and their results match), and 50 used burst mode, and all of the burst-mode rips failed the AR checksum. These 50 people re-rip in secure mode and the resulting CRCs all match the first two. Under the current system, if another person checks a rip, he will only get a confidence level of 2, whereas it should really be 52, because the good second submissions were rejected.

Another option that would help would be to allow blocking submission of the first results, if the user is going to re-rip.


greynol
post Mar 10 2008, 23:24
Post #31


<quoting myself for the hundredth time>
A confidence of 2 is good enough in my book, a confidence of 1 is fine with me as well so long as it wasn't my submission.
</quoting myself for the hundredth time>

FWIW, when I submit results, I do so manually; I only send the results that I want to send. The rest are blown away prior to ripping a title I plan on submitting. Personally, I don't mind if others aren't as picky. Out of submissions from 50 individual users with the same pressing, at least one of them is bound to be accurate, even if it was done in burst mode. I think this is more realistic than saying all 50 people who got bad rips in burst mode are going to get good rips by virtue of switching to a secure mode.


greynol
post Mar 10 2008, 23:53
Post #32


Anyhow, as far as checksums go, I think CRC32 is fine, and if you're worried that people will be concerned that the checksums for the first and last track don't match the ones given by their ripper, just NOT them (store the bitwise complement so they can't be confused with a plain CRC32).


Eli
post Mar 11 2008, 00:45
Post #33

QUOTE (bilbo @ Mar 10 2008, 16:50) *
I am just concerned about the confidence levels. Say 52 people rip a disc and submit the results: two used secure mode (and their results match), and 50 used burst mode, and all of the burst-mode rips failed the AR checksum. These 50 people re-rip in secure mode and the resulting CRCs all match the first two. Under the current system, if another person checks a rip, he will only get a confidence level of 2, whereas it should really be 52, because the good second submissions were rejected.

Another option that would help would be to allow blocking submission of the first results, if the user is going to re-rip.


I may be wrong, but I think if you re-rip and re-submit, it will be added to the database. If it matches, it will add to the match count. If it does not match any previous rips, it is added to the offline database; or, if there are no previous submissions for that disc (which in this case there are), it is added to the main database.

What I have suggested is that if a rip is accurate but does not match any AR submissions (especially if this is the case for the entire disc), then it should be added to the active AR database that is used for matching, even if it is only one submission. But it seems this may not be an issue, as we will be able to cross-check between pressings.


bilbo
post Mar 11 2008, 00:56
Post #34

@Eli
I brought this up because of OP Spoon's statement earlier in the thread:

"Yes each submission has a record, resubmissions are dropped, not replaced."


Eli
post Mar 11 2008, 02:34
Post #35

QUOTE (bilbo @ Mar 10 2008, 18:56) *
@Eli
I brought this up because of OP Spoon's statement earlier in the thread:

"Yes each submission has a record, resubmissions are dropped, not replaced."


Thanks, I did not notice that. I agree, it's not a good idea, especially since re-rips are probably more likely to be accurate, as additional measures have probably been taken to correct the bad rips.


greynol
post Mar 11 2008, 03:19
Post #36

QUOTE (Eli @ Mar 10 2008, 18:34) *
Especially since re-rips are probably more likely to be accurate as additional measures have probably been taken to correct bad rips.
That's an excellent point.

I say replace the old submission unless the old one has been verified with a submission from a different user.

I don't think the answer lies in submitting T&C data or information about the ripping configuration.


CoolHandZeke
post Mar 16 2008, 00:58
Post #37

MD5! Widely used, 128-bit hash value...

Used extensively in the field of computer/digital forensics. It would certainly benefit dBpowerAmp, IMHO.


--------------------
"He's a natural born world-shaker." -Dragline
Jean Tourrilhes
post Mar 16 2008, 07:02
Post #38

Hi,

I already posted that on the dBpoweramp forums, but the discussion seems dead on that side...

While you are thinking of improving AccurateRip, I have a few suggestions.

I think it would be useful to know whether an AccurateRip match comes only from drives of the same type, or from a diversity of drive types.

This would help in two ways:

1) If I re-rip the same disc on the same drive, I would know whether I match my previous result or not.

2) If you assume that drives can have repeatable firmware bugs, one can gauge the confidence of a rip with more certainty, as more drive diversity is better.

For example, assume that Plextor drives are pretty popular for ripping. If I rip with a Plextor and match with confidence 4, and all four other matches were also made on Plextors, that gives me less confidence than if all four came from different drives. See the related discussions on reading small block sizes...

One quick way to implement this: for each AR record, associate an offset field. When the record is created by the first rip, set the record offset to the offset of the drive that did the rip. For each subsequent rip, if the offset is the same, change nothing; if the offset of the new rip differs from the record offset, invalidate the field. Then, when checking an AR record, I can compare the record's offset with my own.

In a similar vein, the record could keep some idea of the diversity of ripping programs and ripping modes used. As we know, some programs in some modes may introduce repeatable errors. Again, greater diversity gives greater confidence.

Thanks again for AccurateRip, and good luck with the new version...

Jean
rpp3po
post Jan 30 2010, 16:03
Post #39

Has this been fixed in the April 2009 rewrite? Which route did you take regarding the CRC implementation? I missed this thread back in 2008. Basically, plain CRC, just correctly implemented, should have been fine. Even saving two CRCs, calculated with both the new and old schemes, shouldn't be that much data nowadays.

This post has been edited by rpp3po: Jan 30 2010, 16:05
greynol
post Jan 30 2010, 22:46
Post #40

No, and it's not going to be fixed. If it used a CRC, then CUETools wouldn't be able to do what it does.


spoon
post Jan 31 2010, 10:10
Post #41

dBpowerAMP developer

There is no reason why such a fix could not go in transparently, though it would have to be implemented in all the respective programs. If you take EAC as an example, the development time would be better spent on allowing EAC to check across pressings; that makes AR much more useful.


--------------------
Spoon http://www.dbpoweramp.com
greynol
post Jan 31 2010, 20:44
Post #42

Change the hash calculation to CRC32 or something similar, and quick and easy checking against multiple pressings through calculation goes out the window.

This post has been edited by greynol: Jan 31 2010, 21:02


spoon
post Jan 31 2010, 21:20
Post #43

dBpowerAMP developer

The existing algorithm can be fixed while maintaining the ability to calculate different pressings. I will do it next week for dBpoweramp R14, so that R14 will only submit to the database with the new, fixed routine, but can check both old and new. This will be branded as AR2: the fixed CRC, plus the ability to check pressings (by using the offset-finding CRC). I will contact Andre to see if he is interested in updating EAC.


greynol
post Jan 31 2010, 21:24
Post #44

What about CUETools and XLD? You know they don't use the offset finding hash (which is not a CRC), right?

This post has been edited by greynol: Jan 31 2010, 21:26


spoon
post Jan 31 2010, 22:26
Post #45

dBpowerAMP developer

No, they do not; they hammer through around 4000 CRCs looking for a match. Nothing is stopping them from using the offset-finding CRC to find the offset before calculating.


spoon
post Jan 31 2010, 22:33
Post #46

dBpowerAMP developer

Here is the proposed new CRC calculation, which should still allow the fast parallel calculation:

CODE
    DWORD AC_CRCNEW = 0;
    DWORD MulBy = 1;
    // Samples / SampleCount: the track's audio as packed 32-bit stereo
    // samples (left and right 16-bit channels in one DWORD)
    for (DWORD i = 0; i < SampleCount; i++)
    {
        DWORD Value = Samples[i];

        // 64-bit multiply, then fold the high and low halves back into 32 bits
        unsigned __int64 CalcCRCNEW = (unsigned __int64)Value * (unsigned __int64)MulBy;
        DWORD LOCalcCRCNEW = (DWORD)(CalcCRCNEW & 0xFFFFFFFF);
        DWORD HICalcCRCNEW = (DWORD)(CalcCRCNEW >> 32);
        AC_CRCNEW += HICalcCRCNEW;
        AC_CRCNEW += LOCalcCRCNEW;

        MulBy++;
    }


Or I could switch to CRC32, which would deal with NULL samples better, but would a) require encouraging the use of the offset-finding CRC, and b) mean fielding 1001 questions about why tracks 1 and n do not match the ripper's CRC32 calculation.

This post has been edited by spoon: Jan 31 2010, 22:33


Skybrowser
post Feb 1 2010, 01:45
Post #47

I'm fairly new here and have found this topic quite interesting, even if I can't understand every detail. This may sound like a dumb question, but for the sake of the newer people to the scene: what does this mean for us? For the CDs we have already ripped, if they say accurate in EAC or CUETools under AccurateRip, are they, for all intents and purposes, accurate? Based on the information I'm seeing in this topic, I'm assuming anything with a confidence of 2 or higher can be trusted, on the sheer odds that it's almost impossible for 2 people to have the exact same errors, match in AR with bad rips, and thus taint your result with you being the third person to match them with the exact same errors. Am I right in assuming this?

Also, someone made a comment about CDs with data tracks. I have had severe problems verifying all the tracks on any CD with a data track. Is this a common problem, and is there a current workaround for verification? Or is this one of the reasons you guys are reworking the code for your ripping programs?

I'd like to thank all of you for the hard work you put into your software and this website. Discovering all of this has refueled my passion for music in many ways and given new life to my old CDs.

Cheers.

greynol
post Feb 1 2010, 03:39
Post #48

QUOTE (Skybrowser @ Jan 31 2010, 16:45) *
For the CDs we have already ripped, if they say accurate in EAC or CUETools under AccurateRip, are they, for all intents and purposes, accurate?
Yes.

QUOTE (Skybrowser @ Jan 31 2010, 16:45) *
Based on the information I'm seeing in this topic, I'm assuming anything with a confidence of 2 or higher can be trusted, on the sheer odds that it's almost impossible for 2 people to have the exact same errors, match in AR with bad rips, and thus taint your result with you being the third person to match them with the exact same errors. Am I right in assuming this?
A confidence of 1 is all that's needed, provided it wasn't your submission.

QUOTE (Skybrowser @ Jan 31 2010, 16:45) *
Also, someone made a comment about CDs with data tracks. I have had severe problems verifying all the tracks on any CD with a data track. Is this a common problem, and is there a current workaround for verification?
Is this a problem because the data track is part of the copy protection? If so, then TOS #9 says we aren't allowed to discuss it.


Skybrowser
post Feb 1 2010, 10:45
Post #49

Is TOS #9 American legislation? Because I live in Canada, and as far as I know there's a little thing called freedom of speech. But as far as I can tell it's not to do with copy protection; it's many CDs with data tracks, whether it be music videos or what have you.
Akkurat
post Feb 1 2010, 13:05
Post #50

REACT Mod developer

QUOTE
Is TOS #9 American legislation?

No. Already discussed to death in here & here.

Summary: the TOS is a set of rules that users of this board agree to follow when signing up. It has nothing to do with the laws in your country (which you should follow). TOS #9 protects the admins/owners from litigation in the country where the HA server resides. All in all, "HA TOS are the 'laws' of our little society, which keep it all together without everything going down the drain" (quoting myself).


About the data track verification problems: you do have a proper cuesheet logfile, right? I'm assuming that you do your verifications with CUETools. Other than not having a (proper) cuesheet logfile (or the CD being copy protected), I haven't heard of any verification problems with data-track CDs that don't fall under the normal verification problems that affect all CDs.

EDIT: fixed info.

This post has been edited by Akkurat: Feb 1 2010, 18:00
