Topic: DAE Quality, views?

DAE Quality, views?

I am convinced the DAE Quality test might not represent a drive's ability to read a pressed CD with scratches, but rather its ability to read a CD-R with artificial marker damage. When testing about a year ago, I found that CDs with marker damage did not separate the strong drives from the weak ones the way real scratched CDs did. It should be a good test for C2 pointers, but there seem to be issues there too (a test on a 230a only scored 50% for C2, where in real life C2 pointers are accurate in the 99.99% range).

It is not my intention to take anything away from DAE Quality, as there is nothing else that can test drives independently.

DAE Quality, views?

Reply #1
To brainstorm a better method: 5 CDs, 3 damaged to varying degrees, and a simple program to read them and generate quality reports and C2 accuracy figures from those CDs. Said CDs are mailed to people interested in testing certain drives, so the question is: who is in? (And what drives do you have to test?)
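
For what it's worth, the comparison half of such a program could be quite small. Here is a minimal sketch in Python, assuming the rips end up as 16-bit stereo WAV files and that the extraction tool has already dumped the drive's C2-flagged sector numbers to a text file; all the file names (and the C2 dump format) are invented for illustration:

import wave

SECTOR_FRAMES = 588  # one CD audio sector = 2,352 bytes = 588 stereo 16-bit frames

def read_pcm(path):
    with wave.open(path, "rb") as w:
        return w.readframes(w.getnframes())

def bad_sectors(rip, ref):
    """Sector indices where the rip's audio differs from the reference."""
    step = SECTOR_FRAMES * 4  # 4 bytes per stereo 16-bit frame
    return {off // step
            for off in range(0, min(len(rip), len(ref)), step)
            if rip[off:off + step] != ref[off:off + step]}

rip = read_pcm("damaged_rip.wav")    # rip of the scratched test CD
ref = read_pcm("reference_rip.wav")  # rip of the unscratched copy
actual = bad_sectors(rip, ref)
with open("c2_flags.txt") as f:      # sectors the drive flagged via C2
    flagged = {int(line) for line in f if line.strip()}

total = min(len(rip), len(ref)) // (SECTOR_FRAMES * 4)
print(f"damaged sectors: {len(actual)} of {total}")
if actual:
    print(f"C2 hit rate: {len(actual & flagged) / len(actual):.2%}")
if flagged:
    print(f"C2 precision: {len(actual & flagged) / len(flagged):.2%}")

The hit rate (how many real errors the drive flagged) and precision (how many flags were real errors) together would give the kind of C2 accuracy figure discussed above.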

DAE Quality, views?

Reply #2
> To brainstorm a better method: 5 CDs, 3 damaged to varying degrees, and a simple program to read them and generate quality reports and C2 accuracy figures from those CDs. Said CDs are mailed to people interested in testing certain drives, so the question is: who is in? (And what drives do you have to test?)


I'm in.  I have more than a dozen different drive types, maybe two dozen if varied firmware versions come into play.  I can come up with a list in a week or so.

Some other things to consider when putting together the test:

- What are the OS and other software requirements of the testbed (XP SPx?  Vista?  Clean out upper/lower filter drivers and/or uninstall interfering apps first?  Clean OS install with up to date patches?)?
- How should the drives be connected (SCSI vs. mainboard IDE/SATA vs. add-on IDE/SATA vs. firewire vs. USB and what chipsets)?
- What information about the environment should the test app capture in addition to the per disc results (c2, etc.) and drive info (make, model, firmware version(s), serial, connected via)?
- Allow the tester to add ad-hoc comments about each particular drive (age, condition, estimated hours of use, odd behavior experienced).
- Are there particular drives you are very interested in?  Drives that aren't useful to test at all (e.g. older SCSI ones)?
- If the test requires very specific damaged CDs, then mail the packages out with a dozen extra filled-in (or even blank) label stickers to encourage the recipient to mail the CDs to the next recipient you specify, like an old-fashioned tape tree.
- Make it difficult for the user to generate bad data and easy to autosubmit the good data, and you'd have an autogenerated database in no time. (A rough sketch of such a record follows below.)

-brendan
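
To make the capture/autosubmit point concrete, here is a rough Python sketch of the kind of per-drive record the test app might collect and submit; every field name and value here is an assumption for illustration, not a settled format:

import json
from dataclasses import dataclass, field, asdict

@dataclass
class DriveTestReport:
    # drive info
    make: str
    model: str
    firmware: str
    serial: str
    connected_via: str      # e.g. "mainboard IDE", "USB (which chipset)"
    # environment info
    os_version: str         # e.g. "XP SP2, clean install"
    filter_drivers: str     # upper/lower filters present, if any
    # per-disc results, e.g. {"disc3": {"errors": 12, "c2_hit_rate": 0.97}}
    disc_results: dict = field(default_factory=dict)
    # free-form tester notes: age, condition, estimated hours, oddities
    comments: str = ""

report = DriveTestReport(
    make="Plextor", model="PX-755A", firmware="1.08", serial="unknown",
    connected_via="mainboard IDE", os_version="XP SP2",
    filter_drivers="none",
)
report.disc_results["disc3"] = {"errors": 12, "c2_hit_rate": 0.97}
print(json.dumps(asdict(report), indent=2))  # the autosubmission payload

A fixed record like this is what would make autosubmitted results comparable across testers.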

DAE Quality, views?

Reply #3
Sounds really interesting. But as bhoar already said, the whole project has to be planned very well first to get accurate and reliable results.

Some thoughts:
- How should the rips be tested/verified? You probably need some reference file to do comparisons against.
- From making some DAE tests based on the EAC DAE quality project, I know that two drives of the same make and model sometimes can't generate the same results. So how do we deal with wear and tear/dirt/whatever on drives - or, generally speaking, how do we deal with conflicting results?
- What happens when the test CDs get damaged (in the mail, somebody unintentionally scratching the CDs even more while doing the tests, etc.)?

I would also be interested in participating, but I don't have very many drives to test (Plextor PX-755A, Plextor PX-708A, Samsung SH-182D, Toshiba MD-1612, Toshiba SD-R5372, and some no-name/rebadged drives in various notebooks).

DAE Quality, views?

Reply #4
> How should the rips be tested/verified?

Two of the CDs would be unscratched; those would be the reference.

If I can get 3 to 4 different people with the same drive, then that would form a good basis.

DAE Quality, views?

Reply #5
> I am convinced the DAE Quality test might not represent a drive's ability to read a pressed CD with scratches, but rather its ability to read a CD-R with artificial marker damage.
If this is your main concern, would it not also be possible to buy a cheap pressed CD (bargain bin at your local CD store) and use it as the test corpus for a DAE quality extraction test? You could rip it with the drive in question and ensure that you get an accurate rip using the most trusted methods, then purposely damage the CD in a manner similar to the DAE test (marker and scratches), re-rip it, and compare the results to the original rip.

In fact, you could probably (mis)use much of the DAE Quality kit to do these things - damage the test CD by following the plans, analyze the results with the analyze program, and use the C2 extractor to test the C2 of the drive.

The ideas that have been thrown around seem nice, but will require a lot of logistics, plus a plant that will press your test CDs. On the other hand, more tests are always welcome, especially high-quality ones that don't have huge costs.

DAE Quality, views?

Reply #6
For any consistency, the same CDs have to be used.

DAE Quality, views?

Reply #7
> For any consistency, the same CDs have to be used.

Even with a large enough sample size (participating group)?
Creature of habit.

DAE Quality, views?

Reply #8
If you want to build a database of CD/DVD drives' capabilities to do DAE on realistic errors, it would seem that all drives should perform under the same conditions (i.e. the exact same CD defects), with their output compared to the desired output.

It seems that the most realistic test would be a large set of naturally "damaged" CDs, each ripped by a number of CD/DVD readers and compared to undamaged CDs of the same pressing.

Could AccurateRip be used for such purposes? As long as the CD is identified, one at least has a binary "correct"/"not correct" answer. Does/could AccurateRip store the drive manufacturer?

If all AccurateRip submissions saved the drive used, I agree that the (large) number of correct/incorrect rips by each individual drive (or make) of CD/DVD drive could be an indicator of its real-world capabilities. That is, if we assume that there is practically no connection between purchasing (say) a Samsung DVD-ROM and treating your CDs badly. If that were the case, then this test would fail.

-k
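
If AccurateRip submissions did carry the drive model, the aggregation -k describes would be simple. A minimal Python sketch, assuming each submission reduces to a (drive model, matched-the-consensus) pair; the rows below are invented for illustration:

from collections import defaultdict

# (drive model, did the rip match the consensus checksum?) - invented rows
submissions = [
    ("Plextor PX-755A", True),
    ("Plextor PX-755A", True),
    ("Samsung SH-182D", False),
    ("Samsung SH-182D", True),
]

totals = defaultdict(lambda: [0, 0])  # model -> [correct, total]
for model, ok in submissions:
    totals[model][1] += 1
    totals[model][0] += ok

for model, (correct, total) in sorted(totals.items()):
    print(f"{model}: {correct}/{total} accurate rips ({correct / total:.0%})")

With enough submissions per model, the ratio would approximate the real-world ranking of drives, subject to the owner-behavior caveat -k raises above.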