Audio CD ROT, Red Book CD-DA data recovery, Disc Rot, How to recover precious audio CDR from set of four that are rotting
kethd
post Jan 24 2012, 23:03
Post #1

Project: recover precious audio CDR from set of four that are rotting (using manual methods to maximize quality of recovered data)

In about 1996, my mother recorded herself singing 1940s folk songs with her brothers and sister. In about 2000, my cousin with a "recording studio" made a set of CDRs.

The CD copies worked when they were new. Now they are all badly rotted -- they look fine, but will not play. The CDs are greenish with full inkjet-printed labels. The ring is printed "119-H.002062194B09." (Does this tell who made the disc and when? It seems to be a serial number -- each disc has a different number, but the first part is always the same.)

This is my first experience with this type of bit rot. (If only I had known about the problem when it started!) Since this is precious material, I will spend endless hours trying to recover it as exactly as possible. I have not found any good tools or instructions for how to do this, so I will try to describe my plans and experiences here, in case they are helpful.

These CDs are almost impossible to read in an ordinary way. Occasionally it is possible to see the TOC. But EAC seems useless because even the lowest error-check setting seems to be too picky, and makes no progress reading the data.

I tried a pile of various CD/DVD drives in the hope that one would be magic, but no luck. The only drive that ever sees the TOC is a Rosewill ROD-EX001, which contains a Teac DV-W28S-R 1.0B. The drive only sees the TOC when it is cold, below 60 °F. Perhaps 40-50 °F is best? Why does cold help?

It seems like the ideal way to recover data from severe disc rot would be to make micro-photo images and process the images. I don't have that technology, and have found no discussion of such serious forensics.

I found no tools for reading raw CD data when the disc will not "mount" (there seems to be some sort of auto-scan when the disc is inserted in the drive, and everything seems to hang until the drive is happy). Since EAC seemed unwilling to give me bad data, I discovered that fre:ac is more willing to hand over raw data. I could not use any of the included cdparanoia features, but was able to use jitter correction.

Out of the four discs, I was able to read one five times and another twice, so I have seven sets of 34 tracks. In general, they sound noisy, but the original sound is very audible. The rot seems worst at the beginning, and somewhat bad at the end.

The only relevant processing tool I have found so far is Audacity. There are various ways I could auto-process this data, but I want maximum control and understanding of the process. I am faced with an overwhelming quantity of data. To reconcile all these bits manually could take years!

I am starting with a short track in good shape. Audacity is usable (though inconvenient) for comparing tracks to see the differences. Because I am dealing with digital errors, I was expecting big, radical noise that would be easy to spot. Wrong. The errors are quite subtle -- the hidden error correction is doing a very good job of guessing. This would be great if I were willing to just average the waveforms together or pick the best. But since I'd like to try to delete the errors and use the "good parts" as much as possible, the fact that the errors are so minor will make finding them much harder!

It looks like the sound is not truly stereo -- one channel seems to be delayed by about one sample and slightly different in amplitude compared to the other. I hope that the error correction did not know about this and processed each channel independently. (Is this true?) My next step is to cross-correlate the channels to find the exact time-shift relationship, and to figure out the amplitude relationship. Then, when I am processing the data and get to a point in dispute, I can use the level of discrepancy between the channels within a dataset to flag which data points are likely to be more unreliable. Well, that's my hope... I expect I'll have to write programs in BASIC to explore these approaches. Any better ideas? Do any programs exist for reconciling multiple sets of nearly identical audio files? (Currently working on a not-very-powerful Windows XP computer.)
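
To make that channel-discrepancy idea concrete, this is the kind of flagging pass I have in mind. It is only a rough, untested sketch: the 2-sample shift D, the gain G and the threshold THRESH are placeholder guesses to be tuned, and the fixed 44-byte header skip assumes a plain canonical .wav layout.

CODE
' Sketch only (not tested on the real data): flag samples where the left
' channel and a time-shifted right channel disagree badly. D (shift), G (gain)
' and THRESH are placeholder guesses; "N1.wav" is just an example file name.
D=2 : G=1.0 : THRESH=200
DIM RBUF(9) ' small buffer of recent right-channel samples, newest at index 0
F=FREEFILE
OPEN "N1.wav" FOR INPUT AS #F
FOR J = 1 TO 44 ' skip a canonical 44-byte .wav header (may differ per file)
DISCARD=BGETC(F)
NEXT J
I=0
WHILE NOT EOF(F)
I=I+1
L1=BGETC(F) : L2=BGETC(F) : R1=BGETC(F) : R2=BGETC(F)
L=L2*256+L1 : IF L>=(2^15) THEN L=L-(2^16)
R=R2*256+R1 : IF R>=(2^15) THEN R=R-(2^16)
INSERT RBUF,0,R
DELETE RBUF,10
IF I>D
IF ABS(L-G*RBUF(D))>THRESH THEN PRINT I;": suspect, L=";L;" Rshifted=";RBUF(D)
ENDIF
WEND
CLOSE #F
? "done"
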
kethd
post Jan 25 2012, 05:47
Post #2

Cross-correlating the left and right channels of track 23 shows that the first (left) channel lags behind by almost two samples (actually somewhere around 1.7 samples, but I'd rather avoid interpolating samples for now).
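
For reference, one way to estimate the fractional part of that peak (the ~1.7 figure) without resampling is to fit a parabola through the three lag totals around the integer peak. A rough, untested sketch, assuming the LR(1..9) totals from the program below have already been accumulated, with index 5 as the zero-lag tap:

CODE
' Sketch (untested): estimate the fractional correlation peak by fitting a
' parabola through the three lag totals around the integer peak.
' Assumes LR(1..9) already holds the totals from the program below,
' with T=5 corresponding to zero relative lag.
T0=2
FOR T = 3 TO 8
IF LR(T)>LR(T0) THEN T0=T
NEXT T
D = LR(T0-1) - 2*LR(T0) + LR(T0+1)
FRAC = T0
IF D<>0 THEN FRAC = T0 + 0.5*(LR(T0-1)-LR(T0+1))/D
PRINT "Peak near lag index ";FRAC;" (subtract 5 for the offset in samples)"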

The next step is to try to do an x-y scatter plot of the left and right channels, offset by this optimal 2-sample correlation peak, to get a feeling for how closely the channels match and whether it seems worth using the ratio of sample values as a measure of data validity.

This ugly little BASIC program was used to do the cross-correlation. It shows that for a simple LPCM .wav file it is easy to just skip over the header bytes and access the data area.

CODE
' CDrot.bas (SmallBASIC FLTK 0.10.6 Windows XP)
' CD disc rot processing
' kd 24 jan 2012

DIM LL(9),RR(9),LR(9) ' LL,RR: sliding windows of recent samples; LR: correlation totals for lags 1 to 9
? "THIS IS A TEST of CDrot.bas"
?
F=FREEFILE
OPEN "N1.wav" FOR INPUT AS #F
I=0
J=0
SM=0
ISM=0
ILR=0

WHILE NOT EOF(F)
I=I+1
L1=BGETC(F) ' read one stereo sample pair as four bytes:
L2=BGETC(F) ' left low, left high, right low, right high
R1=BGETC(F)
R2=BGETC(F)
L=L2*256+L1 ' assemble little-endian 16-bit values
IF L>=(2^15) THEN L=L-(2^16) ' and convert to signed
R=R2*256+R1
IF R>=(2^15) THEN R=R-(2^16)

IF I<13 THEN 199 ' skip the first 12 four-byte groups (the .wav header area)

INSERT LL,0,L ' push the newest samples onto the front of the
DELETE LL,10  ' ten-element sliding windows
INSERT RR,0,R
DELETE RR,10

FOR T = 1 TO 9 ' correlate the centre left tap LL(5) against right
LR(T) = LR(T) + LL(5)*RR(T) ' taps at lags 1 to 9 (T=5 is zero relative lag)
NEXT T
ILR=ILR+1

IF (L=0) AND (R=0) THEN 199 ' skip digital silence in the zero-lag sum

SM = SM + L*R ' running zero-lag product sum and its count
ISM = ISM+1
' PRINT I;": ",L,R," ";ISM;" - ";L*R;" - ";INT(SM/ISM);" - ";SM,,ILR;LR

J=J+1
IF J>20 THEN
PRINT ILR;LR
' INPUT "WAITING"; I$
J=0
ENDIF

199 WEND
? "END TEST of CDrot.bas"
STOP
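
The program above just skips a fixed number of header bytes, which happens to work for these files. A more general approach (a rough, untested sketch, not needed here but possibly useful for .wav files with extra header chunks) is to walk the RIFF chunk list until the "data" chunk is found:

CODE
' Sketch (untested): instead of assuming a fixed-size header, walk the RIFF
' chunk list of the .wav file until the "data" chunk is found, and report the
' byte offset where the sample data starts. (Ignores the pad byte that can
' follow odd-sized chunks.) The file name is just an example.
F=FREEFILE
OPEN "N1.wav" FOR INPUT AS #F
FOR K = 1 TO 12 ' skip "RIFF", the overall size, and "WAVE"
DISCARD=BGETC(F)
NEXT K
POS=12
FOUND=0
WHILE (NOT EOF(F)) AND (FOUND=0)
ID$=""
FOR K = 1 TO 4 ' four-character chunk id, e.g. "fmt " or "data"
ID$=ID$+CHR(BGETC(F))
NEXT K
B1=BGETC(F) : B2=BGETC(F) : B3=BGETC(F) : B4=BGETC(F)
SZ=B1+256*B2+65536*B3+16777216*B4 ' little-endian 32-bit chunk size
POS=POS+8
IF ID$="data"
FOUND=1
ELSE
FOR K = 1 TO SZ ' skip this chunk's payload
DISCARD=BGETC(F)
NEXT K
POS=POS+SZ
ENDIF
WEND
CLOSE #F
IF FOUND=1 THEN PRINT "Sample data starts at byte offset ";POS
IF FOUND=0 THEN PRINT "No data chunk found"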


kethd
post Jan 25 2012, 20:59
Post #3

HOW TO USE AUDACITY TO COMPARE TWO CHANNEL TRACKS

Unfortunately, Audacity seems to have no function to automatically and optimally align two waveforms (time-shift, amplitude, DC offset) for comparison purposes, and no way to visually superimpose two waveforms to see discrepancies.

Here is a manual procedure for Audacity:
* Split stereo to mono
* Normalize each track. (If the amplitude match does not seem perfect, you could manually reduce the amplitude of one channel to improve the match.)
* Time-shift one channel as needed. To do this, zoom in on high-amplitude sections. Look for crisp, clear zero crossings that you can time-align. Keep zooming in until you clearly see individual samples that you can align exactly.
* After time-aligning, invert one channel, then select both channels and apply "Mix and Render".
* Study and ponder this difference waveform. Can you think of any adjustments that would have yielded a smaller result? You can re-do the process with different amounts of time shifting to see which gives the best match, with the minimum difference (see the sketch below for one way to put a number on that residual).
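
If you do want a number for that residual, here is a rough, untested sketch in the same SmallBASIC style as the program in the previous post: export the difference track from Audacity as a 16-bit mono .wav (the name DIFF.wav is just a placeholder) and compute its RMS.

CODE
' Sketch (untested): RMS of a 16-bit mono .wav, e.g. the inverted-and-mixed
' difference track exported from Audacity. "DIFF.wav" is a placeholder name,
' and the fixed 44-byte header skip assumes the usual canonical .wav layout.
F=FREEFILE
OPEN "DIFF.wav" FOR INPUT AS #F
FOR J = 1 TO 44
DISCARD=BGETC(F)
NEXT J
N=0 : SUMSQ=0
WHILE NOT EOF(F)
B1=BGETC(F) : B2=BGETC(F)
S=B2*256+B1
IF S>=(2^15) THEN S=S-(2^16)
SUMSQ=SUMSQ+S*S
N=N+1
WEND
CLOSE #F
IF N>0 THEN PRINT "RMS of difference over ";N;" samples: ";SQR(SUMSQ/N)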

So, I did not really need to write a BASIC program to do the cross-correlation -- this tedious manual process with Audacity would have been quicker.

And now that I see the resulting difference between the left and right channels, it looks like there is more stereo content than I had hoped. It does not look like I'll be able to identify bad data points very well by comparing the left and right channels. I'd still be curious to see an X-Y scatter plot of the channel values to study the degree of correlation, but that approach doesn't seem worth pursuing for now.

My plan is to proceed to write a BASIC program to find sets of data points that are different among the 7 copies, and see what ideas emerge from pondering them.
kethd
post Jan 26 2012, 19:25
Post #4

BASIC PROGRAM TO MEASURE CORRELATION OF LEFT AND RIGHT CHANNELS IN .WAV FILE

Although I am putting aside the idea of using localized L-R correlation as a measure of validity of data points in my audio CD data recovery research, it still seems potentially interesting to measure the total L-R correlation in each raw copy of the data, as a clue to which copy might be more valid overall.

I don't know of any way to do this measurement in Audacity. It is possible to manually time-align the channels, and then subtract them to produce a difference waveform, which can be judged by eye -- but I don't know any Audacity function to quantify the total or average amplitude/energy/power of the resulting waveform.

Here is a BASIC program that produces such summary numbers, and stores them in a results file, based on the simpler version I wrote a couple days ago. The main thing I've learned is that this measure is almost identical for my data sets -- the peak correlation totals vary by only about one part in 15,000. It looks like turning on jitter correction in fre:ac helped, and that reading at 16x was optimum -- but the differences are so small that they probably don't mean much. The BASIC program takes a few minutes to process each 50 second .wav file. So even though there are many ways it could be much more efficient and run much faster, there is no need to bother yet.

CODE
' CDrot.bas (SmallBASIC FLTK 0.10.6 Windows XP)
' CD disc rot processing
' kd 26 jan 2012
' CROSS-CORRELATE LEFT AND RIGHT CHANNELS

DIM LL(9),RR(9),LR(9)
? "START of CDrot-CROSS-26JAN12.bas"
?
F=FREEFILE
FILNAM$="E1.wav"
OPEN FILNAM$ FOR INPUT AS #F
I=0
J=0
SM=0
ISM=0
ILR=0

WHILE NOT EOF(F)
I=I+1
L1=BGETC(F)
L2=BGETC(F)
R1=BGETC(F)
R2=BGETC(F)
L=L2*256+L1 ' assemble little-endian 16-bit values
IF L>=(2^15) THEN L=L-(2^16) ' and convert to signed
R=R2*256+R1
IF R>=(2^15) THEN R=R-(2^16)

IF I<13 THEN 199 ' skip the first 12 four-byte groups (the .wav header area)

INSERT LL,0,L
DELETE LL,10
INSERT RR,0,R
DELETE RR,10

FOR T = 1 TO 9
LR(T) = LR(T) + LL(5)*RR(T)
NEXT T
ILR=ILR+1

IF (L=0) AND (R=0) THEN 199

SM = SM + L*R
ISM = ISM+1
' PRINT I;": ",L,R," ";ISM;" - ";L*R;" - ";INT(SM/ISM);" - ";SM,,ILR;LR

J=J+1
IF J>999 THEN
PRINT ILR;LR
' INPUT "WAITING"; I$
J=0
ENDIF

199 WEND
CLOSE #F
? "EOF"
PRINT "FILE: ";FILNAM$;" ";DATE$;" ";TIME$
PRINT ILR;LR

OUT=FREEFILE
NEWLINE$=CHR(13)+CHR(10)
OPEN "CROSS-OUT.TXT" FOR APPEND AS #OUT
PRINT #OUT, "FILE: ";FILNAM$;" ";DATE$;" ";TIME$;" CDrot-CROSS-26JAN12.bas";NEWLINE$
PRINT #OUT, ILR;LR;NEWLINE$

FOR T = 1 TO 9
LR(T) = INT(SQR(LR(T)/ILR))
NEXT T
PRINT "SQRT of AVG: ";LR
PRINT #OUT, "SQRT of AVG: ";LR;NEWLINE$;NEWLINE$
CLOSE #OUT

? "END of CDrot-CROSS-26JAN12.bas"
STOP


kethd
post Jan 27 2012, 20:53
Post #5

BASIC PROGRAM UTILITY TOOL
TO INSPECT BLOCKS OF DIFFERENCES
IN MULTIPLE SETS OF ALMOST IDENTICAL .WAV FILES

The following BASIC program analyzes a set of nearly identical .wav files, finds blocks (chunks) of differing samples, and displays a grid of numbers for interactive inspection. It also analyzes each such block and collects summary statistics into a results file.

I am experimenting with a set of 7 copies of a track from a set of audio CDs that have become unplayable (rotted). I was only able to get data from 2 discs: 5 copies from one disc and 2 copies from another.

Just eyeballing the grid of numbers, with the differing values highlighted, the first impression is that this data recovery project looks hopeful. In many cases the errors are isolated and should be easy to weed out. The blocks of errors are mostly not long runs. Usually there are not more than two values appearing for a given sample.

This 50-second stereo track contains somewhat over 2,000,000 16-bit L-R sample pairs. Running the program on all 7 copies produced this (highly edited) output:

E1.wav E2.wav E3.wav E4.wav E5.wav N1.wav N2.wav
2200888 BLOCK 8363: BLKLINESBAD 1938[1853,3677,2247,222,220,70,17,12,10]35 OVERS, MAX 1938
, VALSMAX 2[0,8216,144,2,1,0,0] // TOTMINMAX 3[3085,4054,1176,46,2,0,0]

Interpratation:
2200888 samples processed
8363 blocks (chunks) of errors
1853 blocks contained 1 line of differing samples
3677 blocks contained 2 lines of differing samples
2247 blocks contained 3 lines of differing samples...
35 blocks contained over 9 lines of differing samples
the longest block contained 1938 lines of differing samples

Each set of samples is analyzed to count how many different values appear, and to count the total of "minority" values (how many values are different from the one that is most common). For example, the seven values 100, 100, 100, 102, 100, 97, 100 contain 3 different values and 2 minority values. Each block has a max result for those two measures; these results are tallied for all the blocks.

8216 blocks contained a max of 2 different values for any one sample
144 blocks contained a max of 3 different values for any one sample
2 blocks contained a max of 4 different values for any one sample
1 block contained a max of 5 different values for any one sample
Usually there were only 2 different values, and there were no cases of 6 or 7 different values.

3085 blocks contained a max of 1 minority value for any one sample
4054 blocks contained a max of 2 minority values for any one sample
1176 blocks contained a max of 3 minority values for any one sample...
There were only 48 cases where the most common value was not in the majority, and no cases where all the values were different -- there were always at least two of the same value.

E4 and E5 were made by fre:ac without jitter correction. The other copies were made by fre:ac with jitter correction. It was not feasible to get any data with cdparanoia enabled in fre:ac. EAC was also unable to make any progress (it is very difficult to get any data at all from these rotted CD-Rs). Excluding E4 and E5 produces this output:

E1.wav E2.wav E3.wav N1.wav N2.wav
2201472 BLOCK 7524: BLKLINESBAD 588[1716,3405,1914,194,187,49,13,10,9]27 OVERS, MAX 588
, VALSMAX 0[0,7412,110,2,0,0,0] // TOTMINMAX 0[2688,4801,35,0,0,0,0]
OVERS:,10,11,10,10,23,10,10,11,14,10,12,13,10,31,13,10,10,11,10,425,11,10,11,10,10,11,588

The number of bad blocks of data has decreased by 8%. The number of blocks that had over 9 bad lines has decreased from 35 to 27. Only 35 blocks contained a case of 3 "minority" votes (out of 5), so the count of "majority-minority" blocks has been reduced from 48 to 35. The new "OVERS" output shows that almost all the blocks contained no more than 14 bad lines; the exceptions were 23, 31, and one big run of 425. (The 588 seems to be exactly one CD sector of glitch in the junk at the end of the track.)

The program takes about an hour to run on these 50-second tracks, on an underpowered WinXP computer. Not good, but tolerable, considering this is a free BASIC interpreter that is easy to use for testing ad hoc changes to get this utility tool to do what you need...

I've spared you the details of the grids of numbers that are output, which are really the simplest and most useful feature of this tool. It only takes the program about a minute to start producing such output. Staring at the numbers will give you a good idea whether this approach is likely to be fruitful for your data recovery project.

It seems like just auto-picking the median value would be a very effective approach to reconciling these copies. The next step is to find a way to evaluate that idea. Access to the raw numbers is fundamental, but having access to the associated waveform graphics as well would be much better.
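
As a quick illustration of that median idea, here is a rough, untested sketch of the per-sample selection. It assumes five already-aligned, equal-length copies with plain 44-byte headers; a real version would also have to handle the one-sector offset of the N files (as the full difference-inspection program below does) and write the chosen values to an output .wav.

CODE
' Sketch (untested): pick the per-sample median across 5 aligned copies.
' File names, the 44-byte header skip and the equal-length assumption are
' placeholders; the 588-sample (one sector) offset of the N files is ignored
' here, and the median is only computed, not yet written to an output file.
DIM INF$(5),F(5),V(4) ' V holds the 5 candidate values for one sample slot
INF$(1)="E1.wav" : INF$(2)="E2.wav" : INF$(3)="E3.wav"
INF$(4)="N1.wav" : INF$(5)="N2.wav"
FOR I = 1 TO 5
F(I)=FREEFILE
OPEN INF$(I) FOR INPUT AS #F(I)
FOR J = 1 TO 44
DISCARD=BGETC(F(I))
NEXT J
NEXT I
WHILE NOT EOF(F(1))
FOR CH = 1 TO 2 ' left then right channel of this sample frame
FOR I = 1 TO 5
B1=BGETC(F(I)) : B2=BGETC(F(I))
S=B2*256+B1
IF S>=(2^15) THEN S=S-(2^16)
V(I-1)=S
NEXT I
SORT V
MED=V(2) ' the 3rd of 5 sorted values, i.e. the median
' (write MED to an output .wav here, or tally how often it differs)
NEXT CH
WEND
FOR I = 1 TO 5
CLOSE #F(I)
NEXT I
? "done"
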

CODE
' CDrot.bas (SmallBASIC FLTK 0.10.6 Windows XP)
' CD disc rot processing
' kd 27 jan 2012
' FIND DIFFERENCES IN .WAV FILE AND DUMP DATA BLOCK
' DUMP3 - BEFORE AND AFTER DATA BLOCK VERSION, WITH BLOCK STATISTICS AND OUTPUT FILE

PRINT DATE$;" ";TIME$;" START of CDrot-DUMP3-27JAN12.bas"
?
OUT=FREEFILE
NEWLINE$=CHR(13)+CHR(10)
NEWLINE$=CHR(13) ' TRY TO FIND A WAY IN WINDOWS FOR TEXT FILES TO WORK IN BOTH NOTEPAD AND WORDPAD
OPEN "DUMP3-OUT.TXT" FOR APPEND AS #OUT
PRINT #OUT,;NEWLINE$
PRINT #OUT, DATE$;" ";TIME$;" START of CDrot-DUMP3-27JAN12.bas";NEWLINE$

DIM PAST$(9) ' STORE PAST DATA LINES
SAMECNT=999 ' COUNT SUCCESSIVE LINES OF ALL-SAME DATA
BLOCK=0 ' COUNT DATA BLOCKS DUMPED OUT
BLKBAD=0 : MAXBLKBAD=0 ' COUNT BAD LINES IN A DATA BLOCK
VALSMAX=0 ' MAXIMUM NUMBER OF DIFFERENT VALUES IN ONE CHANNEL SET
TOTMINMAX=0 ' MAXIMUM NUMBER OF MINORITY VALUES IN ONE CHANNEL SET
DIM VMAX(1 TO 7),TMAX(1 TO 7) ' ARRAYS OF VALSMAX AND TOTMINMAX BLOCK RESULT TALLIES
BLKBADOVR=0 : BBOVR$="" : DIM BLKBADA(1 TO 9) ' TALLY BLKBAD RESULTS
N=0 ' COUNT THE INPUT/OUTPUT DATA SAMPLES

REM OPEN INPUT FILES
DIM INFILE$(9),F(9)
INFILE$(1)="E1.wav"
INFILE$(2)="E2.wav"
INFILE$(3)="E3.wav"
' INFILE$(4)="E4.wav"
' INFILE$(5)="E5.wav"
INFILE$(6)="N1.wav"
INFILE$(7)="N2.wav"
INFILES=0
' N=15000 : PRINT "SKIPPING SAMPLES: ";N
FOR I = 1 TO 9
IF INFILE$(I)<>""
PRINT INFILE$(I);" ";
PRINT #OUT, INFILE$(I);" ";
F(I)=FREEFILE
OPEN INFILE$(I) FOR INPUT AS #F(I)
INFILES += 1
REM SKIP OVER FILE HEADERS
JJ=11 : IF LEFT(INFILE$(I),1)="N" THEN JJ=JJ+588 ' 588 SAMPLES ARE ONE SECTOR SHIFT
JJ=JJ+N ' SPEED UP START
FOR J = 1 TO 4*JJ
DISCARD=BGETC(F(I))
NEXT J
ENDIF
NEXT I
PRINT : PRINT INFILES;" INPUT FILES" : PRINT
PRINT #OUT,;NEWLINE$ : PRINT #OUT, INFILES;" INPUT FILES";NEWLINE$

REM READ AND PROCESS ONE SET OF SAMPLES FROM EACH INPUT FILE

K215=2^15 : K216=2^16 : KBIG=K216+K215
DIM OUT$(9)
DIFF=0

100
L$="" : R$=""
LD=0 : RD=0
LOLD=KBIG ' INITIALIZE WITH INVALID SAMPLE VALUES
ROLD=KBIG
DIM LA(9),RA(9) ' ARRAYS OF LEFT AND RIGHT VALUES
FOR I = 0 TO 9
LA(I)=KBIG : RA(I)=KBIG
NEXT I
FOR I = 1 TO 9
IF INFILE$(I)<>""
L1=BGETC(F(I))
L2=BGETC(F(I))
R1=BGETC(F(I))
R2=BGETC(F(I))
LL=L2*256+L1
RR=R2*256+R1
IF LL>=K215 THEN LL=LL-K216
IF RR>=K215 THEN RR=RR-K216
LA(I)=LL : RA(I)=RR
IF LL<>LOLD THEN LD=LD+1
IF RR<>ROLD THEN RD=RD+1
LLL$=" " : IF LL<>LOLD THEN LLL$=" *"
RRR$=" " : IF RR<>ROLD THEN RRR$=" *"
LOLD=LL : ROLD=RR
L$=L$+LLL$+FORMAT("#####0",LL)
R$=R$+RRR$+FORMAT("#####0",RR)
ENDIF
NEXT I

N=N+1 : ' IF N>15 THEN 900
LLL$=LEFT(" *********",LD)
RRR$=LEFT(" *********",RD)

'IF (LD>1 OR RD>1) THEN PRINT N,L$,R$

INSERT PAST$,0,(FORMAT("##,###,000",N)+L$+" /"+R$)
DELETE PAST$,10
SAME = NOT (LD>1 OR RD>1)

IF NOT SAME

SORT LA ' GROUP SAME VALUES IN ORDER
V=LA(0) : LA(0)=1 : I=1 ' REPLACE THE VALUES WITH A COUNT OF SAME ONES
WHILE I<=UBOUND(LA)
IF LA(I)=V
DELETE LA,I
LA(I-1) = LA(I-1)+1
ELSE
V=LA(I) : LA(I)=1 : I += 1
ENDIF
WEND
DELETE LA,UBOUND(LA) ' DELETE LAST VALUE WHICH IS COUNT OF INVALID KBIGS
SORT LA ' SORT THE VALID COUNTS OF SAME VALUES
VALS=UBOUND(LA)+1 : IF VALS>VALSMAX THEN VALSMAX=VALS
TOTMIN=SUM(LA)-LA(UBOUND(LA)) : IF TOTMIN>TOTMINMAX THEN TOTMINMAX=TOTMIN

SORT RA ' GROUP SAME VALUES IN ORDER
V=RA(0) : RA(0)=1 : I=1 ' REPLACE THE VALUES WITH A COUNT OF SAME ONES
WHILE I<=UBOUND(RA)
IF RA(I)=V
DELETE RA,I
RA(I-1) = RA(I-1)+1
ELSE
V=RA(I) : RA(I)=1 : I += 1
ENDIF
WEND
DELETE RA,UBOUND(RA) ' DELETE LAST VALUE WHICH IS COUNT OF INVALID KBIGS
SORT RA ' SORT THE VALID COUNTS OF SAME VALUES
VALS=UBOUND(RA)+1 : IF VALS>VALSMAX THEN VALSMAX=VALS
TOTMIN=SUM(RA)-RA(UBOUND(RA)) : IF TOTMIN>TOTMINMAX THEN TOTMINMAX=TOTMIN

ENDIF

SAMEMAX=3
IF SAMECNT=SAMEMAX ' POSSIBLE END OF DATA DUMP BLOCK
IF SAME
SAMECNT += 1 ' YES- END OF BLOCK
IF BLKBAD>9
BLKBADOVR += 1
BBOVR$=BBOVR$+","+STR(BLKBAD)
ELSE
BLKBADA(BLKBAD) = BLKBADA(BLKBAD)+1 ' TALLY RESULTS
ENDIF
IF BLKBAD>MAXBLKBAD THEN MAXBLKBAD=BLKBAD
VMAX(VALSMAX)=VMAX(VALSMAX)+1 : TMAX(TOTMINMAX)=TMAX(TOTMINMAX)+1 ' TALLY RESULTS
PRINT "BLOCK ";BLOCK;": BLKLINESBAD ";BLKBAD;BLKBADA;BLKBADOVR;" OVERS, MAX ";MAXBLKBAD;
PRINT ", VALSMAX ";VALSMAX;VMAX;" // TOTMINMAX ";TOTMINMAX;TMAX
PRINT "OVERS:";BBOVR$

PRINT #OUT,N;" BLOCK ";BLOCK;": BLKLINESBAD ";BLKBAD;BLKBADA;BLKBADOVR;" OVERS, MAX ";MAXBLKBAD;;NEWLINE$
PRINT #OUT, ", VALSMAX ";VALSMAX;VMAX;" // TOTMINMAX ";TOTMINMAX;TMAX;NEWLINE$
PRINT #OUT, "OVERS:";BBOVR$;NEWLINE$

JUNK$="" : ' IF VALSMAX>2 THEN INPUT "WAITING ";JUNK$
IF JUNK$="STOP" THEN 900
VALSMAX=0 : TOTMINMAX=0 ' RESET FOR NEXT BLOCK
ELSE
SAMECNT = 0
PRINT PAST$(0) ' NOT END OF BLOCK - KEEP DUMPING
BLKBAD += 1
ENDIF
ELSEIF SAMECNT>SAMEMAX ' NOT CURRENTLY OUTPUTTING
IF SAME
SAMECNT += 1 ' SAME BLANKNESS CONTINUES
ELSE
SAMECNT = 0 ' START DUMPING A BLOCK OUT
BLOCK += 1
PRINT "BLOCK ";BLOCK
PRINT PAST$(3)
PRINT PAST$(2)
PRINT PAST$(1)
PRINT PAST$(0)
BLKBAD=1 ' START COUNTING BAD LINES IN THIS BLOCK
ENDIF
ELSE
IF SAME
SAMECNT += 1
ELSE
SAMECNT = 0
BLKBAD += 1
ENDIF
PRINT PAST$(0) ' CONTINUE OUTPUTTING
ENDIF

' IF LD>4 THEN 900
' IF NOT EOF(F(1)) THEN 100
FOR I = 1 TO 9
IF INFILE$(I)<>""
IF EOF(F(I)) THEN 800
ENDIF
NEXT I
GOTO 100 ' LOOP BACK UP TO ANOTHER SET OF SAMPLES IF NO EOF
800
PRINT "EOF"
900
PRINT
PRINT N;" BLOCK ";BLOCK;": BLKLINESBAD ";BLKBAD;BLKBADA;BLKBADOVR;" OVERS, MAX ";MAXBLKBAD
PRINT ", VALSMAX ";VALSMAX;VMAX;" // TOTMINMAX ";TOTMINMAX;TMAX
PRINT "OVERS:";BBOVR$
PRINT #OUT,N;" BLOCK ";BLOCK;": BLKLINESBAD ";BLKBAD;BLKBADA;BLKBADOVR;" OVERS, MAX ";MAXBLKBAD;;NEWLINE$
PRINT #OUT, ", VALSMAX ";VALSMAX;VMAX;" // TOTMINMAX ";TOTMINMAX;TMAX;NEWLINE$
PRINT #OUT, "OVERS:";BBOVR$;NEWLINE$
FOR I = 1 TO 9
IF INFILE$(I)<>""
CLOSE #F(I)
ENDIF
NEXT I
PRINT #OUT, DATE$;" ";TIME$;" END OF CD rot DUMP3 27JAN12";NEWLINE$
PRINT #OUT,;NEWLINE$ : PRINT #OUT,;NEWLINE$
CLOSE #OUT
PRINT DATE$;" ";TIME$;" END OF CD rot DUMP3 27JAN12"
STOP