par2 problems
jrbamford
post Sep 23 2003, 20:16
Post #1








I just made a par2 backup of a collection ready for DVD. I ended up with nearly 4 GB of data and 400 MB or so of par2 files.

It took an hour or so to make. I came back, clicked verify, and found that some par files said they were damaged straight away, with it just saying:

damaged (n blocks available)

It then says repair is not needed, as I assume the data files are obviously OK. Any ideas what this is about? Could something have gone wrong when they were being created, or is it normal for files to be damaged straight away?


--------------------
Binaural recordings of mine: http://binaural.jimtreats.com
tigre
post Sep 23 2003, 23:03
Post #2





This could be caused by hardware problems, e.g. bad RAM. I once had similar problems: every software RAM test I ran said everything was fine (including the Quake demo etc.), but the BIOS self-test (the slow/extended one) reported an error. When copying large files or burning files to CD, I got about one random error per 100 MB of data. A new RAM stick solved the problem.


--------------------
Let's suppose that rain washes out a picnic. Who is feeling negative? The rain? Or YOU? What's causing the negative feeling? The rain or your reaction? - Anthony De Mello
sven_Bent
post Sep 23 2003, 23:29
Post #3








QUOTE (tigre @ Sep 23 2003, 11:03 PM)
All software RAM tests I performed said everything's fine (including Quake demo etc.) ...

That is nowhere near a RAM test.
You should consider using a real memory tester like Memtest86 (open source and free):
www.memtest86.com

So far I have had one stick of RAM that was bad which Memtest86 did not flag as bad in its normal test run.
Just to clarify: I test more than five sticks of RAM a day.

This post has been edited by sven_Bent: Sep 23 2003, 23:31


--------------------
Sven Bent - Denmark
tigre
post Sep 24 2003, 00:15
Post #4





OK, I didn't express myself clearly. IIRC I used about ten different RAM test programs after noticing the error message at system startup, Memtest86 among them, without any result. In cases like this, I was told, running the Quake demo for several hours can produce crashes that help reveal remaining problems (could be RAM, could be another component) that won't be detected otherwise. I wasn't able to verify whether the Quake demo is useful for this kind of test, because nothing crashed. So I'd say it's a good idea to at least use software made for testing RAM first if a related problem is suspected.


PeterBClements
post Sep 26 2003, 09:08
Post #5








QUOTE (jrbamford @ Sep 23 2003, 08:16 PM)
I just made a par2 backup for a collection ready for DVD.. I ended up having nearly 4gig data, 400meg or so for the par2 files...

it took an hour or so to make, i came back and clicked verify and found that some par files said they were damaged straight away

Just to be clear, you are saying that QuickPar says the .par2 files are damaged but that the data files are ALL good (and hence that repair is not necessary).

Can you confirm whether or not the number of blocks reported as available in the par2 files actually matches the "+nnn" part of the filename?

Since you say "some par files said they were damaged", I assume you created a number of them. If so, how small is the smallest "damaged" one, and how small is the smallest "good" one? If they are small enough, I would like you to send them to me so that I can analyse them and figure out what is wrong.

PS I assume you used version 0.6 of QuickPar.
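The filename check described above can be scripted. A minimal sketch in Python, assuming the conventional `name.volNNN+MMM.par2` naming that QuickPar produces (`expected_blocks` is a hypothetical helper for illustration, not part of any par2 tool):

```python
import re

def expected_blocks(filename):
    """Parse the '+nnn' block count out of a PAR2 volume filename.

    Recovery volumes are conventionally named like
    'archive.vol013+010.par2'; the number after '+' is how many
    recovery blocks the file should contain, so a verifier that
    reports fewer available blocks indicates a damaged file.
    """
    m = re.search(r"\.vol(\d+)\+(\d+)\.par2$", filename, re.IGNORECASE)
    if m is None:
        return None  # not a recovery volume (e.g. the index .par2)
    return int(m.group(2))

print(expected_blocks("binaurals.vol013+010.par2"))  # 10
```

Comparing this number against the block count the verifier reports for each file would flag the damaged volumes automatically.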
jrbamford
post Sep 26 2003, 21:35
Post #6








OK, well, I've put a grab from QuickPar (which, yes, is version 0.6) up in the uploads section:

http://www.hydrogenaudio.org/forums/index....ST&f=35&t=13602

And as you can see, no, the number of blocks is lower on the damaged ones, hence damaged blocks, I guess.

The smallest broken file is 15 MB; the smallest OK one is 1 MB. I have no problem uploading this, but at the moment I don't have any decent temporary space to put such things. With blocks actually being damaged, I guess that just means it's screwed and perhaps my hardware isn't right? I will try copying the files across and running it on another machine, or I guess I could try it over the network. It's going to take a while either way.

Thanks


PeterBClements
post Sep 26 2003, 22:37
Post #7








QUOTE (jrbamford @ Sep 26 2003, 09:35 PM)
and as you can see, no the number of blocks is less on the damaged ones, hence damaged blocks i guess

as you can see the smallest broken file is 15meg, the smallest ok one is 1meg... i have no problem uploading this but am at the moment without any decent temporary space to put such things.. with blocks actually being damaged i guess that just means its screwed and that perhaps my hardware isn't right!?!? i will try to copy across the files and try it on another machine, or i guess i could try it over the network... its going to take a while either way...

Thanks

Yep, fewer blocks reported in the par2 files confirms that they are damaged.

Could you try the following:

Make a backup copy of binaurals.vol013+010.par2 and then use QuickPar to recreate just that single par2 file (i.e. specify first block = 13, number of blocks = 10, number of files = 1).

Then do a binary file comparison of the backup and the new file.
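On Windows the comparison can be done with `fc /b backup.par2 recreated.par2`. As a rough cross-platform alternative, here is a short Python sketch that also reports where the first mismatch occurs (`first_difference` is a hypothetical helper, not something QuickPar provides):

```python
import filecmp

def first_difference(path_a, path_b, chunk_size=1 << 20):
    """Return the byte offset of the first mismatch between two
    files, or None if they are byte-for-byte identical."""
    if filecmp.cmp(path_a, path_b, shallow=False):
        return None
    offset = 0
    with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
        while True:
            a = fa.read(chunk_size)
            b = fb.read(chunk_size)
            if a != b:
                # Scan this chunk pair byte by byte for the mismatch.
                for i, (x, y) in enumerate(zip(a, b)):
                    if x != y:
                        return offset + i
                # Chunks agree where they overlap: one file is shorter.
                return offset + min(len(a), len(b))
            if not a:  # both files ended with no difference found
                return None
            offset += chunk_size
```

Where the mismatch falls, and whether it is a single flipped bit, can hint at whether the corruption happened in memory or on disk.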
jrbamford
post Sep 26 2003, 23:28
Post #8








Thanks Peter, I will do.

Since last posting, I copied everything across my network to another machine and made a recovery set there. It just finished and everything was fine. I noticed it was making slightly different sized recovery data, so I'm now trying to make that size on the machine that fails. I've taken a backup of the settings that were left in QuickPar from the last run (which I assume broke), and if this machine succeeds in making this set, I will immediately retry the same settings that had previously failed.

Another thing: this other machine is usually my PVR machine, and I'm pretty sure it was recording TV at the time I made the other PAR2 files. I'm now trying it totally idle. It may help, but I'd rather it were slower and not faulty at the end of it :)

I am still quite bewildered by the wide range of options in PAR2, but following your recent advice I assume I have to set the redundancy and block sizes to be the same to be able to do what you just requested, i.e. recalculating one of the volumes. It's not a problem, as I have the screen grab of the settings that previously failed, so I can set it back to that (a slightly higher redundancy is all that's different).

If we end up discovering that there is something wrong with my machine, then I believe your par2 code may be adopted as a good overclocker stress test program: this machine that fails with par2 has previously passed system tests that failed on other machines. I'm not excited by the prospect of it being faulty in some way, though :(


jrbamford
post Sep 27 2003, 01:53
Post #9








OK, well, it's late, but on the machine that failed twice at the one setting, I've just managed to successfully create a recovery set with the same settings as on the other machine that succeeded. I now need to try the failed configuration on the other machine and see if that too fails. If it does, I'm starting to think it may be a bug (two separate hardware configurations both failing, yet succeeding with slightly different redundancy sizes).

I tried just now to do the test you mentioned above, but the number of files was greyed out, so it created four or so files rather than one. As the only difference between success and failure is the redundancy amount, I don't think creating a small file set will reproduce the failure.

It's late now, so I will try tomorrow (taking an hour is a pain) and will report back any results. It's not exactly an easy data set to send around. If it is a bug, it's obviously obscure, requiring 4 GB of data and a specific setting to reproduce. If it IS a bug, that is :)


PeterBClements
post Sep 27 2003, 11:35
Post #10








The "Number of recovery files" option is greyed out if the "Recovery File Size" option is set to "Variable (limited to size of largest data file)". If you select either of the other two choices, then you will be able to adjust the file count down to 1.

Over the past few months, I've had a number of users in your situation (i.e. where it sometimes fails on one computer), and so far it has always turned out to be a hardware fault.

In fact I have experienced this myself with a faulty 512MB memory module. Most of the time applications would work perfectly, but sometimes applications would do strange things and occasionally I would get a BSOD. It was only when I was debugging an early version of QuickPar and trying to figure out why it was occasionally failing that I tracked down the cause. Once the memory was replaced, the problem went away.

One of the reasons QuickPar seems to show up memory faults is that, when creating par2 files, it allocates up to 1/3 of the total physical memory. This increases the likelihood that, if there is a faulty bit of memory, Windows will give it to QuickPar to use.
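That allocation behaviour is the same principle a crude software RAM check relies on: touch as much physical memory as possible and verify what you wrote. An illustrative Python sketch of the idea (a simple fill-and-verify pass; a real tester like Memtest86 does far more, and a script cannot control which physical pages the OS hands it):

```python
import array

def pattern_check(mib=256, passes=2):
    """Crude RAM sanity check: fill a large buffer with alternating
    bit patterns, then read each word back and count mismatches.
    On healthy hardware this always returns 0; a flaky module can
    show up here the same way it shows up as bad par2 blocks."""
    n = (mib << 20) // 8                      # number of 64-bit words
    buf = array.array("Q", [0]) * n           # one big contiguous buffer
    errors = 0
    for p in range(passes):
        pattern = 0xAAAAAAAAAAAAAAAA if p % 2 == 0 else 0x5555555555555555
        for i in range(n):                    # write pass
            buf[i] = pattern
        for i in range(n):                    # verify pass
            if buf[i] != pattern:
                errors += 1
    return errors
```

Like QuickPar's 1/3-of-RAM buffer, raising `mib` increases the chance a bad physical page ends up in the buffer, though a dedicated tester that boots outside the OS remains far more reliable.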
