Topic: Complete Video Evaluation Framework

Complete Video Evaluation Framework

Video Quality Studio has reached version 0.3_01, and it now incorporates a tool for subjective video evaluation using formal assessment methods.

It is the first framework to implement both subjective and objective metrics, and it is distributed for free.

Edit: 0.32 is out and supports a method to pseudo-ABX two samples.

More Information
Download

Complete Video Evaluation Framework

Reply #1
Quote
...using formal assessment methods.

Just out of curiosity (and not trying to nitpick):

Are these methods based on a documented standard, like ITU-R BS.1116-1 is for audio?

Complete Video Evaluation Framework

Reply #2
I haven't cared enough to look up whether they are part of a standard, but DSCQS in particular is described in the MPEG-4 book and has been used by MPEG for the validation of MPEG-4...

Complete Video Evaluation Framework

Reply #3
You can find DSCQS in the ITU-R BT.500 Recommendation,
"Methodology for the subjective assessment of the quality of television pictures"
A Dwarf has its own way to fight !

Complete Video Evaluation Framework

Reply #4
Nice, that's good to know!

May I ask what your background is Ghim?

Complete Video Evaluation Framework

Reply #5
I'm just a simple student in signal (picture, sound, video) processing...

But I'm very interested in video quality and I'm currently doing a placement in this field, so I've read quite a bit of documentation on the subject... That's all...
A Dwarf has its own way to fight !

Complete Video Evaluation Framework

Reply #6
I hate cross-posting, so just a note that 0.32 is out.

Changes and everything else can be found on the website and in one of the threads here

Complete Video Evaluation Framework

Reply #7
I'm using the subjective evaluator to test two RV10 clips, wrapped in Matroska containers. One clip is set to reference, the other as test-to-reference. The first clip plays fine, but the one that plays afterwards has a strange color cast. I get the same color cast problem when I try to play the two clips simultaneously in two instances of MPC. In other words, it's as if VQS is starting a second instance of a video player to play back the second test clip, as opposed to reusing the first instance.

Any idea why this is happening? Would you like more information or testing done by me? This is on a Geforce3 with the latest Detonator drivers, DirectX, and WindowsXP updates.

Complete Video Evaluation Framework

Reply #8
This has to do with the surfaces being used. The first instance of MPC is probably being rendered in the overlay, and I believe only one application can use the overlay at a time. So the second video will be rendered without overlay, and hence with different settings.

You could play with the surfaces being used to see if one renders better colors than the other.

Complete Video Evaluation Framework

Reply #9
Okay, good to know. I was assuming it had something to do with the video overlay only handling one video at a time. I tried taking a screenshot of the two MPC players, and of course the one that was using video overlay turned out black.

The question is, why isn't VQS using the overlay for the second clip? I can't give a proper subjective measurement when the second clip is always severely degraded.

Complete Video Evaluation Framework

Reply #10
My favorite method to compare two (or more) videos, with overlay for both, uses Avisynth:

a = AVISource("source1.avi")  # or DirectShowSource
b = AVISource("source2.avi")
v = StackVertical(a,b)
return v

Save it as compare.avs, then open and play compare.avs in, for instance, Media Player Classic. You can even add subtitles with:
a = Subtitle(a, "codec a, parameters a")
Of course, it will then no longer be a blind comparison, but you can just say "A" and "B".

And you can make the video play extra slowly by changing the StackVertical line to
v = StackVertical(a,b).AssumeFPS(5) # or some other FPS number
Sr. Codec Engineer (video) | RealNetworks Codec Group | helixcommunity.org 
This information is provided "AS IS" with no warranties,  grants no rights, and reflects my personal opinion.

Complete Video Evaluation Framework

Reply #11
Quote
my favorite method to compare two (or more) videos, and have overlay for both, uses Avisynth:

a = AVISource("source1.avi")  # or DirectShowSource
b = AVISource("source2.avi")
v = StackVertical(a,b)
return v

This could be useful for me. Unfortunately, I'm testing a screwy video: full-resolution, anamorphic anime encoded in Real Video with the drop-dupe prefilter. I got both videos playing using your technique, but they were stretched vertically, so I added two instances of SimpleResize. Seeking and trim() take forever, though, and the mplayerc.exe process often has to be terminated manually.

If I use clips trimmed with rmeditor, then I get warnings that the video goes out of sync at the end of the clip. If the Real Video clips are embedded in Matroska containers, MPC just hangs at the end of playback, with no warning.

My .avs:
LoadPlugin("D:\Program Files\Gordian Knot\SimpleResize.dll")
a = DirectShowSource("G:\testvideo1.rmvb", fps=24)
b = DirectShowSource("G:\testvideo2.rmvb", fps=24)
a = SimpleResize(a, 720, 368)
b = SimpleResize(b, 720, 368)
v = StackVertical(a,b)
v = trim(v, 1000, 1500)
return v

I'd still need some kind of file randomizer to make the test a blind comparison, though. Hmm... I could create a script that wrote the .avs with the "a" and "b" in StackVertical randomly ordered each time. Then it could execute the .avs, and maybe even prompt me for my ratings at the end.
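A rough sketch of that randomizer idea in Python (the paths and resize values are just the ones from my .avs above; the ratings prompt is left out, and the hidden assignment is written to a separate key file so the comparison stays blind until after voting):

```python
import random

# Hypothetical clip paths; substitute your own.
clips = [r"G:\testvideo1.rmvb", r"G:\testvideo2.rmvb"]
random.shuffle(clips)  # the hidden top/bottom assignment

avs = """LoadPlugin("D:\\Program Files\\Gordian Knot\\SimpleResize.dll")
a = DirectShowSource("%s", fps=24)
b = DirectShowSource("%s", fps=24)
a = SimpleResize(a, 720, 368)
b = SimpleResize(b, 720, 368)
v = StackVertical(a, b)
v = Trim(v, 1000, 1500)
return v
""" % (clips[0], clips[1])

with open("compare.avs", "w") as f:
    f.write(avs)

# Keep the answer key hidden until after voting.
with open("key.txt", "w") as f:
    f.write("top=%s\nbottom=%s\n" % (clips[0], clips[1]))
```

Then the script could launch compare.avs in MPC and, once playback ends, reveal key.txt alongside the recorded ratings.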

I'm creating way too much work for myself.  Anyway, right now I have more urgent tasks, like figuring out who gets my vote on Super Tuesday!

Complete Video Evaluation Framework

Reply #12
I don't see why this is an issue with RV10 only (someone else has reported this too).

The application reuses the graphs it builds, since the clips are played twice; it would be kind of silly to re-create them from scratch.

I think the RV people should shed some light on the issue with their DirectShow filter rather than suggest alternative methods

Complete Video Evaluation Framework

Reply #13
I am sorry, but it's not our DirectShow filter

I apologize for not having tried Video Quality Studio, so I don't really know what the original problem is. I just explained how I usually compare two videos to work around the problem of having only one overlay, which, as we all know, affects video quality so much that it prevents accurate comparisons unless both videos either have overlay or do not. I will add a VQS test to my to-do list

Quote
I don't see why this is an issue with RV10 only (someone else has reported this too).

Exactly what is the problem with RV10 in VQS?

One has to realize that the DirectShowSource method in Avisynth is problematic, since RM files can have a variable framerate. Specifically, when DropDupe is used, frames are dropped and do not exist in the stream, so the fps approximation in DirectShowSource will in many cases not work very well.
Sr. Codec Engineer (video) | RealNetworks Codec Group | helixcommunity.org 
This information is provided "AS IS" with no warranties,  grants no rights, and reflects my personal opinion.

Complete Video Evaluation Framework

Reply #14
Quote
I apologize for not having tried the Video Quality Studio, so I don't really know what the original problem is.

Quote
I don't see why this is an issue with RV10 only (someone else has reported this too).

Exactly what is the problem with RV10 in VQS?

One has to realize that the DirectShowSource method in Avisynth is problematic, since RM files can have variable framerate.

That's ok, I can't expect everyone to have tried my tool. But do give it a shot, it would be good to know what you think.

I haven't tried this case myself, but this is what appears to be happening.

Once you start a test, the graphs for all the files are constructed. When a clip is played, its graph is run, and once it finishes, it is seeked back to the start of the video so it can be played again in the next iteration.

If you're not familiar with DSCQS, have a read at http://www.everwicked.com/content/QualityMetrics/ ; the sequence is [A] {B} [A] {B} [Vote], where A and B are the video samples under test.
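For the curious, that presentation sequence can be sketched in Python (illustrative only; play and vote are hypothetical callbacks, not anything actually in VQS):

```python
import random

def dscqs_trial(reference, processed, play, vote):
    """One DSCQS trial: [A] {B} [A] {B} [Vote], with the
    reference randomly assigned to slot A or slot B."""
    # Hypothetical callbacks: play(clip) shows a clip once,
    # vote() returns (score_a, score_b) on a 0-100 scale.
    a, b = reference, processed
    if random.random() < 0.5:
        a, b = b, a  # hide which slot holds the reference
    for _ in range(2):  # each pair is shown twice before voting
        play(a)
        play(b)
    score_a, score_b = vote()
    # DSCQS reports the difference between the two scores.
    return score_a - score_b, (a is reference)
```

The point of showing each pair twice is that the viewer can refine their impression on the second pass before committing to a score.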

Now, people are reporting that the colors are distorted in the second clip in some cases, and I have the feeling someone reported that it only happens with RV10, but I can't find where that was posted right now, so I might be wrong.

So what I think is happening is that some DirectShow filters are somehow locking the overlay so that no other graph can use it. Again, I might be wrong here. I have nothing against any codec or any person, but that's my best guess with the amount of DirectShow knowledge I have. Someone with greater knowledge will hopefully be able to correct me

PS: The graph's video window and other resources are detached while a clip is not playing.

I wish I had some time to run tests here, but my final-year project is effectively consuming any time I don't spend sleeping or eating

Also, I could write a RealVideo-specific input plugin if the Real SDK licence did not prohibit making players that play formats other than Real. At least that's the way it was last time I checked; let me know if this has changed

Complete Video Evaluation Framework

Reply #15
Quote
Now, people are reporting that the colors are distorted in some cases in the second clip and I have the feeling someone else reported it only happens with RV10 but I can't find where that was posted right now, so I might be wrong.

I just tried using two DivX 5 clips, and they didn't play at all. I just got four black screens in a row, and then a voting prompt. I need to test these clips, and my original problem clips, on another computer or two, because right now I suspect my own computer's configuration is at fault. I'm using DivX 5.1.1, RV10 Elysian 2-24-04, and whatever RealPlayer was available for download last week, plus the decoder enhancements provided with Elysian.

Complete Video Evaluation Framework

Reply #16
Oh my... This sort of post is going to make me kill myself!

Can I have a few seconds of those clips please?

Complete Video Evaluation Framework

Reply #17
Hey, what happened here? Did I scare everyone off with my last message?