Topic: Quantization Grid

Quantization Grid

Reply #101
My textbooks on the subject speak clearly about quantization error, so I reject KMD's claim to the contrary. Maybe the problem has to do with glancing at the pictures instead of reading the text and equations?

KMD has posted this claim before. When I asked him exactly which textbook he was referring to, he declined to answer. I think he should either cite his source, or stop arguing about it.


 

Quantization Grid

Reply #103
The 8-bit signal (which includes quantisation error), but analysed at the end of this process...
http://www.hydrogenaudio.org/forums/index....st&p=790120
...which was to show that individual quantisation steps can kind-of survive reconstruction filtering. I guess it also shows that dither "works", though there is no need for those extra steps if that's what you want to prove.
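
As a rough illustration of the dither point (not the linked analysis itself), here is a minimal numpy sketch that quantises a low-level tone to 8 bits with and without TPDF dither; the signal level, length and frequency are arbitrary assumptions:

[code]
# Sketch: 8-bit quantisation of a low-level sine with and without TPDF dither.
# All parameters (amplitude, length, frequency, sample rate) are arbitrary.
import numpy as np

fs = 48000
n = np.arange(1 << 16)
x = 0.01 * np.sin(2 * np.pi * 1000 * n / fs)      # quiet 1 kHz tone

step = 2.0 / 256                                   # 8-bit quantiser step

def quantise(sig, dither=False):
    # TPDF dither: sum of two uniform random values, +/- 1 LSB peak
    d = (np.random.rand(len(sig)) - np.random.rand(len(sig))) * step if dither else 0.0
    return np.round((sig + d) / step) * step       # mid-tread quantiser

err_plain = quantise(x) - x                        # error correlated with the tone
err_dith  = quantise(x, dither=True) - x           # error decorrelated from the tone

# Without dither the error piles up in distortion harmonics of the tone;
# with dither it spreads into a noise floor, so the peak error bin drops.
for name, e in [("plain", err_plain), ("dithered", err_dith)]:
    spec = 20 * np.log10(np.abs(np.fft.rfft(e * np.hanning(len(e)))) + 1e-12)
    print(name, "peak error-spectrum bin:", round(float(spec.max()), 1), "dB")
[/code]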

Cheers,
David.

Quantization Grid

Reply #104
The quantization grid is not caused by sampling; it is caused by the interaction of quantization levels with sampling points. It may not be perceptible, but there is no doubt that a waveform created from a digital file must be formed from a selection of points chosen from a finite number of pre-determined, regularly spaced co-ordinates. The digital file is formed from regularly spaced sampling points and regularly spaced quantization levels, therefore the waveform derived from it must have a corresponding regularity.


I work more in commercial TV, where video has been digitized since the mid '70s. If what you're describing existed, you would not be able to display diagonal lines, particularly nearly vertical ones. I assure you that is not the case. The time resolution is infinitely variable. OK, I can only measure reliably to a nanosecond, but for all practical purposes....
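
A minimal sketch of that time-resolution point, under arbitrary assumptions (8-bit levels, a 997 Hz tone, a truncated sinc): interpolate quantised samples and the reconstructed values are not confined to the quantisation levels, while the zero-crossing time is resolved far finer than the sample period.

[code]
# Sketch: reconstruct quantised samples with (truncated) sinc interpolation and
# check whether the output sticks to the quantisation grid or the sample clock.
# Bit depth, frequency and the 64x oversampled time axis are arbitrary choices.
import numpy as np

fs = 48000
n = np.arange(256)
x = np.sin(2 * np.pi * 997.0 * n / fs + 0.3)       # 997 Hz sine, offset phase

step = 2.0 / 256                                   # 8-bit quantisation
xq = np.round(x / step) * step                     # the samples ARE on the grid

t_fine = np.arange(0, len(n), 1 / 64)              # 64 points per sample period
recon = np.array([np.sum(xq * np.sinc(t - n)) for t in t_fine])

# Only points that land exactly on the original sample instants are guaranteed
# to sit on a quantisation level; in between, the waveform takes other values.
frac = np.mod(recon, step)
on_grid = np.isclose(frac, 0, atol=step * 1e-3) | np.isclose(frac, step, atol=step * 1e-3)
print("fraction of reconstructed points on a quantisation level:",
      round(float(on_grid.mean()), 3))

# Zero-crossing time found on the fine axis: resolved to 1/64 of a sample here,
# and in principle limited by noise, not by the sample clock.
zc = t_fine[np.where(np.diff(np.sign(recon)) > 0)[0][0]]
print("first positive-going zero crossing at sample time", round(float(zc), 4))
[/code]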



Quantization Grid

Reply #105
I work more in commercial TV, where video has been digitized since the mid '70s. If what you're describing existed, you would not be able to display diagonal lines, particularly nearly vertical ones. I assure you that is not the case.
Actually, images and video resist the use of "ideal" filters, so aliasing is quite common. Those diagonal lines are often quite "steppy".

Cheers,
David.

Quantization Grid

Reply #106
Actually, images and video resist the use of "ideal" filters, so aliasing is quite common. Those diagonal lines are often quite "steppy".


Surely that is because most visual output devices have the same resolution as, or lower than, the input signal?

Quantization Grid

Reply #107
Actually, images and video resist the use of "ideal" filters, so aliasing is quite common. Those diagonal lines are often quite "steppy".


Surely that is because most visual output devices have the same resolution as, or lower than, the input signal?
No, it's because you can't use a sinc filter with a picture - you'll introduce visible ringing.

Full HD TV, 1:1 pixel-mapped LCD - two examples of resolution matched to the source - and neither has any filter on the output.

Similar problems with image capture for HD video.

Lesser problems these days for still images, because the lens itself often acts as a low pass filter for the x-mega-pixel sensor.

But you can't cleanly get "close" to the Nyquist limit with images - not as close as you can with audio.
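
A quick sketch of the ringing problem, using an arbitrary cutoff and kernel length: brick-wall (truncated sinc) filter a 1-D step edge and the Gibbs overshoot/undershoot appears as a visible halo either side of the edge, which is why image pipelines accept gentler filters and some aliasing instead.

[code]
# Sketch: why a near-ideal (sinc) low-pass is unattractive for images.
# Filter a 1-D step edge with an unwindowed truncated-sinc kernel and measure
# the Gibbs overshoot/undershoot. Cutoff and kernel length are arbitrary.
import numpy as np

edge = np.r_[np.zeros(256), np.ones(256)]          # black-to-white edge, 0..1

cutoff = 0.25                                      # cycles/pixel (Nyquist = 0.5)
k = np.arange(-64, 65)
kernel = 2 * cutoff * np.sinc(2 * cutoff * k)      # ideal low-pass, truncated
kernel /= kernel.sum()

filtered = np.convolve(edge, kernel, mode="same")

# Roughly 9% over/undershoot for a true brick wall (Gibbs) -- a visible halo
# either side of the edge, which is why practical image filters trade a soft
# roll-off (and some aliasing) for less ringing.
print("overshoot above white:", round(float(filtered.max() - 1.0), 3))
print("undershoot below black:", round(float(-filtered.min()), 3))
[/code]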

Cheers,
David.

Quantization Grid

Reply #108
Lesser problems these days for still images, because the lens itself often acts as a low pass filter for the x-mega-pixel sensor.
Do you mean the lens on the sensor within the chip, or the main lens? I think there are filters within the sensor. I have a vague recollection, but I cannot remember what the filters and lenses on the sensor actually do.

Quantization Grid

Reply #109
Lesser problems these days for still images, because the lens itself often acts as a low pass filter for the x-mega-pixel sensor.
Do you mean the lens on the sensor within the chip, or the main lens? I think there are filters within the sensor. I have a vague recollection, but I cannot remember what the filters and lenses on the sensor actually do.
There is usually a filter in front of the sensor. But you can't get the "flat to within a few percent of Nyquist / kill everything above Nyquist" response we're used to in audio.

Some people think they can do better than is usually achieved these days...
http://www.imaging.org/ist/publications/re...1_MA_7876_4.pdf

Cheers,
David.

Quantization Grid

Reply #110
Lesser problems these days for still images, because the lens itself often acts as a low pass filter for the x-mega-pixel sensor.
Do you mean the lens on the sensor within the chip, or the main lens? I think there are filters within the sensor. I have a vague recollection, but I cannot remember what the filters and lenses on the sensor actually do.


A lens is a low-pass filter, so in practice it serves as the anti-aliasing filter if nothing else. Likewise, the pixels on a camera tend to be quite wide relative to their pitch, so they integrate over a finite width and thus further low-pass the signal.

Often, though, some degree of aliasing is tolerated in imaging systems because it's difficult to build optical filters with steep drop-offs. It's quite common to design imaging systems with the spot size of the optics matched to 2x the pixel size, and then to just let the finite pixel width low-pass away much of the frequency content that would alias.
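
To put a rough number on the pixel-aperture point: the aperture MTF of a square pixel of width w is |sinc(f*w)|, which is still around 0.64 at Nyquist even for a 100% fill factor, so the aperture alone is a very gentle anti-alias filter and the optics have to do the rest. A small sketch (the fill factors are assumptions for illustration):

[code]
# Sketch: how much low-pass filtering a square pixel aperture provides on its own.
# Aperture MTF of a width-w pixel is |sinc(f * w)|; np.sinc is the normalised sinc.
# Pixel pitch is taken as 1; the fill factors are assumptions for illustration.
import numpy as np

pitch = 1.0
nyquist = 0.5 / pitch                              # cycles per pixel pitch

for fill in (1.0, 0.8, 0.5):
    w = fill * pitch
    mtf = abs(np.sinc(nyquist * w))
    print(f"fill factor {fill:.1f}: aperture MTF at Nyquist = {mtf:.2f}")

# Around 0.64 even at 100% fill -- nothing like the near-zero response above
# Nyquist that an audio anti-alias filter gives, so some aliasing is tolerated
# and the lens spot size (e.g. matched to ~2 pixels) has to do the rest.
[/code]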