Distraction Free Rendering
The stills below are from the filtering talk I gave a while back.
They show a journey through some folded fractal space.
They are captures from a real-time tracer running at over 120 Hz at 1080p on a 2.8 Tflop/s laptop GPU.
The images have better than 1080p phase response,
meaning the position of an edge can be localized to a sub-pixel precision.
However, the images have lower than 1080p frequency response,
meaning they look substantially sharper when viewed at half resolution.
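To make the phase-response claim concrete, here is a toy numeric sketch of my own (not data from the renderer):
even a band-limited edge, with its high frequencies gone, still encodes its position to sub-pixel precision through the coverage value of the partially lit pixel.

  #include <cstdio>

  int main() {
      // An anti-aliased vertical edge whose true position is x = 2.3;
      // each value is the coverage of the bright side within that pixel.
      const float row[6] = { 0.0f, 0.0f, 0.7f, 1.0f, 1.0f, 1.0f };

      // The partially covered pixel reveals the edge position
      // far more precisely than the pixel pitch.
      for (int x = 0; x < 6; ++x)
          if (row[x] > 0.0f && row[x] < 1.0f)
              std::printf("edge at x = %.2f\n", x + (1.0f - row[x]));  // prints 2.30
  }

That sub-pixel localization is the phase side; the frequency side is how much fine detail survives per pixel, which is why the half-resolution view looks comparatively sharper.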
These properties are a direct result of the following process:
Each frame traces 960x540 rays in a temporally changing low-discrepancy pattern.
Each ray also samples its prior color, reprojected from the prior 1920x1080 frame.
Finally, the {new, old} output at the low-discrepancy 960x540 positions is reconstructed onto a regular 1920x1080 grid.
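Below is a minimal CPU sketch of that per-frame flow, written in C++ for readability.
It assumes an R2-style jitter sequence, a simple exponential blend of {new, old}, and nearest-sample reconstruction;
every function body here is a placeholder for illustration, not the actual tracer.

  #include <algorithm>
  #include <cmath>
  #include <cstdint>
  #include <vector>

  struct Vec3 { float r, g, b; };

  constexpr int kLowW = 960,  kLowH = 540;   // rays traced per frame
  constexpr int kHiW  = 1920, kHiH  = 1080;  // reconstructed output grid

  // Temporally changing low-discrepancy offset shared by all low-res cells.
  // An R2-style additive recurrence is assumed purely for illustration.
  static void frameJitter(uint32_t frame, float& jx, float& jy) {
      const float g = 1.32471795724474602596f;       // plastic constant
      jx = std::fmod(0.5f + frame / g,       1.0f);
      jy = std::fmod(0.5f + frame / (g * g), 1.0f);
  }

  // Stand-in for the fractal tracer: any deterministic function of (u,v)
  // will do for this sketch.
  static Vec3 traceRay(float u, float v) { return { u, v, 0.5f }; }

  // Reproject by point-sampling the prior 1080p frame at the same screen
  // position; the real renderer accounts for camera motion here.
  static Vec3 sampleReprojected(const std::vector<Vec3>& prevHi, float u, float v) {
      int x = std::min(kHiW - 1, int(u * kHiW));
      int y = std::min(kHiH - 1, int(v * kHiH));
      return prevHi[size_t(y) * kHiW + x];
  }

  // Nearest-sample reconstruction onto the regular 1080p grid; the actual
  // reconstruction filter is not described above and is certainly better.
  static void reconstructHi(const std::vector<Vec3>& low, std::vector<Vec3>& hi) {
      for (int y = 0; y < kHiH; ++y)
          for (int x = 0; x < kHiW; ++x)
              hi[size_t(y) * kHiW + x] = low[size_t(y / 2) * kLowW + (x / 2)];
  }

  // One frame of the {trace, reproject, blend, reconstruct} loop.
  static void renderFrame(uint32_t frame,
                          const std::vector<Vec3>& prevHi,  // prior 1920x1080
                          std::vector<Vec3>& outHi)         // new 1920x1080
  {
      float jx, jy;
      frameJitter(frame, jx, jy);

      std::vector<Vec3> low(size_t(kLowW) * kLowH);
      for (int y = 0; y < kLowH; ++y)
          for (int x = 0; x < kLowW; ++x) {
              // Jittered sample position in [0,1)^2 screen space.
              float u = (x + jx) / kLowW;
              float v = (y + jy) / kLowH;

              Vec3 fresh = traceRay(u, v);
              Vec3 old   = sampleReprojected(prevHi, u, v);

              // Exponential blend of {new, old}; the 0.1 weight is an assumption.
              const float a = 0.1f;
              low[size_t(y) * kLowW + x] = { a * fresh.r + (1 - a) * old.r,
                                             a * fresh.g + (1 - a) * old.g,
                                             a * fresh.b + (1 - a) * old.b };
          }

      outHi.resize(size_t(kHiW) * kHiH);
      reconstructHi(low, outHi);
  }

  int main() {
      std::vector<Vec3> prev(size_t(kHiW) * kHiH, Vec3{0, 0, 0}), next;
      for (uint32_t frame = 0; frame < 4; ++frame) {
          renderFrame(frame, prev, next);
          prev.swap(next);
      }
  }

The temporally changing jitter is what lets a 960x540 ray budget accumulate sub-pixel positional information across frames,
while history and output live on the regular 1920x1080 grid.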
As you move through the space in real time,
the output feels like a window into another world.
There is nothing in the visual output that breaks immersion.
It makes no difference that, in terms of sharpness, this would only qualify as a 480x270 image
if the evaluation metric were matching the visual sharpness of bitmap text.
The critical element is a total lack of any kind of temporal aliasing,
combined with sub-pixel precision for localizing all edge gradients.