

It’s like Nvidia’s DLSS.

The game renders at a lower resolution (which increases performance).

The chip then applies some kind of “AI” (in DLSS’s case) or other analysis to guess what the higher-resolution image would have looked like from that low-resolution frame. Usually it gets it about right.
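
As a rough illustration of the render-low-then-upscale idea, here is a minimal Python sketch (my own example, not how DLSS or FSR actually work internally). It takes a frame rendered below the Deck’s native 1280x800 and resamples it back up with Pillow’s Lanczos filter, a plain spatial filter standing in for the much smarter reconstruction the real algorithms do; the file names and resolutions are just assumptions.

# Minimal sketch: "render low, upscale to native", with Pillow's Lanczos
# filter as a crude stand-in for DLSS/FSR. File names and sizes are assumed.
from PIL import Image

def upscale_frame(low_res_path, target_size=(1280, 800)):
    """Load a frame rendered at a lower resolution and guess the native-res image."""
    frame = Image.open(low_res_path)            # e.g. a 960x600 render
    # A plain resampling filter; DLSS/FSR instead use motion data and/or a
    # trained model to reconstruct detail a simple filter cannot recover.
    return frame.resize(target_size, Image.LANCZOS)

if __name__ == "__main__":
    upscaled = upscale_frame("frame_960x600.png")    # hypothetical input frame
    upscaled.save("frame_1280x800_upscaled.png")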

It’s like watching an old movie from VHS that has been photoshopped to make it look HD. But this happens in real time, on the chip, without the game’s software having to do the work itself.

It’ll make things “faster” without much degradation in image quality, but if you were to look closely (which you wouldn’t on the Deck), you may well notice some artefacts in the image.


