The article doesn’t explain why the same game produces different results across various devices. In addition, the screenshots are published in JPEG format, a lossy format that can itself introduce artifacts.
Here are some possible explanations based on the images that are publicly available.
Every game uses textures which are digital representations of the surface of an object. Textures have 2D qualities (color and brightness) but also 3D properties (e.g. how transparent and reflective the object is). Once a texture has been wrapped around any three-dimensional object, you will have the result displayed below.
In mobile graphics, textures can occupy a lot of memory. To save storage space, memory footprint and bandwidth, companies have created lossy texture compression formats – think of them as JPEG versions of RAW images.
Mobile GPUs today support different texture compression formats (PVRTC, ASTC, ETC, ATITC etc.); for example, OpenGL ES 3.0 mandates ETC while ASTC is optional.
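To get a feel for why compressed formats matter, here is a rough back-of-the-envelope sketch comparing the memory footprint of a single 1024×1024 texture across formats. The bits-per-pixel figures are the nominal rates for each format; the exact savings in a real game depend on mipmaps and the specific block size chosen.

```python
# Rough memory-footprint comparison for a single 1024x1024 texture.
# Bits-per-pixel values are the nominal rates for each format.
WIDTH = HEIGHT = 1024

formats_bpp = {
    "RGBA8888 (uncompressed)": 32,
    "ETC1 / ETC2 RGB": 4,   # fixed 4 bits per pixel
    "PVRTC 4bpp": 4,
    "PVRTC 2bpp": 2,
    "ASTC 8x8": 2,          # 128-bit block covering 64 pixels
}

for name, bpp in formats_bpp.items():
    size_kib = WIDTH * HEIGHT * bpp / 8 / 1024
    print(f"{name:26s} {size_kib:6.0f} KiB")
```

An uncompressed 1024×1024 RGBA8888 texture weighs in at 4 MiB, while the same texture at 2 bpp fits in 256 KiB, a 16× saving in storage, memory, and bandwidth.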
Using a sub-optimal texture compression format significantly influences the quality of images in games. To give you an idea of how texture compression formats rank in terms of quality, I’ve compared PVRTC and PVRTC2 with other commonly-used texture compression formats here and here.
The images above show how certain texture compression formats introduce noise; this is particularly visible for the ETC1 and BC3 standards.
Some games natively render at a lower resolution and then upscale to the screen’s native resolution; this can also introduce rendering artifacts.
In the three examples presented below, the devices render at different resolutions; both iPhones render at a higher resolution and then scale the image down, thereby producing sharper images.
The difference in quality is noticeable in the zoomed area around the bike.
A developer can request memory buffers from the GPU to store pixel information. GPUs support buffers at different precisions (e.g. RGB565 at 16 bits per pixel, RGBA8888 at 32).
The level of precision developers use also influences image quality. RGB565 is considered good enough for certain kinds of rendering, but for the highest quality, developers should consider choosing something with more precision, such as RGBA8888.
You can definitely notice the dithering effects that occur when using RGB565 in the area around the building wall (to the right of the barrel).
Dithering can be noticed in the first image on the wall behind the barrel (click on the images to enlarge)*
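To see how much precision RGB565 discards, here is a minimal sketch (not from the article) that packs an 8-bit-per-channel color into RGB565 and expands it back. Nearby shades in a subtle gradient collapse to the same 16-bit value, which is exactly what produces visible banding and forces the renderer to dither:

```python
def rgb888_to_rgb565(r, g, b):
    """Pack 8-bit channels into a 16-bit RGB565 value (5-6-5 bits)."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def rgb565_to_rgb888(p):
    """Expand RGB565 back to 8 bits per channel (with bit replication)."""
    r = (p >> 11) & 0x1F
    g = (p >> 5) & 0x3F
    b = p & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

# Two slightly different shades become identical after a round trip:
print(rgb565_to_rgb888(rgb888_to_rgb565(100, 150, 200)))  # (99, 150, 206)
print(rgb565_to_rgb888(rgb888_to_rgb565(103, 151, 203)))  # (99, 150, 206)
```

With only 32 levels for red and blue and 64 for green, smooth gradients quantize into visible steps unless the renderer dithers them.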
The precision of a GPU’s ALUs also has an impact on image quality, along with the precision you store the final results in. The biggest distinction between today’s desktop and mobile GPUs is that on mobile you’re much more likely to find very low-power (but also lower precision) ALUs.
Developers can influence which ALUs the GPU uses when running their shaders, which makes ALU precision another possible factor in image quality.
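The limitation is easy to demonstrate on the CPU. The sketch below uses NumPy’s `float16` type to emulate an FP16 ALU: with only an 11-bit significand, FP16 cannot represent integers above 2048 exactly, so values a shader would use to address texels in a large texture start collapsing together.

```python
import numpy as np

# FP32 can distinguish adjacent texel coordinates in a large texture;
# FP16 (11-bit significand) cannot once values exceed 2048.
coord32 = np.float32(2049.0)
coord16 = np.float16(2049.0)

print(coord32)  # 2049.0
print(coord16)  # 2048.0 -- adjacent texels become indistinguishable
```

This is the kind of error that shows up on screen as blocky sampling or banding when a shader written for FP32 hardware runs on FP16-only ALUs.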
YOUi Labs offers an Android app that tests the floating-point accuracy of the GPU inside your device; here are the noticeably different results obtained on two competing GPUs:
Supporting only FP16 ALU precision introduces noticeable artifacts (click on the image to enlarge)
Rendering results can also be influenced by various techniques or effects that developers choose to implement.
Looking at the screenshots above, enabling MSAA (multi-sample anti-aliasing) significantly improves quality, both around the license plate and in the region of the cables.
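The intuition behind multi-sampling can be shown with a toy model (mine, not from the article): consider a pixel cut in half by a diagonal edge. With one sample per pixel the result is all-or-nothing, producing jagged stairsteps; with four samples the pixel gets an intermediate shade proportional to its coverage. Real MSAA hardware uses rotated sample patterns and shares shading work across samples, but the coverage idea is the same.

```python
# Toy model of anti-aliasing: a pixel partially covered by a diagonal
# edge. The covered region is where x + y < 1 inside the unit pixel.
def coverage(samples):
    """Fraction of sample positions falling on the covered side."""
    return sum(1 for x, y in samples if x + y < 1) / len(samples)

one_sample = [(0.5, 0.5)]                      # no AA: all-or-nothing
four_samples = [(0.25, 0.25), (0.75, 0.25),
                (0.25, 0.75), (0.75, 0.75)]    # 4x grid pattern

print(coverage(one_sample))    # 0.0  -> hard, jagged edge
print(coverage(four_samples))  # 0.25 -> intermediate shade, smoother edge
```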
Given that mobile devices are starting to push 2K or 4K resolutions, image quality will become more and more important. However, increasing precision or moving to higher-quality, less aggressive texture compression can have detrimental effects on power consumption.
This is why a well-balanced GPU architecture can make a huge difference when it comes to image quality in computer-generated graphics.
* Images courtesy of GameBench, all rights reserved.