They asked the developers of the various codecs what settings they should use, and the respective developers made the decisions.
![quicksync kaby lake](https://pics.computerbase.de/8/1/0/4/4/4-1080.1008610039.png)
![quicksync kaby lake](https://images.anandtech.com/doci/10959/1-11.png)
More importantly, if you look at the command lines used for the "ripping" test: for x264 they used `x264 -preset placebo -me umh -merange 32 -keyint infinite -tune ssim -pass 2`, and for x265 they used `-p veryslow -tune ssim`, and Intel's MSS GPU encoder still beat them on the SSIM metric! Did you ever think you would see the day when x264 + placebo + tune ssim would lose a test where SSIM is the metric? And I've never heard of the other encoder, "Kingsoft"; anybody know anything about them?

The thing is that subjective tests were also done, where they had various folks look at the encodes and pick which one they thought looked the best. Even so, that's great news for Intel.

A higher SSIM or PSNR score does not necessarily correlate with what a human being "sees" as higher quality or as more closely resembling the source. If it's not clear, I can post some examples to demonstrate. This is not an excuse; this is the main complaint whenever x264 (or any codec) supposedly "wins" a test: those tests only show a limited subset, and the encodes are "tweaked" to score higher "points" yet usually show lower visual quality.

I agree with the faster, lower-power-consumption part, and that's a move in the right direction, but not necessarily with the "quality" part based solely on SSIM testing. I'd love to see some accompanying subjective assessments. SSIM "fails" pretty badly on certain types of content, just like PSNR.
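For anyone wanting to reproduce a run like that: `-pass 2` implies a first pass was done beforehand to gather stats. A hypothetical reconstruction of the full two-pass invocation is below (the bitrate and file names are placeholders I've made up, not from the review; x264's CLI accepts the quoted flags in their double-dash forms), followed by FFmpeg's `ssim` filter to score the result against the source:

```shell
# Pass 1: analysis only, stats written to x264's default stats file
x264 --preset placebo --me umh --merange 32 --keyint infinite --tune ssim \
     --bitrate 4000 --pass 1 -o /dev/null source.y4m

# Pass 2: actual encode using the first-pass stats
x264 --preset placebo --me umh --merange 32 --keyint infinite --tune ssim \
     --bitrate 4000 --pass 2 -o encode.264 source.y4m

# Report SSIM of the encode against the source (printed to the log)
ffmpeg -i encode.264 -i source.y4m -lavfi ssim -f null -
```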
![quicksync kaby lake quicksync kaby lake](https://www.nag.co.za/wp-content/uploads/2017/09/acer-swift-7-trackmania-720p.png)
Unfortunately SSIM is only a moderately accurate predictor of "quality".
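To make concrete what the metric actually measures: SSIM compares two images on mean luminance, contrast (variance), and structure (covariance). Here's a minimal single-window sketch in Python/NumPy of the textbook formula (the function name and constants `C1`/`C2` follow the usual convention; real tools compute SSIM over small local windows and average the map, which is exactly why a good average score can still hide localized artifacts a viewer notices):

```python
import numpy as np

def ssim_global(x, y, L=255.0):
    """Single-window SSIM between two grayscale images in [0, L].

    Combines three comparisons: luminance (means), contrast
    (variances), and structure (covariance). Identical images
    score exactly 1.0; distortion pushes the score below 1.
    """
    C1 = (0.01 * L) ** 2  # stabilizers for near-zero denominators
    C2 = (0.03 * L) ** 2
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + C1) * (2 * cov + C2)) / \
           ((mx ** 2 + my ** 2 + C1) * (vx + vy + C2))
```

Note this treats the whole frame as one window, so it is even more forgiving of local blocking or blurring than the windowed SSIM the benchmarks report; either way, a scalar summary can't distinguish "slightly soft everywhere" from "ugly in one spot".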