From ef6bc8504a945eb6492b8ed46fd5a1afaaf32182 Mon Sep 17 00:00:00 2001
From: Niklas Haas
Date: Sun, 17 May 2020 03:17:28 +0200
Subject: vo_gpu: reinterpret SDR white levels based on ITU-R BT.2408

This standard says we should use a value of 203 nits instead of 100 for
mapping between SDR and HDR.

Code copied from
https://code.videolan.org/videolan/libplacebo/-/commit/9d9164773

In particular, that commit also includes a test case to make sure the
implementation doesn't break roundtrips.

Relevant to #4248 and #7357.
---
 DOCS/man/options.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

(limited to 'DOCS/man/options.rst')

diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst
index 769db4d837..4850439a8a 100644
--- a/DOCS/man/options.rst
+++ b/DOCS/man/options.rst
@@ -6005,12 +6005,12 @@ The following video options are currently all specific to ``--vo=gpu`` and
     additional effect of parametrizing the inverse OOTF, in order to get
     colorimetrically consistent results with the mastering display. For SDR,
     or when using an ICC profile (``--icc-profile``), setting this to a value
-    above 100 essentially causes the display to be treated as if it were an HDR
+    above 203 essentially causes the display to be treated as if it were an HDR
     display in disguise. (See the note below)
 
     In ``auto`` mode (the default), the chosen peak is an appropriate value
-    based on the TRC in use. For SDR curves, it uses 100. For HDR curves, it
-    uses 100 * the transfer function's nominal peak.
+    based on the TRC in use. For SDR curves, it uses 203. For HDR curves, it
+    uses 203 * the transfer function's nominal peak.
 
     .. note::
-- 
cgit v1.2.3
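
For context on the numbers in the doc change above: BT.2408 places SDR reference
("graphics") white at 203 cd/m^2 rather than 100 cd/m^2, and HDR transfer
functions then express their nominal peak as a multiple of that reference white.
The C sketch below only illustrates that arithmetic; it is not code from mpv or
libplacebo. The enum, the function names, and the per-curve peak values it
assumes (10000 nits for PQ, 12x reference white for HLG) are illustrative
stand-ins, not the project's actual tables.

#include <stdio.h>

/* ITU-R BT.2408 reference ("graphics") white, in cd/m^2. */
#define REF_WHITE_NITS 203.0

/* Hypothetical stand-in for the real transfer-curve identifiers. */
enum trc { TRC_SDR, TRC_PQ, TRC_HLG };

/* Nominal peak of a curve as a multiple of reference white.
 * Assumed values: PQ is defined up to 10000 cd/m^2, HLG's
 * scene-linear signal tops out at 12x reference white, and
 * SDR peaks at reference white itself. */
static double nominal_peak(enum trc trc)
{
    switch (trc) {
    case TRC_PQ:  return 10000.0 / REF_WHITE_NITS;
    case TRC_HLG: return 12.0;
    default:      return 1.0;
    }
}

/* What the documented "auto" behaviour boils down to after this
 * change: 203 for SDR curves, 203 * nominal peak for HDR curves. */
static double auto_target_peak(enum trc trc)
{
    return REF_WHITE_NITS * nominal_peak(trc);
}

int main(void)
{
    printf("SDR auto peak: %.0f nits\n", auto_target_peak(TRC_SDR)); /* 203 */
    printf("PQ  auto peak: %.0f nits\n", auto_target_peak(TRC_PQ));  /* 10000 */
    printf("HLG auto peak: %.0f nits\n", auto_target_peak(TRC_HLG)); /* 2436 */
    return 0;
}

Built with any C compiler, this prints 203, 10000, and 2436 nits for the three
cases, which is the same scaling the updated documentation describes.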