From 3f1bc25d4de6150b0acff7e92d3e3084a7d989f0 Mon Sep 17 00:00:00 2001
From: Niklas Haas
Date: Fri, 4 Jan 2019 16:46:38 +0100
Subject: vo_gpu: use dB units for scene change detection

Rather than the linear cd/m^2 units, these (relative) logarithmic units
lend themselves much better to actually detecting scene changes,
especially since the scene averaging was changed to also work
logarithmically.
---
 DOCS/man/options.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst
index e5a897ba4f..2e15106bf0 100644
--- a/DOCS/man/options.rst
+++ b/DOCS/man/options.rst
@@ -5264,9 +5264,9 @@ The following video options are currently all specific to ``--vo=gpu`` and
     to excessive flicker. (In signal theory terms, this controls the time
     constant "tau" of an IIR low pass filter)
 
-``--hdr-scene-threshold-low=<0..10000>``, ``--hdr-scene-threshold-high=<0..10000>``
-    The lower and upper thresholds (in cd/m^2) for a brightness difference to
-    be considered a scene change (default: 50 low, 200 high). This is only
+``--hdr-scene-threshold-low=<0.0..100.0>``, ``--hdr-scene-threshold-high=<0.0..100.0>``
+    The lower and upper thresholds (in dB) for a brightness difference
+    to be considered a scene change (default: 5.5 low, 10.0 high). This is only
     relevant when ``--hdr-compute-peak`` is enabled. Normally, small
     fluctuations in the frame brightness are compensated for by the peak
     averaging mechanism, but for large jumps in the brightness this can result
-- 
cgit v1.2.3
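
Note: the patch itself only updates the documentation for the new dB units. As a
rough illustration of the comparison the new thresholds imply, here is a minimal
C sketch. The helper scene_change_weight is hypothetical and not mpv's actual
implementation (which runs inside the vo_gpu peak-detection compute shader); it
assumes luminance is treated as a power-like quantity, so a ratio is expressed
in dB as 10 * log10(cur / avg), consistent with the logarithmic scene averaging
the commit message refers to.

    #include <math.h>

    /* Hypothetical sketch, not mpv's actual code: map the brightness
     * difference between the current frame and the running scene average
     * to a scene-change weight, using the dB thresholds documented above.
     *
     * Both inputs are linear luminance in cd/m^2; the difference is taken
     * logarithmically, so equal *ratios* of brightness count equally. */
    static double scene_change_weight(double cur_luma, double avg_luma,
                                      double thresh_low_db,   /* e.g. 5.5  */
                                      double thresh_high_db)  /* e.g. 10.0 */
    {
        double diff_db = fabs(10.0 * log10(cur_luma / avg_luma));

        /* Below the low threshold: no scene change (weight 0).
         * Above the high threshold: definite scene change (weight 1).
         * In between: blend smoothly, like GLSL smoothstep(). */
        if (diff_db <= thresh_low_db)
            return 0.0;
        if (diff_db >= thresh_high_db)
            return 1.0;
        double t = (diff_db - thresh_low_db) /
                   (thresh_high_db - thresh_low_db);
        return t * t * (3.0 - 2.0 * t); /* smoothstep interpolation */
    }

For example, a jump from 100 cd/m^2 to 400 cd/m^2 is 10 * log10(4) = 6.0 dB,
which exceeds the default 5.5 dB low threshold but not the 10.0 dB high one, so
under this sketch it would be treated as a partial scene change rather than a
hard cut.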