author     Niklas Haas <git@haasn.dev>              2023-09-20 18:27:23 +0200
committer  Niklas Haas <github-daiK1o@haasn.dev>    2023-09-20 20:26:07 +0200
commit     e7bd330ed073afe973dc9bdae669ac4a4a0017f2 (patch)
tree       69f0fdfa73c516f003bcea4645a6e9164f530ec4
parent     fe868988a18be500a440150a0d4d871687a095fc (diff)
download   mpv-e7bd330ed073afe973dc9bdae669ac4a4a0017f2.tar.bz2
           mpv-e7bd330ed073afe973dc9bdae669ac4a4a0017f2.tar.xz
vo_gpu: match libplacebo peak detection defaults
This probably makes `vo_gpu` tone mapping worse, or something, but who cares.
The status quo for a while now has been to use `vo_gpu_next` if you care about
HDR rendering at all.

See-Also: haasn/libplacebo@ec60dd156b82753a2e2d8a399899244605f4d1bf
See-Also: haasn/libplacebo@0903cbd05d7fc0391cbd99954924a39b855c8a1b
-rw-r--r--  DOCS/interface-changes.rst    2
-rw-r--r--  DOCS/man/options.rst          4
-rw-r--r--  video/out/gpu/video.c         6
3 files changed, 7 insertions, 5 deletions
diff --git a/DOCS/interface-changes.rst b/DOCS/interface-changes.rst
index bc00243d96..36d19b1407 100644
--- a/DOCS/interface-changes.rst
+++ b/DOCS/interface-changes.rst
@@ -69,6 +69,8 @@ Interface changes
- change `--dither-depth` to `auto`
- deprecate `--profile=gpu-hq`, add `--profile=<fast|high-quality>`
- change `--dscale` default to `hermite`
+ - update defaults to `--hdr-peak-decay-rate=20`, `--hdr-scene-threshold-low=1.0`,
+ `--hdr-scene-threshold-high=3.0`
--- mpv 0.36.0 ---
- add `--target-contrast`
- Target luminance value is now also applied when ICC profile is used.
diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst
index d3effe79d2..a78a594faa 100644
--- a/DOCS/man/options.rst
+++ b/DOCS/man/options.rst
@@ -6735,7 +6735,7 @@ them.
come with a small performance penalty. (Only for ``--vo=gpu-next``)

``--hdr-peak-decay-rate=<0.0..1000.0>``
- The decay rate used for the HDR peak detection algorithm (default: 100.0).
+ The decay rate used for the HDR peak detection algorithm (default: 20.0).
This is only relevant when ``--hdr-compute-peak`` is enabled. Higher values
make the peak decay more slowly, leading to more stable values at the cost
of more "eye adaptation"-like effects (although this is mitigated somewhat
@@ -6747,7 +6747,7 @@ them.

``--hdr-scene-threshold-low=<0.0..100.0>``, ``--hdr-scene-threshold-high=<0.0..100.0>``
The lower and upper thresholds (in dB) for a brightness difference
- to be considered a scene change (default: 5.5 low, 10.0 high). This is only
+ to be considered a scene change (default: 1.0 low, 3.0 high). This is only
relevant when ``--hdr-compute-peak`` is enabled. Normally, small
fluctuations in the frame brightness are compensated for by the peak
averaging mechanism, but for large jumps in the brightness this can result
diff --git a/video/out/gpu/video.c b/video/out/gpu/video.c
index 3c8c6bae39..7dfea116b8 100644
--- a/video/out/gpu/video.c
+++ b/video/out/gpu/video.c
@@ -323,9 +323,9 @@ static const struct gl_video_opts gl_video_opts_def = {
.curve = TONE_MAPPING_AUTO,
.curve_param = NAN,
.max_boost = 1.0,
- .decay_rate = 100.0,
- .scene_threshold_low = 5.5,
- .scene_threshold_high = 10.0,
+ .decay_rate = 20.0,
+ .scene_threshold_low = 1.0,
+ .scene_threshold_high = 3.0,
.contrast_smoothness = 3.5,
},
.early_flush = -1,
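
The option documentation changed above describes the mechanism these new
defaults tune: a running average of the detected frame peak that decays over
roughly ``--hdr-peak-decay-rate`` frames, plus a scene-change test that
compares the brightness jump (in dB) against the low/high thresholds. The
following is a minimal illustrative sketch of that mechanism only; the names
(update_peak, struct peak_state), the exponential-moving-average formula, and
the linear blend between the two thresholds are assumptions made for this
example and are not taken from mpv's or libplacebo's actual implementation.

/*
 * Sketch only: a generic frame-peak smoother driven by the three options
 * touched by this commit. Names and formulas are hypothetical; they merely
 * mirror the behaviour the options documentation describes.
 */
#include <math.h>
#include <stdio.h>

struct peak_state {
    double avg_peak;    /* smoothed signal peak, in nits */
    int initialized;
};

/* Defaults as set by this commit (see gl_video_opts_def above). */
static const double decay_rate = 20.0;          /* frames of averaging */
static const double scene_threshold_low = 1.0;  /* dB */
static const double scene_threshold_high = 3.0; /* dB */

static void update_peak(struct peak_state *st, double frame_peak)
{
    if (!st->initialized || frame_peak <= 0 || st->avg_peak <= 0) {
        st->avg_peak = frame_peak;
        st->initialized = 1;
        return;
    }

    /* Brightness difference between this frame and the running average,
     * expressed in dB as in the --hdr-scene-threshold-* documentation. */
    double delta_db = 10.0 * fabs(log10(frame_peak / st->avg_peak));

    /* 0 = keep averaging normally, 1 = treat as a hard scene change. */
    double scene_weight = 0.0;
    if (delta_db >= scene_threshold_high) {
        scene_weight = 1.0;
    } else if (delta_db > scene_threshold_low) {
        scene_weight = (delta_db - scene_threshold_low) /
                       (scene_threshold_high - scene_threshold_low);
    }

    /* Exponential moving average: a higher decay rate makes the average
     * follow new frames more slowly; a detected scene change bypasses the
     * smoothing partially or completely. */
    double alpha = 1.0 / decay_rate;
    alpha += scene_weight * (1.0 - alpha);
    st->avg_peak += alpha * (frame_peak - st->avg_peak);
}

int main(void)
{
    struct peak_state st = {0};
    double frames[] = {200, 210, 205, 1000, 980};  /* per-frame peaks, nits */
    for (int i = 0; i < 5; i++) {
        update_peak(&st, frames[i]);
        printf("frame %d: peak %.1f -> avg %.1f\n", i, frames[i], st.avg_peak);
    }
    return 0;
}

With the new defaults (decay rate 20 instead of 100, thresholds 1.0/3.0 dB
instead of 5.5/10.0 dB), such a smoother adapts to brightness changes
considerably faster and declares scene changes much more readily, matching the
libplacebo commits referenced in the See-Also trailers.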