From d6d6da4711ca1ad20e14386c4b29a955eb32322d Mon Sep 17 00:00:00 2001 From: Oliver Freyermuth Date: Mon, 10 Dec 2018 22:09:54 +0100 Subject: stream_dvb: Correct range for dvbin-card option. Adapt documentation accordingly and also, fix an off-by-one check in the code. closes #6371 --- DOCS/man/options.rst | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) (limited to 'DOCS') diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst index 2aaad5febc..71c417990c 100644 --- a/DOCS/man/options.rst +++ b/DOCS/man/options.rst @@ -4057,8 +4057,8 @@ Network DVB --- -``--dvbin-card=<1-4>`` - Specifies using card number 1-4 (default: 1). +``--dvbin-card=<0-15>`` + Specifies using card number 0-15 (default: 0). ``--dvbin-file=`` Instructs mpv to read the channels list from ````. The default is -- cgit v1.2.3 From c681fc133c6b9ae3d8a5f462927950516624c11d Mon Sep 17 00:00:00 2001 From: Benjamin Barenblat Date: Tue, 18 Dec 2018 15:28:53 -0500 Subject: DOCS/man: update man pages to describe ReplayGain fallback Describe ReplayGain album-to-track fallback behavior introduced in commits e392d6610d1e35cc0190c794c151211b0aae83e6 and be90f2c8dd0431e252e43d5249e89446309113af. --- DOCS/man/options.rst | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) (limited to 'DOCS') diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst index 71c417990c..a675d9259d 100644 --- a/DOCS/man/options.rst +++ b/DOCS/man/options.rst @@ -1326,8 +1326,10 @@ Audio Since mpv 0.18.1, this always controls the internal mixer (aka "softvol"). ``--replaygain=`` - Adjust volume gain according to the track-gain or album-gain replaygain - value stored in the file metadata (default: no replaygain). + Adjust volume gain according to replaygain values stored in the file + metadata. With ``--replaygain=no`` (the default), perform no adjustment. + With ``--replaygain=track``, apply track gain. With ``--replaygain=album``, + apply album gain if present and fall back to track gain otherwise. ``--replaygain-preamp=`` Pre-amplification gain in dB to apply to the selected replaygain gain -- cgit v1.2.3 From 94d35627f55c7ee7601c476b4b79e1f3c2eca83b Mon Sep 17 00:00:00 2001 From: Kotori Itsuka Date: Fri, 18 Jan 2019 11:24:38 +1000 Subject: DOCS/options.rst: update target-peak description List auto as an option for target-peak, and state that auto is its default operation. --- DOCS/man/options.rst | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) (limited to 'DOCS') diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst index a675d9259d..6108e07c25 100644 --- a/DOCS/man/options.rst +++ b/DOCS/man/options.rst @@ -5144,7 +5144,7 @@ The following video options are currently all specific to ``--vo=gpu`` and The user should independently guarantee this before using these signal formats for display. -``--target-peak=`` +``--target-peak=`` Specifies the measured peak brightness of the output display, in cd/m^2 (AKA nits). The interpretation of this brightness depends on the configured ``--target-trc``. In all cases, it imposes a limit on the signal values @@ -5156,9 +5156,9 @@ The following video options are currently all specific to ``--vo=gpu`` and above 100 essentially causes the display to be treated as if it were an HDR display in disguise. (See the note below) - By default, the chosen peak defaults to an appropriate value based on the - TRC in use. For SDR curves, it defaults to 100. For HDR curves, it - defaults to 100 * the transfer function's nominal peak. 
+ In ``auto`` mode (the default), the chosen peak is an appropriate value + based on the TRC in use. For SDR curves, it uses 100. For HDR curves, it + uses 100 * the transfer function's nominal peak. .. note:: -- cgit v1.2.3 From 6ce570359aa06469d3ead822227058ec87c86b30 Mon Sep 17 00:00:00 2001 From: Akemi Date: Wed, 26 Sep 2018 15:33:34 +0200 Subject: cocoa-cb: add support for VOCTRL_GET_DISPLAY_NAMES --- DOCS/man/input.rst | 4 +++- 1 file changed, 3 insertions(+), 1 deletion(-) (limited to 'DOCS') diff --git a/DOCS/man/input.rst b/DOCS/man/input.rst index 2fcf6857e7..f55eea1409 100644 --- a/DOCS/man/input.rst +++ b/DOCS/man/input.rst @@ -1630,7 +1630,9 @@ Property list are the xrandr names (LVDS1, HDMI1, DP1, VGA1, etc.). On Windows, these are the GDI names (\\.\DISPLAY1, \\.\DISPLAY2, etc.) and the first display in the list will be the one that Windows considers associated with the - window (as determined by the MonitorFromWindow API.) + window (as determined by the MonitorFromWindow API.) On macOS these are the + Display Product Names as used in the System Information and only one display + name is returned since a window can only be on one screen. ``display-fps`` (RW) The refresh rate of the current display. Currently, this is the lowest FPS -- cgit v1.2.3 From 3dd59dbed06a55eed00ad68d0a953f39188e3647 Mon Sep 17 00:00:00 2001 From: Martin Herkt Date: Wed, 13 Feb 2019 02:43:57 +0100 Subject: options: do not enable MPEG2 hwdec by default Too many broken hardware decoders. Noticed wrong decoding of a video file encoded with x262 on RX Vega when using VAAPI (Mesa 18.3.2). Looks fine with swdec and a cheap hardware BD player. Reverts 017f3d0674e48a587b9e6cd7a48f15519c799c3e --- DOCS/man/options.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) (limited to 'DOCS') diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst index 6108e07c25..c6b34f3171 100644 --- a/DOCS/man/options.rst +++ b/DOCS/man/options.rst @@ -1092,7 +1092,7 @@ Video You can get the list of allowed codecs with ``mpv --vd=help``. Remove the prefix, e.g. instead of ``lavc:h264`` use ``h264``. - By default, this is set to ``h264,vc1,wmv3,hevc,mpeg2video,vp9``. Note that + By default, this is set to ``h264,vc1,wmv3,hevc,vp9``. Note that the hardware acceleration special codecs like ``h264_vdpau`` are not relevant anymore, and in fact have been removed from Libav in this form. -- cgit v1.2.3 From 3fe882d4ae80fa060a71dad0d6d1605afcfe98b6 Mon Sep 17 00:00:00 2001 From: Niklas Haas Date: Thu, 27 Dec 2018 18:34:19 +0100 Subject: vo_gpu: improve tone mapping desaturation Instead of desaturating towards luma, we desaturate towards the per-channel tone mapped version. This essentially proves a smooth roll-off towards the "hollywood"-style (non-chromatic) tone mapping algorithm, which works better for bright content, while continuing to use the "linear" style (chromatic) tone mapping algorithm for primarily in-gamut content. We also split up the desaturation algorithm into strength and exponent, which allows users to use less aggressive desaturation settings without affecting the overall curve. 
--- DOCS/interface-changes.rst | 4 ++++ DOCS/man/options.rst | 31 ++++++++++++++++++++----------- 2 files changed, 24 insertions(+), 11 deletions(-) (limited to 'DOCS') diff --git a/DOCS/interface-changes.rst b/DOCS/interface-changes.rst index cbc9af18f8..7e723b9dbe 100644 --- a/DOCS/interface-changes.rst +++ b/DOCS/interface-changes.rst @@ -47,6 +47,10 @@ Interface changes - support for `--spirv-compiler=nvidia` has been removed, leaving `shaderc` as the only option. The `--spirv-compiler` option itself has been marked as deprecated, and may be removed in the future. + - split up `--tone-mapping-desaturate`` into strength + exponent, instead of + only using a single value (which previously just controlled the exponent). + The strength now linearly blends between the linear and nonlinear tone + mapped versions of a color. --- mpv 0.29.0 --- - drop --opensles-sample-rate, as --audio-samplerate should be used if desired - drop deprecated --videotoolbox-format, --ff-aid, --ff-vid, --ff-sid, diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst index c6b34f3171..1c08917d7a 100644 --- a/DOCS/man/options.rst +++ b/DOCS/man/options.rst @@ -5245,17 +5245,26 @@ The following video options are currently all specific to ``--vo=gpu`` and The special value ``auto`` (default) will enable HDR peak computation automatically if compute shaders and SSBOs are supported. -``--tone-mapping-desaturate=`` - Apply desaturation for highlights. The parameter essentially controls the - steepness of the desaturation curve. The higher the parameter, the more - aggressively colors will be desaturated. This setting helps prevent - unnaturally blown-out colors for super-highlights, by (smoothly) turning - into white instead. This makes images feel more natural, at the cost of - reducing information about out-of-range colors. - - The default of 0.5 provides a good balance. This value is weaker than the - ACES ODT curves' recommendation, but works better for most content in - practice. A setting of 0.0 disables this option. +``--tone-mapping-desaturate=<0.0..1.0>`` + Apply desaturation for highlights (default: 0.75). The parameter controls + the strength of the desaturation curve. A value of 0.0 completely disables + it, while a value of 1.0 means that overly bright colors will tend towards + white. (This is not always the case, especially not for highlights that are + near primary colors) + + Values in between apply progressively more/less aggressive desaturation. + This setting helps prevent unnaturally oversaturated colors for + super-highlights, by (smoothly) turning them into less saturated (per + channel tone mapped) colors instead. This makes images feel more natural, + at the cost of chromatic distortions for out-of-range colors. The default + value of 0.75 provides a good balance. Setting this to 0.0 preserves the + chromatic accuracy of the tone mapping process. + +``--tone-mapping-desaturate-exponent=<0.0..20.0>`` + This setting controls the exponent of the desaturation curve, which + controls how bright a color needs to be in order to start being + desaturated. The default of 1.5 provides a reasonable balance. Decreasing + this exponent makes the curve more aggressive. 
``--gamut-warning`` If enabled, mpv will mark all clipped/out-of-gamut pixels that exceed a -- cgit v1.2.3 From 6179dcbb798aa9e3501af82ae46975e881d80626 Mon Sep 17 00:00:00 2001 From: Niklas Haas Date: Tue, 1 Jan 2019 07:30:00 +0100 Subject: vo_gpu: redesign peak detection algorithm The previous approach of using an FIR with tunable hard threshold for scene changes had several problems: - the FIR involved annoying hard-coded buffer sizes, high VRAM usage, and the FIR sum was prone to numerical overflow which limited the number of frames we could average over. We also totally redesign the scene change detection. - the hard scene change detection was prone to both false positives and false negatives, each with their own (annoying) issues. Scrap this entirely and switch to a dual approach of using a simple single-pole IIR low pass filter to smooth out noise, while using a softer scene change curve (with tunable low and high thresholds), based on `smoothstep`. The IIR filter is extremely simple in its implementation and has an arbitrarily user-tunable cutoff frequency, while the smoothstep-based scene change curve provides a good, tunable tradeoff between adaptation speed and stability - without exhibiting either of the traditional issues associated with the hard cutoff. Another way to think about the new options is that the "low threshold" provides a margin of error within which we don't care about small fluctuations in the scene (which will therefore be smoothed out by the IIR filter). --- DOCS/interface-changes.rst | 1 + DOCS/man/options.rst | 24 ++++++++++++++++++++++++ 2 files changed, 25 insertions(+) (limited to 'DOCS') diff --git a/DOCS/interface-changes.rst b/DOCS/interface-changes.rst index 7e723b9dbe..ce7e33176a 100644 --- a/DOCS/interface-changes.rst +++ b/DOCS/interface-changes.rst @@ -51,6 +51,7 @@ Interface changes only using a single value (which previously just controlled the exponent). The strength now linearly blends between the linear and nonlinear tone mapped versions of a color. + - add --hdr-peak-decay-rate and --hdr-scene-threshold-low/high --- mpv 0.29.0 --- - drop --opensles-sample-rate, as --audio-samplerate should be used if desired - drop deprecated --videotoolbox-format, --ff-aid, --ff-vid, --ff-sid, diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst index 1c08917d7a..0f7007bf89 100644 --- a/DOCS/man/options.rst +++ b/DOCS/man/options.rst @@ -5245,6 +5245,30 @@ The following video options are currently all specific to ``--vo=gpu`` and The special value ``auto`` (default) will enable HDR peak computation automatically if compute shaders and SSBOs are supported. +``--hdr-peak-decay-rate=<1.0..1000.0>`` + The decay rate used for the HDR peak detection algorithm (default: 100.0). + This is only relevant when ``--hdr-compute-peak`` is enabled. Higher values + make the peak decay more slowly, leading to more stable values at the cost + of more "eye adaptation"-like effects (although this is mitigated somewhat + by ``--hdr-scene-threshold``). A value of 1.0 (the lowest possible) disables + all averaging, meaning each frame's value is used directly as measured, + but doing this is not recommended for "noisy" sources since it may lead + to excessive flicker. (In signal theory terms, this controls the time + constant "tau" of an IIR low pass filter) + +``--hdr-scene-threshold-low=<0..10000>``, ``--hdr-scene-threshold-high=<0..10000>`` + The lower and upper thresholds (in cd/m^2) for a brightness difference to + be considered a scene change (default: 50 low, 200 high). 
This is only + relevant when ``--hdr-compute-peak`` is enabled. Normally, small + fluctuations in the frame brightness are compensated for by the peak + averaging mechanism, but for large jumps in the brightness this can result + in the frame remaining too bright or too dark for up to several seconds, + depending on the value of ``--hdr-peak-decay-rate``. To counteract this, + when the brightness between the running average and the current frame + exceeds the low threshold, mpv will make the averaging filter more + aggressive, up to the limit of the high threshold (at which point the + filter becomes instant). + ``--tone-mapping-desaturate=<0.0..1.0>`` Apply desaturation for highlights (default: 0.75). The parameter controls the strength of the desaturation curve. A value of 0.0 completely disables -- cgit v1.2.3 From 12e58ff8a65c537a222a3fb954f88d98a3a5bfd2 Mon Sep 17 00:00:00 2001 From: Niklas Haas Date: Wed, 2 Jan 2019 03:03:38 +0100 Subject: vo_gpu: allow boosting dark scenes when tone mapping In theory our "eye adaptation" algorithm works in both ways, both darkening bright scenes and brightening dark scenes. But I've always just prevented the latter with a hard clamp, since I wanted to avoid blowing up dark scenes into looking funny (and full of noise). But allowing a tiny bit of over-exposure might be a good thing. I won't change the default just yet (better let users test), but a moderate value of 1.2 might be better than the current 1.0 limit. Needs testing especially on dark scenes. --- DOCS/interface-changes.rst | 1 + DOCS/man/options.rst | 8 ++++++++ 2 files changed, 9 insertions(+) (limited to 'DOCS') diff --git a/DOCS/interface-changes.rst b/DOCS/interface-changes.rst index ce7e33176a..2fd30628d8 100644 --- a/DOCS/interface-changes.rst +++ b/DOCS/interface-changes.rst @@ -52,6 +52,7 @@ Interface changes The strength now linearly blends between the linear and nonlinear tone mapped versions of a color. - add --hdr-peak-decay-rate and --hdr-scene-threshold-low/high + - add --tone-mapping-max-boost --- mpv 0.29.0 --- - drop --opensles-sample-rate, as --audio-samplerate should be used if desired - drop deprecated --videotoolbox-format, --ff-aid, --ff-vid, --ff-sid, diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst index 0f7007bf89..e5a897ba4f 100644 --- a/DOCS/man/options.rst +++ b/DOCS/man/options.rst @@ -5235,6 +5235,14 @@ The following video options are currently all specific to ``--vo=gpu`` and linear Specifies the scale factor to use while stretching. Defaults to 1.0. +``--tone-mapping-max-boost=<1.0..10.0>`` + Upper limit for how much the tone mapping algorithm is allowed to boost + the average brightness by over-exposing the image. The default value of 1.0 + allows no additional brightness boost. A value of 2.0 would allow + over-exposing by a factor of 2, and so on. Raising this setting can help + reveal details that would otherwise be hidden in dark scenes, but raising + it too high will make dark scenes appear unnaturally bright. + ``--hdr-compute-peak=`` Compute the HDR peak and frame average brightness per-frame instead of relying on tagged metadata. 
These values are averaged over local regions as -- cgit v1.2.3 From 3f1bc25d4de6150b0acff7e92d3e3084a7d989f0 Mon Sep 17 00:00:00 2001 From: Niklas Haas Date: Fri, 4 Jan 2019 16:46:38 +0100 Subject: vo_gpu: use dB units for scene change detection Rather than the linear cd/m^2 units, these (relative) logarithmic units lend themselves much better to actually detecting scene changes, especially since the scene averaging was changed to also work logarithmically. --- DOCS/man/options.rst | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) (limited to 'DOCS') diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst index e5a897ba4f..2e15106bf0 100644 --- a/DOCS/man/options.rst +++ b/DOCS/man/options.rst @@ -5264,9 +5264,9 @@ The following video options are currently all specific to ``--vo=gpu`` and to excessive flicker. (In signal theory terms, this controls the time constant "tau" of an IIR low pass filter) -``--hdr-scene-threshold-low=<0..10000>``, ``--hdr-scene-threshold-high=<0..10000>`` - The lower and upper thresholds (in cd/m^2) for a brightness difference to - be considered a scene change (default: 50 low, 200 high). This is only +``--hdr-scene-threshold-low=<0.0..100.0>``, ``--hdr-scene-threshold-high=<0.0..100.0>`` + The lower and upper thresholds (in dB) for a brightness difference + to be considered a scene change (default: 5.5 low, 10.0 high). This is only relevant when ``--hdr-compute-peak`` is enabled. Normally, small fluctuations in the frame brightness are compensated for by the peak averaging mechanism, but for large jumps in the brightness this can result -- cgit v1.2.3 From 8f5a42b1a0764acd392a410ef21e95029352b01f Mon Sep 17 00:00:00 2001 From: Martin Herkt Date: Fri, 1 Mar 2019 12:39:12 +0100 Subject: options: do not enable WMV3 hwdec by default Crashes NVIDIA, probably buggy on others. No one ever tests this shit. See #2192 --- DOCS/man/options.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) (limited to 'DOCS') diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst index 2e15106bf0..d57ffa88d3 100644 --- a/DOCS/man/options.rst +++ b/DOCS/man/options.rst @@ -1092,7 +1092,7 @@ Video You can get the list of allowed codecs with ``mpv --vd=help``. Remove the prefix, e.g. instead of ``lavc:h264`` use ``h264``. - By default, this is set to ``h264,vc1,wmv3,hevc,vp9``. Note that + By default, this is set to ``h264,vc1,hevc,vp9``. Note that the hardware acceleration special codecs like ``h264_vdpau`` are not relevant anymore, and in fact have been removed from Libav in this form. -- cgit v1.2.3 From e37c253b9207980a33ff3789b560efa3c4b6eb3e Mon Sep 17 00:00:00 2001 From: zc62 Date: Mon, 4 Mar 2019 05:46:35 -0500 Subject: lcms: allow infinite contrast Fixes #5980 --- DOCS/man/options.rst | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) (limited to 'DOCS') diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst index d57ffa88d3..1f1c2138c9 100644 --- a/DOCS/man/options.rst +++ b/DOCS/man/options.rst @@ -5357,12 +5357,14 @@ The following video options are currently all specific to ``--vo=gpu`` and Size of the 3D LUT generated from the ICC profile in each dimension. Default is 64x64x64. Sizes may range from 2 to 512. -``--icc-contrast=<0-1000000>`` +``--icc-contrast=<0-1000000|inf>`` Specifies an upper limit on the target device's contrast ratio. This is detected automatically from the profile if possible, but for some profiles it might be missing, causing the contrast to be assumed as infinite. As a result, video may appear darker than intended. 
This only affects BT.1886 - content. The default of 0 means no limit. + content. The default of 0 means no limit if the detected contrast is less + than 100000, and limits to 1000 otherwise. Use ``--icc-contrast=inf`` to + preserve the infinite contrast (most likely when using OLED displays). ``--blend-subtitles=<yes|video|no>`` Blend subtitles directly onto upscaled video frames, before interpolation -- cgit v1.2.3
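
The ``--hdr-compute-peak`` entries above describe the redesigned peak detection in prose: a single-pole IIR low-pass filter smooths the measured per-frame peak (its time constant set by ``--hdr-peak-decay-rate``), and a smoothstep curve between ``--hdr-scene-threshold-low`` and ``--hdr-scene-threshold-high`` (in dB) makes that filter progressively more aggressive across scene changes. The C sketch below only illustrates that description; it is not mpv's vo_gpu code, and the helper names, the ``coeff = 1/decay_rate`` simplification, and the ``10*log10`` dB convention are assumptions of the sketch::

    #include <math.h>

    /* Classic Hermite smoothstep, clamped to [0,1]. */
    static double smoothstep(double lo, double hi, double x)
    {
        double t = (x - lo) / fmax(hi - lo, 1e-6);
        t = fmax(0.0, fmin(1.0, t));
        return t * t * (3.0 - 2.0 * t);
    }

    /* Fold one frame's measured peak (cd/m^2) into the running estimate. */
    static double update_peak(double avg_peak, double frame_peak,
                              double decay_rate, /* --hdr-peak-decay-rate       */
                              double low_db,     /* --hdr-scene-threshold-low   */
                              double high_db)    /* --hdr-scene-threshold-high  */
    {
        /* Single-pole IIR: the time constant is roughly decay_rate frames;
         * decay_rate == 1.0 gives coeff == 1.0, i.e. no averaging at all. */
        double coeff = 1.0 / decay_rate;

        /* Brightness difference in dB, treating luminance as a power-like
         * quantity (so a 2x jump is about 3 dB). */
        double ratio = fmax(frame_peak, 1e-3) / fmax(avg_peak, 1e-3);
        double diff_db = fabs(10.0 * log10(ratio));

        /* Below low_db the filter is left alone; between the thresholds it
         * is made progressively more aggressive; at or above high_db it
         * becomes instant, adopting the new frame's peak directly. */
        double scene = smoothstep(low_db, high_db, diff_db);
        coeff += (1.0 - coeff) * scene;

        return avg_peak + coeff * (frame_peak - avg_peak);
    }

With the documented defaults (decay rate 100, thresholds 5.5 dB and 10.0 dB), small frame-to-frame flicker barely moves the running peak, while a jump of 10 dB or more is adopted almost immediately. For trying these patches out, the new and retuned options can simply be combined on one command line, e.g. ``mpv --hdr-compute-peak=auto --hdr-peak-decay-rate=100 --hdr-scene-threshold-low=5.5 --hdr-scene-threshold-high=10 --tone-mapping-desaturate=0.75 --tone-mapping-max-boost=1.2 --replaygain=album video.mkv`` (the file name and the non-default ``--tone-mapping-max-boost`` value are only illustrative).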