From b196cadf9f9f6ea210db9236c2b26523a9a2719f Mon Sep 17 00:00:00 2001
From: Niklas Haas
Date: Mon, 17 Jul 2017 21:39:06 +0200
Subject: vo_opengl: support HDR peak detection

This is done via compute shaders. As a consequence, the tone mapping
algorithms had to be rewritten to compute their known constants in GLSL
(ahead of time), instead of doing it once. Didn't affect performance.

Using shmem/SSBO atomics in this way is extremely fast on nvidia, but
it might be slow on other platforms. Needs testing.

Unfortunately, setting up the SSBO still requires OpenGL calls, which
means I can't have it in video_shaders.c, where it belongs. But I'll
defer worrying about that until the backend refactor, since then I'll
be breaking up the video/video_shaders structure anyway.
---
 DOCS/man/options.rst | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/DOCS/man/options.rst b/DOCS/man/options.rst
index e73ef6eee4..0f59392feb 100644
--- a/DOCS/man/options.rst
+++ b/DOCS/man/options.rst
@@ -4752,6 +4752,14 @@ The following video options are currently all specific to ``--vo=opengl`` and
     linear
         Specifies the scale factor to use while stretching. Defaults to 1.0.
 
+``--hdr-compute-peak``
+    Compute the HDR peak per-frame, instead of relying on tagged metadata.
+    These values are averaged over local regions as well as over several
+    frames to prevent the value from jittering around too much. This option
+    basically gives you dynamic, per-scene tone mapping. Requires compute
+    shaders, which is a fairly recent OpenGL feature, and will probably also
+    perform horribly on some drivers, so enable at your own risk.
+
 ``--tone-mapping-desaturate=<value>``
     Apply desaturation for highlights that exceed this level of brightness.
     The higher the parameter, the more color information will be preserved. This
-- 
cgit v1.2.3
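
As a rough illustration of the shmem/SSBO atomic approach the commit
message describes, here is a minimal GLSL compute-shader sketch. It is
not mpv's actual shader: video_tex, frame_peak, the 16x16 work-group
size and the 10000x fixed-point scale are all made-up names and values.
Each work group reduces its tile's maximum luminance into a shared
variable with atomicMax(), and one invocation per group then folds that
result into a single SSBO counter with another atomicMax():

    #version 430
    layout(local_size_x = 16, local_size_y = 16) in;

    layout(binding = 0) uniform sampler2D video_tex;  // input frame, linear light (illustrative name)
    layout(std430, binding = 1) buffer peak_ssbo {
        uint frame_peak;                              // running maximum, fixed point (luminance * 10000)
    };

    shared uint group_peak;                           // per-work-group maximum

    void main() {
        if (gl_LocalInvocationIndex == 0u)
            group_peak = 0u;
        barrier();

        // clamp to the frame edge so partial tiles stay in bounds
        ivec2 pos = min(ivec2(gl_GlobalInvocationID.xy), textureSize(video_tex, 0) - 1);
        vec3 rgb = texelFetch(video_tex, pos, 0).rgb;
        float lum = max(max(rgb.r, rgb.g), rgb.b);

        // shared-memory atomic: reduce this tile's peak within the work group
        atomicMax(group_peak, uint(lum * 10000.0));
        barrier();

        // SSBO atomic: one invocation per group merges the tile's peak globally
        if (gl_LocalInvocationIndex == 0u)
            atomicMax(frame_peak, group_peak);
    }

The sketch just takes a global maximum for brevity; as the new
``--hdr-compute-peak`` text says, mpv additionally averages the values
over local regions and over several frames so the derived peak does not
jitter. The host side would read frame_peak back (e.g. via
glMapBufferRange) or leave the buffer bound for the tone-mapping pass.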