authorwm4 <wm4@nowhere>2016-07-06 13:38:43 +0200
committerwm4 <wm4@nowhere>2016-07-06 13:38:43 +0200
commit0b1ef814986cac9b812a24fbf23252d6854f3c47 (patch)
tree94a5124ae1da19e036b581e73130e5d7f55b202c /video/filter
parentfc76966d9ee5e5ed19f0cf1566a7b94c9e1ac45b (diff)
video: fix deinterlace filter handling on pixel format changes
The test scenario at hand was hardware decoding a file with d3d11 and with deinterlacing enabled. The file switches to a non-hardware-decodable format mid-stream. This failed, because it tried to call vf_reconfig() with the old filters inserted, which was fatal due to vf_d3d11vpp accepting only hardware input formats. Fix this by always strictly removing all auto-inserted filters (including the deinterlacing one), and reconfiguring only after that.

Note that this change is good for other situations too, because we generally don't want to use a hardware deinterlacer for software decoding by default. They're not necessarily optimal, and VAAPI VPP even has incomprehensible deinterlacer bugs specifically with software frames not coming from a hardware decoder.
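The idea behind the fix can be sketched with a simplified, self-contained model of a filter chain: each filter records whether it was auto-inserted, and all such filters are stripped *before* the chain is reconfigured for the new format. The struct and function names below (`vf`, `chain_reconfig`, `chain_remove_autoinserted`) are hypothetical stand-ins, not mpv's real vf_chain/vf_reconfig() API, which is considerably more involved.

```c
#include <stdbool.h>
#include <stdlib.h>

/* Hypothetical, simplified model of a video filter chain. Each filter
 * remembers whether the player auto-inserted it (e.g. a hardware
 * deinterlacer) and whether it accepts hardware input formats only. */
struct vf {
    const char *name;
    bool autoinserted;   /* inserted by the player, not by the user */
    bool hw_input_only;  /* rejects software formats on reconfig */
    struct vf *next;
};

/* Reconfigure every filter for the new input format; returns false if
 * any filter rejects it (as vf_d3d11vpp would on a software format). */
static bool chain_reconfig(struct vf *head, bool hw_format)
{
    for (struct vf *f = head; f; f = f->next) {
        if (f->hw_input_only && !hw_format)
            return false;
    }
    return true;
}

/* The fix: strip all auto-inserted filters before reconfiguring, so a
 * mid-stream switch to a software format never reaches a filter that
 * only accepts hardware input. */
static void chain_remove_autoinserted(struct vf **head)
{
    while (*head) {
        if ((*head)->autoinserted) {
            struct vf *dead = *head;
            *head = dead->next;
            free(dead);
        } else {
            head = &(*head)->next;
        }
    }
}
```

With this ordering, reconfiguring after a switch to a software format succeeds on the pruned chain, and the player can later re-insert a deinterlacer appropriate for the new format if one is still requested.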
Diffstat (limited to 'video/filter')
0 files changed, 0 insertions, 0 deletions