author    wm4 <wm4@nowhere>  2015-04-05 22:44:22 +0200
committer wm4 <wm4@nowhere>  2015-04-05 22:48:40 +0200
commit    a18dc01655b86de040b5a3a02ffcad694b843b17 (patch)
tree      06b4cc290b134187805f8f8ead35259eb1fa95fc /wscript
parent    20160fa2e13641acb4276ce68f7fad9025425b77 (diff)
vaapi: fight with Intel's broken video decoding GL interop
Use texture-from-pixmap instead of vaapi's "native" GLX support.

Apparently the latter is unused by other projects. Possibly it's broken due to that, and Intel's inability to provide anything non-broken in relation to video.

The new code basically uses the X11 output method on an in-memory pixmap, and maps this pixmap as a texture using standard GLX mechanisms. This requires a lot of X11 and GLX boilerplate, so the code grows.

(I don't know why libva's GLX interop doesn't just do the same under the hood, instead of bothering the world with their broken/unmaintained "old" method, whatever it did. I suspect that Intel programmers are just genuine sadists.)

This change was suggested in issue #1765. The old GLX support is removed, as it's redundant and broken anyway.

One remaining issue is that the first vaPutSurface() call fails with an unknown error. It returns -1, which is pretty strange, because vaapi error codes are normally positive. It happened with the old GLX code too, but does not happen with vo_vaapi. I couldn't find out why.
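For context, a minimal sketch of the standard GLX texture-from-pixmap mechanism the message refers to (this is illustrative, not mpv's actual implementation): the decoded frame is first written to an X11 Pixmap, e.g. via vaPutSurface(), and that pixmap is then bound as a GL texture through the GLX_EXT_texture_from_pixmap extension. The function name `bind_pixmap_as_texture` is hypothetical; a real implementation needs a running X server, a current GLX context, and error handling, all omitted here.

```c
#include <GL/glx.h>
#include <GL/glxext.h>
#include <X11/Xlib.h>

/* Hypothetical helper: wrap an already-rendered X Pixmap as a GL texture. */
static GLXPixmap bind_pixmap_as_texture(Display *dpy, Pixmap pixmap,
                                        GLuint texture)
{
    /* Pick an FBConfig that allows binding an RGBA pixmap as a texture. */
    const int fb_attribs[] = {
        GLX_BIND_TO_TEXTURE_RGBA_EXT, True,
        GLX_DRAWABLE_TYPE, GLX_PIXMAP_BIT,
        GLX_BIND_TO_TEXTURE_TARGETS_EXT, GLX_TEXTURE_2D_BIT_EXT,
        GLX_DOUBLEBUFFER, False,
        None
    };
    int num = 0;
    GLXFBConfig *fbc = glXChooseFBConfig(dpy, DefaultScreen(dpy),
                                         fb_attribs, &num);

    /* Wrap the X pixmap in a GLXPixmap that can back a GL_TEXTURE_2D. */
    const int pix_attribs[] = {
        GLX_TEXTURE_TARGET_EXT, GLX_TEXTURE_2D_EXT,
        GLX_TEXTURE_FORMAT_EXT, GLX_TEXTURE_FORMAT_RGBA_EXT,
        None
    };
    GLXPixmap glx_pixmap = glXCreatePixmap(dpy, fbc[0], pixmap, pix_attribs);
    XFree(fbc);

    /* glXBindTexImageEXT is an extension entry point and must be looked up. */
    PFNGLXBINDTEXIMAGEEXTPROC BindTexImageEXT =
        (PFNGLXBINDTEXIMAGEEXTPROC)glXGetProcAddressARB(
            (const GLubyte *)"glXBindTexImageEXT");

    glBindTexture(GL_TEXTURE_2D, texture);
    BindTexImageEXT(dpy, glx_pixmap, GLX_FRONT_EXT, NULL);
    /* ...draw with `texture`; call glXReleaseTexImageEXT() before the
     * next vaPutSurface() writes into the pixmap again... */
    return glx_pixmap;
}
```

The extra boilerplate the message mentions comes from exactly this kind of code: FBConfig selection, pixmap wrapping, extension lookup, and the bind/release cycle around every frame update.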
Diffstat (limited to 'wscript')
-rw-r--r--  wscript  2
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/wscript b/wscript
index bba7e3996d..4cf92d977d 100644
--- a/wscript
+++ b/wscript
@@ -626,7 +626,7 @@ video_output_features = [
'name': '--vaapi-glx',
'desc': 'VAAPI GLX',
'deps': [ 'vaapi', 'gl-x11' ],
- 'func': check_pkg_config('libva-glx', '>= 0.32.0'),
+ 'func': check_true,
}, {
'name': '--caca',
'desc': 'CACA',