From a18dc01655b86de040b5a3a02ffcad694b843b17 Mon Sep 17 00:00:00 2001
From: wm4
Date: Sun, 5 Apr 2015 22:44:22 +0200
Subject: vaapi: fight with Intel's broken video decoding GL interop

Use texture-from-pixmap instead of vaapi's "native" GLX support.

Apparently the latter is unused by other projects. Possibly it's broken
due to that, and to Intel's inability to provide anything non-broken in
relation to video.

The new code basically uses the X11 output method on an in-memory
pixmap, and maps this pixmap as a texture using standard GLX
mechanisms. This requires a lot of X11 and GLX boilerplate, so the code
grows.

(I don't know why libva's GLX interop doesn't just do the same under
the hood, instead of bothering the world with their broken/unmaintained
"old" method, whatever it did. I suspect that Intel programmers are
just genuine sadists.)

This change was suggested in issue #1765.

The old GLX support is removed, as it's redundant and broken anyway.

One remaining issue is that the first vaPutSurface() call fails with an
unknown error. It returns -1, which is pretty strange, because vaapi
error codes are normally positive. It happened with the old GLX code
too, but does not happen with vo_vaapi. I couldn't find out why.
---
 wscript | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

(limited to 'wscript')

diff --git a/wscript b/wscript
index bba7e3996d..4cf92d977d 100644
--- a/wscript
+++ b/wscript
@@ -626,7 +626,7 @@ video_output_features = [
         'name': '--vaapi-glx',
         'desc': 'VAAPI GLX',
         'deps': [ 'vaapi', 'gl-x11' ],
-        'func': check_pkg_config('libva-glx', '>= 0.32.0'),
+        'func': check_true,
     }, {
         'name': '--caca',
         'desc': 'CACA',
-- 
cgit v1.2.3
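
For illustration only, a rough sketch of the texture-from-pixmap path the
commit message describes: vaPutSurface() renders the decoded surface into a
plain X11 pixmap, and GLX_EXT_texture_from_pixmap binds that pixmap as a GL
texture. This is not the mpv code from this commit; the function name and the
surrounding setup (a current GL context, a GLXFBConfig "fbc" chosen with
GLX_BIND_TO_TEXTURE_RGB_EXT, a matching 24-bit pixmap depth, error handling)
are assumptions.

/* Sketch, not the actual mpv implementation; see assumptions above. */
#include <X11/Xlib.h>
#include <GL/gl.h>
#include <GL/glx.h>
#include <va/va.h>
#include <va/va_x11.h>

typedef void (*PFN_BindTexImageEXT)(Display *, GLXDrawable, int, const int *);
typedef void (*PFN_ReleaseTexImageEXT)(Display *, GLXDrawable, int);

static void put_surface_as_texture(Display *x11, GLXFBConfig fbc,
                                   VADisplay vadpy, VASurfaceID surface,
                                   int w, int h, GLuint texture)
{
    /* GLX_EXT_texture_from_pixmap entry points are resolved at runtime. */
    PFN_BindTexImageEXT bind_tex = (PFN_BindTexImageEXT)
        glXGetProcAddressARB((const GLubyte *)"glXBindTexImageEXT");
    PFN_ReleaseTexImageEXT release_tex = (PFN_ReleaseTexImageEXT)
        glXGetProcAddressARB((const GLubyte *)"glXReleaseTexImageEXT");

    /* 1. A plain X11 pixmap; this is what the vaapi X11 output path draws to. */
    Pixmap pixmap = XCreatePixmap(x11, DefaultRootWindow(x11), w, h, 24);

    /* 2. Wrap the pixmap as a GLX drawable that can be bound as a texture. */
    const int attribs[] = {
        GLX_TEXTURE_TARGET_EXT, GLX_TEXTURE_2D_EXT,
        GLX_TEXTURE_FORMAT_EXT, GLX_TEXTURE_FORMAT_RGB_EXT,
        None,
    };
    GLXPixmap glx_pixmap = glXCreatePixmap(x11, fbc, pixmap, attribs);

    /* 3. Let libva copy/convert the decoded surface into the pixmap. */
    vaPutSurface(vadpy, surface, pixmap, 0, 0, w, h, 0, 0, w, h,
                 NULL, 0, VA_FRAME_PICTURE);
    XSync(x11, False);

    /* 4. Texture-from-pixmap: bind the pixmap contents to the GL texture. */
    glBindTexture(GL_TEXTURE_2D, texture);
    bind_tex(x11, glx_pixmap, GLX_FRONT_EXT, NULL);
    /* ... render with the texture here ... */
    release_tex(x11, glx_pixmap, GLX_FRONT_EXT);

    glXDestroyPixmap(x11, glx_pixmap);
    XFreePixmap(x11, pixmap);
}

This is presumably also why the wscript hunk above can drop the
check_pkg_config('libva-glx', ...) probe in favor of check_true: the interop
no longer calls into the separate libva-glx library, so --vaapi-glx only
needs the 'vaapi' and 'gl-x11' dependencies.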