From d78bde15ab4be5f46a6fb5fc5a35d6acbc6c39cf Mon Sep 17 00:00:00 2001
From: wm4
Date: Thu, 27 Dec 2012 18:07:37 +0100
Subject: vo_opengl_old: reject 9-15 bit formats if textures have less than 16 bit

For 9-15 bit material, cutting off the lower bits leads to significant
quality reduction, because these formats leave the most significant bits
unused (e.g. 10 bit padded to 16 bit, transferred as 8 bit -> only 2 bits
left). 16 bit formats still can be played like this, as cutting the lower
bits merely reduces quality in this case.

This problem was encountered with the following GPU/driver combination:

    OpenGL vendor string: Intel Open Source Technology Center
    OpenGL renderer string: Mesa DRI Intel(R) 915GM x86/MMX/SSE2
    OpenGL version string: 1.4 Mesa 9.0.1

It appears 16 bit support is rather common on GPUs, so testing the actual
texture depth wasn't needed until now. (There are some other Mesa
GPU/driver combinations which support 16 bit only when using RG textures
instead of LUMINANCE_ALPHA. This is due to OpenGL driver bugs.)
---
 video/out/gl_common.c | 1 +
 1 file changed, 1 insertion(+)

diff --git a/video/out/gl_common.c b/video/out/gl_common.c
index 00e21ff312..42d035337f 100644
--- a/video/out/gl_common.c
+++ b/video/out/gl_common.c
@@ -364,6 +364,7 @@ struct gl_functions gl_functions[] = {
             DEF_FN_HARD(DrawArrays),
             DEF_FN_HARD(GetString),
             DEF_FN_HARD(GetError),
+            DEF_FN_HARD(GetTexLevelParameteriv),
             {0}
         },
     },
--
cgit v1.2.3
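
Note: the hunk above only registers GetTexLevelParameteriv as a hard dependency in
gl_common.c; the depth check itself in vo_opengl_old is not part of this diff. The
following is a minimal, hypothetical C sketch of how such a probe could look. The
function names, the 64x64 dummy texture size, and the global GL symbols are
illustrative assumptions, not the code this patch touches; only legacy OpenGL
calls and enums are used, matching the LUMINANCE path mentioned in the message.

/*
 * Hypothetical sketch of the texture depth probe motivating this patch.
 * Not the actual vo_opengl_old implementation.
 */
#include <stdbool.h>
#include <GL/gl.h>

// Ask the driver how many bits per component it really stores for a
// LUMINANCE16 texture (e.g. the Intel Mesa driver above reports 8, not 16).
static int probe_luminance16_depth(void)
{
    GLuint tex;
    GLint bits = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    // Allocate a dummy 64x64 LUMINANCE16 texture without uploading pixels.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, 64, 64, 0,
                 GL_LUMINANCE, GL_UNSIGNED_SHORT, NULL);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_LUMINANCE_SIZE,
                             &bits);
    glDeleteTextures(1, &tex);
    return bits;
}

// Accept a video bit depth only if rendering it will not discard the bits
// that actually carry information.
static bool bit_depth_ok(int video_bits)
{
    if (video_bits <= 8)
        return true;                       // plain 8 bit always works
    if (probe_luminance16_depth() >= 16)
        return true;                       // driver stores real 16 bit
    // 9-15 bit data sits in the low bits of a 16 bit word; if the texture
    // keeps only the top 8 bits, e.g. 10 bit input retains just 2 bits.
    // 16 bit input merely loses its least significant bits, so allow it.
    return video_bits == 16;
}

In the player itself such a probe would presumably go through the GL function
table that this patch extends (hence the new GetTexLevelParameteriv entry)
rather than the global symbols used here for brevity.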