From d78bde15ab4be5f46a6fb5fc5a35d6acbc6c39cf Mon Sep 17 00:00:00 2001
From: wm4
Date: Thu, 27 Dec 2012 18:07:37 +0100
Subject: vo_opengl_old: reject 9-15 bit formats if textures have less than
 16 bit

For 9-15 bit material, cutting off the lower bits leads to significant
quality reduction, because these formats leave the most significant bits
unused (e.g. 10 bit padded to 16 bit, transferred as 8 bit -> only 2 bits
left). 16 bit formats can still be played like this, as cutting the lower
bits merely reduces quality in that case.

This problem was encountered with the following GPU/driver combination:

    OpenGL vendor string: Intel Open Source Technology Center
    OpenGL renderer string: Mesa DRI Intel(R) 915GM x86/MMX/SSE2
    OpenGL version string: 1.4 Mesa 9.0.1

It appears 16 bit support is rather common on GPUs, so testing the actual
texture depth wasn't needed until now. (There are some other Mesa
GPU/driver combinations which support 16 bit only when using RG textures
instead of LUMINANCE_ALPHA. This is due to OpenGL driver bugs.)
---
 video/out/gl_common.h | 2 ++
 1 file changed, 2 insertions(+)

(limited to 'video/out/gl_common.h')

diff --git a/video/out/gl_common.h b/video/out/gl_common.h
index de893966df..4afc192343 100644
--- a/video/out/gl_common.h
+++ b/video/out/gl_common.h
@@ -318,6 +318,8 @@ struct GL {
     void (GLAPIENTRY *EnableClientState)(GLenum);
     void (GLAPIENTRY *DisableClientState)(GLenum);
     GLenum (GLAPIENTRY *GetError)(void);
+    void (GLAPIENTRY *GetTexLevelParameteriv)(GLenum, GLint, GLenum, GLint *);
+
     void (GLAPIENTRY *GenBuffers)(GLsizei, GLuint *);
     void (GLAPIENTRY *DeleteBuffers)(GLsizei, const GLuint *);
--
cgit v1.2.3
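
The hunk shown here only adds the GetTexLevelParameteriv entry point to the
GL function table; the depth check itself lives in the parts of the commit
not included in this filtered diff. The following is a minimal sketch of how
such a probe could look, assuming the GL wrapper struct from gl_common.h;
the helper name probe_luminance_depth() and the choice of a 64x64
GL_LUMINANCE16 test texture are illustrative, not taken from the actual
implementation.

    #include "video/out/gl_common.h"

    /* Hypothetical sketch: measure how many bits per component the driver
     * actually stores for a 16 bit luminance texture. */
    static int probe_luminance_depth(GL *gl)
    {
        GLuint tex;
        GLint bits = 0;

        gl->GenTextures(1, &tex);
        gl->BindTexture(GL_TEXTURE_2D, tex);

        /* Request a 16 bit luminance texture; the driver may silently fall
         * back to a lower-precision internal format. */
        gl->TexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, 64, 64, 0,
                       GL_LUMINANCE, GL_UNSIGNED_SHORT, NULL);

        /* Ask how many bits per component were actually allocated. */
        gl->GetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                                   GL_TEXTURE_LUMINANCE_SIZE, &bits);

        gl->DeleteTextures(1, &tex);
        return bits;
    }

During format negotiation the VO would then refuse 9-15 bit formats whenever
such a probe reports fewer than 16 bits, while still accepting 8 and 16 bit
input, since truncation there only costs quality rather than discarding most
of the significant bits.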