path: root/video/out/opengl/hwdec_vaglx.c
* vo_opengl: refactor how hwdec interop exports textures (wm4, 2016-05-10; 1 file changed, -7/+15)
  Rename gl_hwdec_driver.map_image to map_frame, and let it fill out a struct
  gl_hwdec_frame describing the exact texture layout. This gives more
  flexibility to what the hwdec interop can export. In particular, it can
  export strange component orders/permutations and textures with padded size.
  (The latter originating from cropped video.)

  The way gl_hwdec_frame works is in the spirit of the rest of the vo_opengl
  video processing code, which tends to put as much information in immediate
  state (as part of the dataflow), instead of declaring it globally. To some
  degree this duplicates the texplane and img_tex structs, but until we
  somehow unify those, it's better to give the hwdec state its own struct.
  The fact that changing the hwdec struct would require changes and testing
  on at least 4 platform/GPU combinations makes duplicating it almost a
  requirement to avoid pain later.

  Make gl_hwdec_driver.reinit set the new image format and remove the
  gl_hwdec.converted_imgfmt field. Likewise, gl_hwdec.gl_texture_target is
  replaced with gl_hwdec_plane.gl_target.

  Split out an init_image_desc function from init_format. The latter is not
  called in the hwdec case at all anymore. Setting up most of struct texplane
  is also completely separate in the hwdec and normal cases.

  video.c does not check whether the hwdec "mapped" image format is
  supported. This should not really happen anyway, and if it does, the hwdec
  interop backend must fail at creation time, so this is not an issue.
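  For illustration, a minimal sketch of what such a per-frame description
  might look like (field names, the fixed plane count, and the GL header are
  assumptions based on this message, not the exact definitions in hwdec.h):

      #include <GL/gl.h>

      struct gl_hwdec;
      struct mp_image;

      /* Sketch only: a per-plane descriptor carrying the GL target (which
       * replaces gl_hwdec.gl_texture_target) and the possibly padded
       * allocated texture size. */
      struct gl_hwdec_plane {
          GLuint gl_texture;  /* texture exported by the interop backend */
          GLenum gl_target;   /* e.g. GL_TEXTURE_2D */
          int tex_w, tex_h;   /* allocated size; may exceed the visible size */
      };

      /* The frame description map_frame fills out on every map call,
       * keeping the layout in immediate state rather than global state. */
      struct gl_hwdec_frame {
          struct gl_hwdec_plane planes[4];
      };

      /* The renamed driver entry point might then be declared as: */
      int (*map_frame)(struct gl_hwdec *hw, struct mp_image *hw_image,
                       struct gl_hwdec_frame *out_frame);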
* video: refactor how VO exports hwdec device handles (wm4, 2016-05-09; 1 file changed, -3/+4)
  The main change is with video/hwdec.h. mp_hwdec_info is made opaque (and
  renamed to mp_hwdec_devices). Its accessors are mainly thread-safe (or
  documented where not), which makes the whole thing saner and cleaner. In
  particular, thread-safety rules become less subtle and more obvious.

  The new internal API makes it easier to support multiple OpenGL interop
  backends. (Although this is not done yet, and it's not clear whether it
  ever will be.)

  This also removes all the API-specific fields from mp_hwdec_ctx and
  replaces them with a "ctx" field. For d3d in particular, we drop the
  mp_d3d_ctx struct completely and pass the interfaces directly. Remove the
  emulation checks from vaapi.c and vdpau.c; they are pointless, and the
  checks that matter are done on the VO layer.

  The d3d hardware decoders might slightly change behavior: dxva2-copy will
  not use the VO device anymore if the VO supports proper interop. This
  pretty much assumes that in such cases the VO will not use any form of
  exclusive mode, which makes using the VO device in copy mode unnecessary.

  This is a big refactor. Some things may be untested and could be broken.
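  As a hedged sketch of the shape this change implies (all names here are
  assumptions; the real declarations live in video/hwdec.h):

      /* Opaque: callers can only go through accessor functions, which is
       * what lets locking be handled internally and uniformly. */
      struct mp_hwdec_devices;

      /* A generic "ctx" payload replaces the per-API fields (mp_d3d_ctx
       * and friends). */
      struct mp_hwdec_ctx {
          int type;   /* which hwdec API the device belongs to */
          void *ctx;  /* API-specific handle, e.g. a VADisplay */
      };

      struct mp_hwdec_devices *hwdec_devices_create(void);
      void hwdec_devices_add(struct mp_hwdec_devices *devs,
                             struct mp_hwdec_ctx *ctx);
      /* Lookups go through an accessor, so thread-safety stays internal. */
      struct mp_hwdec_ctx *hwdec_devices_get(struct mp_hwdec_devices *devs,
                                             int hwdec_type);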
* vo_opengl: hwdec: use IDs for API, and log which backend is used (wm4, 2016-02-01; 1 file changed, -1/+2)
  Since there can be multiple backends for a single API (vaapi can use GLX
  or EGL), not logging the exact backend name is annoying. So add it. At the
  same time, there is no need to duplicate the name as used by the --hwdec
  options, so replace it with the numeric hwdec API ID.
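  A rough sketch of the idea (struct layout and names are assumptions for
  illustration):

      /* Each interop backend keeps a human-readable name for logging, while
       * the API it implements is identified by the same numeric ID used for
       * --hwdec option values, so the option name isn't duplicated. */
      struct gl_hwdec_driver {
          const char *name;  /* e.g. "vaapi-glx" vs. "vaapi-egl" */
          int api;           /* numeric hwdec API ID, e.g. HWDEC_VAAPI */
          /* ... create/reinit/map/destroy callbacks ... */
      };

      /* After a successful load, the exact backend can then be reported,
       * along the lines of:
       *   MP_VERBOSE(hw, "Loaded hwdec backend '%s'\n", hw->driver->name);
       */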
* vo_opengl: never load vaapi GLX interop by default (wm4, 2015-11-09; 1 file changed, -4/+4)
  Causes more harm than it helps. Will eventually be removed.

  Also rename the "reject_emulated" field to "probing" - this is more
  appropriate now.
* vaapi: remove dependency on X11 (wm4, 2015-09-27; 1 file changed, -0/+1)
  There are at least 2 ways of using VAAPI without X11 (Wayland, DRM).
  Remove the X11 requirement from the decoder part and the EGL interop.
  This will be used by a following commit, which adds Wayland support.

  The worst part of this is the decoder, which includes a bad hack for using
  the decoder without any VO interop (also known as "vaapi-copy" mode).
  Separate the X11 parts so that they're self-contained.

  For the EGL interop code we do something similar (it's kept slightly
  simpler, because it essentially only has to translate between our silly
  MPGetNativeDisplay abstraction and the vaGetDisplay...() call).
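  A hypothetical sketch of that translation step (the helper name and
  context type are assumptions; the libva entry points are real):

      #include <stddef.h>
      #include <va/va_x11.h>      /* vaGetDisplay() */
      #include <va/va_wayland.h>  /* vaGetDisplayWl() */

      struct MPGLContext;
      /* Assumed helper: returns a native display handle by name, or NULL. */
      void *mpgl_get_native_display(struct MPGLContext *ctx, const char *name);

      static VADisplay create_va_display(struct MPGLContext *ctx)
      {
          void *x11 = mpgl_get_native_display(ctx, "x11");
          if (x11)
              return vaGetDisplay(x11);   /* X11/GLX path */
          void *wl = mpgl_get_native_display(ctx, "wl");
          if (wl)
              return vaGetDisplayWl(wl);  /* Wayland path for the EGL interop */
          return NULL;
      }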
* vo_opengl: actually set hardware decoder mapped texture format (wm4, 2015-09-24; 1 file changed, -1/+1)
  Surfaces used by hardware decoding formats can be mapped exactly like a
  specific software pixel format, e.g. RGBA or NV12. p->image_params is
  supposed to be set to this format, but it wasn't. (How did this ever
  work?)

  Also, setting params->imgfmt in the hwdec interop drivers is pointless and
  redundant. (Change them to asserts, because why not.)
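  A fragment-level sketch of the fix (names taken from the commit message;
  the surrounding structures are assumed, not quoted from the code):

      /* In vo_opengl's hwdec init path: adopt the software format the
       * surfaces actually map to (e.g. IMGFMT_NV12), which was missing: */
      p->image_params.imgfmt = p->hwdec->converted_imgfmt;

      /* In the interop drivers, setting params->imgfmt was redundant; it
       * becomes a consistency check instead (expected format assumed): */
      assert(params->imgfmt == expected_mapped_fmt);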
* vo_opengl: remove gl_ prefixes from files in video/out/opengl (Niklas Haas, 2015-09-09; 1 file changed, -0/+202)
This is a bit redundant with the name of the directory itself, and not in line with existing naming conventions.