-rw-r--r--   DOCS/build-system.rst       199
-rw-r--r--   Makefile.new                 93
-rw-r--r--   TOOLS/configure_common.py   740
-rw-r--r--   TOOLS/makefile_common.mak    55
-rwxr-xr-x   configure                  1019
5 files changed, 2106 insertions, 0 deletions
diff --git a/DOCS/build-system.rst b/DOCS/build-system.rst
new file mode 100644
index 0000000000..8b8f122e6d
--- /dev/null
+++ b/DOCS/build-system.rst
@@ -0,0 +1,199 @@
+Build system overview
+=====================
+
+mpv's new build system is based on Python and completely replaces the previous
+./waf build system.
+
+This file describes internals. See the README in the top level directory for
+user help.
+
+User help (to be moved to README.md)
+====================================
+
+Compiling with full features requires development files for several
+external libraries. Below is a list of some important requirements.
+
+For a list of the available build options use `./configure --help`. If
+you think you have support for some feature installed but configure fails to
+detect it, the file `build/config.log` may contain information about the
+reasons for the failure.
+
+NOTE: To avoid cluttering the output with unreadable spam, `--help` only shows
+one of the many switches for each option. If the option is autodetected by
+default, the `--disable-***` switch is printed; if the option is disabled by
+default, the `--enable-***` switch is printed. Either way, you can use
+`--enable-***` or `--disable-***` regardless of what is printed by `--help`.
+By default, most features are auto-detected. You can use
+``--with-***=<yes|no|auto|default>`` for finer control over whether
+auto-detection is used for a feature.
+
+Example:
+
+ ./configure && make -j20
+
+If everything goes well, the mpv binary is created in the ``build`` directory.
+
+`make` alone can be used to rebuild parts of the player incrementally. After
+updating the sources, it's recommended to run `make dist-clean` and to rerun
+configure.
+
+See `./configure --help` for advanced usage.
+
+Motivation & Requirements
+=========================
+
+It's unclear what the fuck the author of the new build system was thinking.
+
+Big picture
+===========
+
+The configure script is written in Python. It generates config.h and config.mak
+files (and possibly more). By default these are written to a newly created
+"build" directory. It also writes a config.log there.
+
+The "actual" build system is based on GNU make (other make variants probably
+won't work). The Makefile in the project root is manually created by the build
+system "user" (i.e. the mpv developers), and is fixed and not changed by
+configure. It includes the configure-generated build/config.mak file for the
+variable parts. Header file dependencies are handled automatically with the
+``-MD`` compiler option (which the compiler must support).
+
+For out-of-tree builds, a small Makefile is generated that includes the one
+from the source directory. Simply call configure from another directory.
+(Note: this is broken, fails at generated files, and is also ugly.)
+
+By default, it attempts not to write any build output to the source tree, except
+to the "build" directory.
+
+Comparison to previous waf build system
+=======================================
+
+The new configure uses the same concept as our custom layer above waf, which
+made the checks generally declarative. In fact, most checks were ported
+almost verbatim, adapted only to the new syntax.
+
+Some of the internal and user-visible conventions are extremely similar. For
+example, the new system creates a build dir and writes to it by default.
+
+The choice of Python as implementation language is unfortunate. Shell was
+considered, but discarded for being too fragile, error prone, and PITA-ish.
+Lua would be reasonable, but is too fragmented, and requires external
+dependencies to do meaningful UNIX scripting. There is nothing else left that
+is widely supported enough, does not require external dependencies, and which
+I would be willing to touch without gloves. Bootstrapping a system
+implemented in C was considered, but deemed too problematic.
+
+mpv's custom configure
+======================
+
+All of the configuration process is handled with a mostly-declarative approach.
+Each configure check is a call to a "check" function, which takes various named
+arguments. The check function itself is always called, even if the
+corresponding feature is disabled; whether the actual test runs is decided
+inside it.
+
+A simple example using pkg-config would be::
+
+    check("-vdpau*",
+          desc = "VDPAU acceleration",
+          deps = "x11",
+          fn = lambda: check_pkg_config("vdpau >= 0.2"),
+          sources = ["video/filter/vf_vdpaupp.c",
+                     "video/out/vo_vdpau.c",
+                     "video/vdpau.c",
+                     "video/vdpau_mixer.c"])
+
+This defines a feature called ``vdpau`` which can be enabled or disabled by
+the user with configure flags (that's the meaning of the leading ``-``). This
+feature depends on another feature whose name is ``x11``, and the autodetection
+check consists of running ``pkg-config`` and looking for ``vdpau`` with version
+``>= 0.2``. If the check succeeds, ``#define HAVE_VDPAU 1`` will be added to
+``config.h``; if not, ``#define HAVE_VDPAU 0`` will be added (the ``*`` on the
+feature name triggers emitting of such defines).
+
+The define names are automatically prefixed with ``HAVE_``, converted to upper
+case, and some special characters are replaced with underscores. For example,
+a feature named ``gl-x11`` results in ``HAVE_GL_X11``.
+
+If the test succeeds, the listed source files are added to the build.
+
+Read the inline documentation of the check function in configure_common.py for
+details. The following text only gives a crude overview.
+
+Configure tests
+---------------
+
+The check function has a ``fn`` parameter. This function is called when it's
+time to perform actual configure checks. Most check calls in configure make
+this a lambda, so the actual code to run can be passed inline as a function
+argument. (This is similar to the old waf-based system, except that there,
+functions like check_pkg_config returned a function as their result, which hid
+the indirection.)
+
+One central function is ``check_cc``. It's quite similar to the waf-native
+function with the same name. One difference is that there is no ``mandatory``
+option - instead it always returns a bool for success. On success, the passed
+build flags are appended to the check's build flags. This makes it easier to
+compose checks. For example::
+
+    check(desc = "C11/C99",
+          fn = lambda: check_cc(flags = "-std=c11") or
+                       check_cc(flags = "-std=c99"),
+          required = "No C11 or C99 support.")
+
+This tries to use -std=c11, but allows a fallback to -std=c99.
+
+If the entire check fails, none of the build flags it collected are kept. For
+example, you could chain multiple tests like this::
+
+    check("-vapoursynth*",
+          fn = lambda: check_pkg_config("vapoursynth >= 24") and
+                       check_pkg_config("vapoursynth-script >= 23"))
+
+If the second check fails, the final executable won't link against
+``vapoursynth`` either, because the flags added by the first, successful call
+are discarded. (Note that this test could just make a single check_pkg_config
+call and pass each dependency as a separate argument, as sketched below.)
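+
+Roughly (check_pkg_config accepts any number of version expressions)::
+
+    check("-vapoursynth*",
+          fn = lambda: check_pkg_config("vapoursynth >= 24",
+                                        "vapoursynth-script >= 23"))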
+
+Source files
+------------
+
+configure generates the list of source files and writes it to config.mak. You
+can add source files at any point in configure, but normally they're added with
+the ``sources`` parameter in each feature check. This is done because a large
+number of source files depend on configure options, so having it all in the same
+place as the check is slightly nicer than having a separate conditional mess in
+the fixed Makefile.
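+
+For illustration, a check used purely to group always-enabled sources (no
+autodetection; the name and file list here are made up) could look like::
+
+    check("dummy-outputs",
+          desc = "dummy output backends",
+          sources = ["audio/out/ao_null.c",
+                     "video/out/vo_null.c"])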
+
+Configure phases, non-declarative actions
+-----------------------------------------
+
+configure was written to be as single-pass as possible. It doesn't even keep
+the checks in any list (only their outcomes are stored). Handling of ``--enable-...``
+etc. options is done while running configure. If you pass e.g.
+``--enable-doesntexist``, configure will complain about an unknown
+``doesntexist`` feature only once all checks have been actually run.
+
+Although this is slightly weird, it is done so that the ``configure`` file
+itself can be a flat file with simple top-down execution. It enables you to add
+arbitrary non-declarative checks and such between the ``check`` calls.
+
+One thing you need to be aware of is that if ``--help`` was passed to configure,
+it will run in "help mode". You may have to use ``is_running()`` to check
+whether it's in a mode where checks are actually executed. Outside of this mode,
+``dep_enabled()`` will fail.
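+
+For example, a non-declarative step placed between two ``check`` calls might
+look roughly like this (``win32-desktop`` is a made-up feature name)::
+
+    if is_running():
+        # dep_enabled() is only valid once the checks have actually run.
+        if dep_enabled("win32-desktop"):
+            set_exe_format("pe")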
+
+Makefile
+--------
+
+Although most source files are added from configure, this build system still
+may require you to write some make rules by hand. In particular, generated
+files are not handled by configure.
+
+make is bad. It's hard to use, hard to debug, and extremely fragile. It may be
+replaced by something else in the future, including the possibility of turning
+configure into waf-light.
+
+Variables:
+
+``BUILD``
+    The directory for build output. Can be a relative path, usually set to
+    ``build``.
+
+``ROOT``
+    The directory that contains ``configure``. Usually the root directory
+    of the repository. Source files need to be addressed relative to this
+    path. Can be a relative path, usually set to ``.``.
diff --git a/Makefile.new b/Makefile.new
new file mode 100644
index 0000000000..58ce1a0319
--- /dev/null
+++ b/Makefile.new
@@ -0,0 +1,93 @@
+BUILDDIR = build
+
+include $(BUILDDIR)/config.mak
+include $(ROOT)/TOOLS/makefile_common.mak
+
+PROJNAME = mpv
+
+.PHONY: .force
+
+$(BUILD)/generated/version.h: $(ROOT)/version.sh .force
+ $(LOG) "VERSION" $@
+ $(Q) mkdir -p $(@D)
+ $(Q) $(ROOT)/version.sh --versionh=$@
+
+$(BUILD)/generated/ebml_types.h $(BUILD)/generated/ebml_defs.c: $(ROOT)/TOOLS/matroska.py
+ $(LOG) "EBML" "$(BUILD)/generated/ebml_types.h $(BUILD)/generated/ebml_defs.c"
+ $(Q) mkdir -p $(@D)
+ $(Q) $< --generate-header > $(BUILD)/generated/ebml_types.h
+ $(Q) $< --generate-definitions > $(BUILD)/generated/ebml_defs.c
+
+$(BUILD)/generated/%.inc: $(ROOT)/TOOLS/file2string.py $(ROOT)/%
+ $(LOG) "INC" $@
+ $(Q) mkdir -p $(@D)
+ $(Q) $^ > $@
+
+# Dependencies for generated files unfortunately need to be declared manually.
+# This is because dependency scanning is a gross shitty hack by concept, and
+# requires that the compiler successfully compiles a file to get its
+# dependencies. This results in a chicken-and-egg problem, and in conclusion
+# it works for static header files only.
+# If any headers include generated headers, you need to manually set
+# dependencies on all source files that include these headers!
+# And because make is fucking shit, you actually need to set these on all files
+# that are generated from these sources, instead of the source files. Make rules
+# specify recipes, not dependencies.
+# (Possible counter measures: always generate them with an order dependency, or
+# introduce separate dependency scanner step for creating .d files.)
+
+$(BUILD)/common/version.o: $(BUILD)/generated/version.h
+
+$(BUILD)/osdep/mpv.o: $(BUILD)/generated/version.h
+
+$(BUILD)/demux/demux_mkv.o $(BUILD)/demux/ebml.o: \
+ $(BUILD)/generated/ebml_types.h $(BUILD)/generated/ebml_defs.c
+
+$(BUILD)/video/out/x11_common.o: $(BUILD)/generated/etc/mpv-icon-8bit-16x16.png.inc \
+ $(BUILD)/generated/etc/mpv-icon-8bit-32x32.png.inc \
+ $(BUILD)/generated/etc/mpv-icon-8bit-64x64.png.inc \
+ $(BUILD)/generated/etc/mpv-icon-8bit-128x128.png.inc
+
+$(BUILD)/input/input.o: $(BUILD)/generated/etc/input.conf.inc
+
+$(BUILD)/player/main.o: $(BUILD)/generated/etc/builtin.conf.inc
+
+$(BUILD)/sub/osd_libass.o: $(BUILD)/generated/sub/osd_font.otf.inc
+
+$(BUILD)/player/lua.o: $(BUILD)/generated/player/lua/defaults.lua.inc \
+ $(BUILD)/generated/player/lua/assdraw.lua.inc \
+ $(BUILD)/generated/player/lua/options.lua.inc \
+ $(BUILD)/generated/player/lua/osc.lua.inc \
+ $(BUILD)/generated/player/lua/ytdl_hook.lua.inc \
+ $(BUILD)/generated/player/lua/stats.lua.inc \
+    $(BUILD)/generated/player/lua/console.lua.inc
+
+$(BUILD)/player/javascript.o: $(BUILD)/generated/player/javascript/defaults.js.inc
+
+$(BUILD)/osdep/macosx_application.o $(BUILD)/video/out/cocoa_common.o: \
+ $(BUILD)/generated/TOOLS/osxbundle/mpv.app/Contents/Resources/icon.icns.inc
+
+# Why doesn't wayland just provide fucking libraries like anyone else, instead
+# of overly complex XML generation bullshit?
+# And fuck make too.
+
+# $(1): path prefix to the protocol, $(1)/$(2).xml is the full path.
+# $(2): the name of the protocol, without path or extension
+define generate_trash =
+$$(BUILD)/video/out/wayland_common.o \
+$$(BUILD)/video/out/opengl/context_wayland.o \
+: $$(BUILD)/generated/wayland/$(2).c $$(BUILD)/generated/wayland/$(2).h
+$$(BUILD)/generated/wayland/$(2).c: $(1)/$(2).xml
+ $$(LOG) "WAYSHC" $$@
+ $$(Q) mkdir -p $$(@D)
+ $$(Q) $$(WAYSCAN) private-code $$< $$@
+$$(BUILD)/generated/wayland/$(2).h: $(1)/$(2).xml
+ $$(LOG) "WAYSHH" $$@
+ $$(Q) mkdir -p $$(@D)
+ $$(Q) $$(WAYSCAN) client-header $$< $$@
+endef
+
+$(eval $(call generate_trash,$(WL_PROTO_DIR)/unstable/idle-inhibit,idle-inhibit-unstable-v1))
+$(eval $(call generate_trash,$(WL_PROTO_DIR)/stable/presentation-time,presentation-time))
+$(eval $(call generate_trash,$(WL_PROTO_DIR)/stable/xdg-shell,xdg-shell))
+$(eval $(call generate_trash,$(WL_PROTO_DIR)/unstable/xdg-decoration,xdg-decoration-unstable-v1))
diff --git a/TOOLS/configure_common.py b/TOOLS/configure_common.py
new file mode 100644
index 0000000000..ea2f32ea1a
--- /dev/null
+++ b/TOOLS/configure_common.py
@@ -0,0 +1,740 @@
+import atexit
+import os
+import shutil
+import subprocess
+import sys
+import tempfile
+
+# ...the fuck?
+NoneType = type(None)
+function = type(lambda: 0)
+
+programs_info = [
+ # env. name default
+ ("CC", "cc"),
+ ("PKG_CONFIG", "pkg-config"),
+ ("WINDRES", "windres"),
+ ("WAYSCAN", "wayland-scanner"),
+]
+
+install_paths_info = [
+ # env/opt default
+ ("PREFIX", "/usr/local"),
+ ("BINDIR", "$(PREFIX)/bin"),
+ ("LIBDIR", "$(PREFIX)/lib"),
+ ("CONFDIR", "$(PREFIX)/etc/$(PROJNAME)"),
+ ("INCDIR", "$(PREFIX)/include"),
+ ("DATADIR", "$(PREFIX)/share"),
+ ("MANDIR", "$(DATADIR)/man"),
+ ("DOCDIR", "$(DATADIR)/doc/$(PROJNAME)"),
+ ("HTMLDIR", "$(DOCDIR)"),
+ ("ZSHDIR", "$(DATADIR)/zsh"),
+ ("CONFLOADDIR", "$(CONFDIR)"),
+]
+
+# for help output only; code grabs them manually
+other_env_vars = [
+ # env # help text
+ ("CFLAGS", "User C compiler flags to append."),
+ ("CPPFLAGS", "Also treated as C compiler flags."),
+ ("LDFLAGS", "C compiler flags for link command."),
+ ("TARGET", "Prefix for default build tools (for cross compilation)"),
+ ("CROSS_COMPILE", "Same as TARGET."),
+]
+
+class _G:
+ help_mode = False # set if --help is specified on the command line
+
+ log_file = None # opened log file
+
+ temp_path = None # set to a private, writable temporary directory
+ build_dir = None
+ root_dir = None
+ out_of_tree = False
+
+ install_paths = {} # var name to path, see install_paths_info
+
+ programs = {} # key is symbolic name, like CC, value is string of
+ # executable name - only set if check_program was called
+
+ exe_format = "elf"
+
+ cflags = []
+ ldflags = []
+
+ config_h = "" # new contents of config.h (written at the end)
+ config_mak = "" # new contents of config.mak (written at the end)
+
+ sources = []
+
+ state_stack = []
+
+ feature_opts = {} # keyed by option name, values are:
+ # "yes": force enable, like --enable-<feature>
+ # "no": force disable, like: --disable-<feature>
+ # "auto": force auto detection, like --with-<feature>=auto
+ # "default": default (same as option not given)
+
+ dep_enabled = {} # keyed by dependency identifier; value is a bool
+ # missing key means the check was not run yet
+
+
+# Convert a string to a C string literal. Adds the required "".
+def _c_quote_string(s):
+ s = s.replace("\\", "\\\\")
+ s = s.replace("\"", "\\\"")
+ return "\"%s\"" % s
+
+# Convert a string to a make variable. Escaping is annoying: sometimes, you add
+# e.g. arbitrary paths (=> everything escaped), but sometimes you want to keep
+# make variable use like $(...) unescaped.
+def _c_quote_makefile_var(s):
+ s = s.replace("\\", "\\\\")
+ s = s.replace("\"", "\\\"")
+    s = s.replace(" ", "\\ ") # probably
+ return s
+
+def die(msg):
+ sys.stderr.write("Fatal error: %s\n" % msg)
+ sys.stderr.write("Not updating build files.\n")
+ if _G.log_file:
+ _G.log_file.write("--- Stopping due to error: %s\n" % msg)
+ sys.exit(1)
+
+# To be called before any user checks are performed.
+def begin():
+ _G.root_dir = "."
+ _G.build_dir = "build"
+
+ for var, val in install_paths_info:
+ _G.install_paths[var] = val
+
+ for arg in sys.argv[1:]:
+ if arg.startswith("-"):
+ name = arg[1:]
+ if name.startswith("-"):
+ name = name[1:]
+ opt = name.split("=", 1)
+ name = opt[0]
+ val = opt[1] if len(opt) > 1 else ""
+ def noval():
+ if val:
+ die("Option --%s does not take a value." % name)
+ if name == "help":
+ noval()
+ _G.help_mode = True
+ continue
+ elif name.startswith("enable-"):
+ noval()
+ _G.feature_opts[name[7:]] = "yes"
+ continue
+ elif name.startswith("disable-"):
+ noval()
+ _G.feature_opts[name[8:]] = "no"
+ continue
+ elif name.startswith("with-"):
+ if val not in ["yes", "no", "auto", "default"]:
+ die("Option --%s requires 'yes', 'no', 'auto', or 'default'."
+ % name)
+ _G.feature_opts[name[5:]] = val
+ continue
+ uname = name.upper()
+ setval = None
+ if uname in _G.install_paths:
+ def set_install_path(name, val):
+ _G.install_paths[name] = val
+ setval = set_install_path
+ elif uname == "BUILDDIR":
+ def set_build_path(name, val):
+ _G.build_dir = val
+ setval = set_build_path
+ if not setval:
+ die("Unknown option: %s" % arg)
+ if not val:
+ die("Option --%s requires a value." % name)
+ setval(uname, val)
+ continue
+
+ if _G.help_mode:
+ print("Environment variables controlling choice of build tools:")
+ for name, default in programs_info:
+ print(" %-30s %s" % (name, default))
+
+ print("")
+ print("Environment variables/options controlling install paths:")
+ for name, default in install_paths_info:
+ print(" %-30s '%s' (also --%s)" % (name, default, name.lower()))
+
+ print("")
+ print("Other environment variables:")
+ for name, help in other_env_vars:
+ print(" %-30s %s" % (name, help))
+ print("In addition, pkg-config queries PKG_CONFIG_PATH.")
+ print("")
+ print("General build options:")
+ print(" %-30s %s" % ("--builddir=PATH", "Build directory (default: build)"))
+ print(" %-30s %s" % ("", "(Requires using 'make BUILDDIR=PATH')"))
+ print("")
+ print("Specific build configuration:")
+ # check() invocations will print the options they understand.
+ return
+
+ _G.temp_path = tempfile.mkdtemp(prefix = "mpv-configure-")
+ def _cleanup():
+ shutil.rmtree(_G.temp_path)
+ atexit.register(_cleanup)
+
+ # (os.path.samefile() is "UNIX only")
+ if os.path.realpath(sys.path[0]) != os.path.realpath(os.getcwd()):
+ print("This looks like an out of tree build.")
+ print("This doesn't actually work.")
+ # Keep the build dir; this makes it less likely to accidentally trash
+ # an existing dir, especially if dist-clean (wipes build dir) is used.
+ # Also, this will work even if the same-directory check above was wrong.
+ _G.build_dir = os.path.join(os.getcwd(), _G.build_dir)
+ _G.root_dir = sys.path[0]
+ _G.out_of_tree = True
+
+ os.makedirs(_G.build_dir, exist_ok = True)
+ _G.log_file = open(os.path.join(_G.build_dir, "config.log"), "w")
+
+ _G.config_h += "// Generated by configure.\n" + \
+ "#pragma once\n\n"
+
+
+# Check whether the type of the first argument is any of the following
+# arguments (which must be types). Returns val on success, and throws an
+# exception if type checking fails.
+# This is not very pythonic, but I'm trying to prevent bugs, so bugger off.
+def typecheck(val, *types):
+ vt = type(val)
+ for t in types:
+ if vt == t:
+ return val
+ raise Exception("Value '%s' of type %s not any of %s" % (val, type(val), types))
+
+# If val is None, return []
+# If val is a list, return val.
+# Otherwise, return [val]
+def normalize_list_arg(val):
+ if val is None:
+ return []
+ if type(val) == list:
+ return val
+ return [val]
+
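+# The following helpers snapshot and restore the global build state (cflags,
+# ldflags, config.h/config.mak contents, detected programs). A check pushes
+# before running its test function; on failure it pops with "discard" (rolling
+# back whatever the test added), on success it pops with "merge" (keeping the
+# additions).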
+def push_build_flags():
+ _G.state_stack.append(
+ (_G.cflags[:], _G.ldflags[:], _G.config_h, _G.config_mak,
+ _G.programs.copy()))
+
+def pop_build_flags_discard():
+ top = _G.state_stack[-1]
+ _G.state_stack = _G.state_stack[:-1]
+
+ (_G.cflags[:], _G.ldflags[:], _G.config_h, _G.config_mak,
+ _G.programs) = top
+
+def pop_build_flags_merge():
+ top = _G.state_stack[-1]
+ _G.state_stack = _G.state_stack[:-1]
+
+# Return build dir.
+def get_build_dir():
+ assert _G.build_dir is not None # too early?
+ return _G.build_dir
+
+# Root directory, i.e. top level source directory, or where configure/Makefile
+# are located.
+def get_root_dir():
+ assert _G.root_dir is not None # too early?
+ return _G.root_dir
+
+# Set which type of executable format the target uses.
+# Used for conventions which refuse to abstract properly.
+def set_exe_format(fmt):
+ assert fmt in ["elf", "pe", "macho"]
+ _G.exe_format = fmt
+
+# A check is a check, dependency, or anything else that adds source files,
+# preprocessor symbols, libraries, include paths, or simply serves as
+# dependency check for other checks.
+# Always call this function with named arguments.
+# Arguments:
+# name: String or None. Symbolic name of the check. The name can be used as
+# dependency identifier by other checks. This is the first argument, and
+# usually passed directly, instead of as named argument.
+# If this starts with a "-" flag, options with names derived from this
+# are generated:
+# --enable-$option
+# --disable-$option
+# --with-$option=<yes|no|auto|default>
+# Where "$option" is the name without flag characters, and occurrences
+# of "_" are replaced with "-".
+# If this ends with a "*" flag, the result of this check is emitted as
+# preprocessor symbol to config.h. It will have the name "HAVE_$DEF",
+# and will be either set to 0 (check failed) or 1 (check succeeded),
+# and $DEF is the name without flag characters and all uppercase.
+# desc: String or None. If specified, "Checking for <desc>..." is printed
+# while running configure. If not specified, desc is auto-generated from
+# the name.
+# default: Boolean or None. If True or None, the check is soft-enabled (that
+# means it can still be disabled by options, dependency checks, or
+# the check function). If False, the check is disabled by default,
+# but can be enabled by an option.
+# deps, deps_any, deps_neg: String, array of strings, or None. If a check is
+# enabled by default/command line options, these checks are performed in
+# the following order: deps_neg, deps_any, deps
+# deps requires all dependencies in the list to be enabled.
+# deps_any requires 1 or more dependencies to be enabled.
+# deps_neg requires that all dependencies are disabled.
+# fn: Function or None. The function is run after dependency checks. If it
+#     returns True, the check is enabled; if it returns False, it is disabled.
+#     Typically, your function checks for the existence of libraries, for
+#     example, and adds the required flags to the final CFLAGS/LDFLAGS.
+# None behaves like "lambda: True".
+# Note that this needs to be a function. If not, it'd be run before the
+# check() function is even called. That would mean the function runs even
+# if the check was disabled, and could add unneeded things to CFLAGS.
+# If this function returns False, all added build flags are removed again,
+# which makes it easy to compose checks.
+# You're not supposed to call check() itself from fn.
+# sources: String, Array of Strings, or None.
+# If the check is enabled, add these sources to the build.
+# Duplicate sources are removed at end of configuration.
+# required: String or None. If this is a string, the check is required, and
+# if it's not enabled, the string is printed as error message.
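+# A rough usage sketch (the feature name, dependency, and source file below are
+# made up):
+#   check("-examplelib*",
+#         desc = "examplelib support",
+#         deps = "threads",
+#         fn = lambda: check_pkg_config("examplelib >= 1.0"),
+#         sources = "misc/examplelib_glue.c")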
+def check(name = None, option = None, desc = None, deps = None, deps_any = None,
+ deps_neg = None, sources = None, fn = None, required = None,
+ default = None):
+
+ deps = normalize_list_arg(deps)
+ deps_any = normalize_list_arg(deps_any)
+ deps_neg = normalize_list_arg(deps_neg)
+ sources = normalize_list_arg(sources)
+
+ typecheck(name, str, NoneType)
+ typecheck(option, str, NoneType)
+ typecheck(desc, str, NoneType)
+ typecheck(deps, NoneType, list)
+ typecheck(deps_any, NoneType, list)
+ typecheck(deps_neg, NoneType, list)
+ typecheck(sources, NoneType, list)
+ typecheck(fn, NoneType, function)
+ typecheck(required, str, NoneType)
+ typecheck(default, bool, NoneType)
+
+ option_name = None
+ define_name = None
+ if name is not None:
+ opt_flag = name.startswith("-")
+ if opt_flag:
+ name = name[1:]
+ def_flag = name.endswith("*")
+ if def_flag:
+ name = name[:-1]
+ if opt_flag:
+ option_name = name.replace("_", "-")
+ if def_flag:
+ define_name = "HAVE_" + name.replace("-", "_").upper()
+
+ if desc is None and name is not None:
+ desc = name
+
+ if _G.help_mode:
+ if not option_name:
+ return
+
+ defaction = "enable"
+ if required is not None:
+ # If they are required, but also have option set, these are just
+ # "strongly required" options.
+ defaction = "enable"
+ elif default == False:
+ defaction = "disable"
+ elif deps or deps_any or deps_neg or fn:
+ defaction = "autodetect"
+ act = "enable" if defaction == "disable" else "disable"
+ opt = "--%s-%s" % (act, option_name)
+ print(" %-30s %s %s [%s]" % (opt, act, desc, defaction))
+ return
+
+    _G.log_file.write("\n--- Test: %s\n" % (name if name else "(unnamed)"))
+
+ if desc:
+ sys.stdout.write("Checking for %s... " % desc)
+ outcome = "yes"
+
+ force_opt = required is not None
+ use_dep = True if default is None else default
+
+ # Option handling.
+ if option_name:
+ # (The option gets removed, so we can determine whether all options were
+ # applied in the end.)
+ val = _G.feature_opts.pop(option_name, "default")
+ if val == "yes":
+ use_dep = True
+ force_opt = True
+ elif val == "no":
+ use_dep = False
+ force_opt = False
+ elif val == "auto":
+ use_dep = True
+ elif val == "default":
+ pass
+ else:
+ assert False
+
+ if not use_dep:
+ outcome = "disabled"
+
+ # Dependency resolution.
+ # But first, check whether all dependency identifiers really exist.
+ for d in deps_neg + deps_any + deps:
+ dep_enabled(d) # discard result
+ if use_dep:
+ for d in deps_neg:
+ if dep_enabled(d):
+ use_dep = False
+ outcome = "conflicts with %s" % d
+ break
+ if use_dep:
+ any_found = False
+ for d in deps_any:
+ if dep_enabled(d):
+ any_found = True
+ break
+ if len(deps_any) > 0 and not any_found:
+ use_dep = False
+ outcome = "not any of %s found" % (", ".join(deps_any))
+ if use_dep:
+ for d in deps:
+ if not dep_enabled(d):
+ use_dep = False
+ outcome = "%s not found" % d
+ break
+
+ # Running actual checks.
+ if use_dep and fn:
+ push_build_flags()
+ if fn():
+ pop_build_flags_merge()
+ else:
+ pop_build_flags_discard()
+ use_dep = False
+ outcome = "no"
+
+ # Outcome reporting and terminating if dependency not found.
+ if name:
+ _G.dep_enabled[name] = use_dep
+ if define_name:
+ add_config_h_define(define_name, 1 if use_dep else 0)
+ if use_dep:
+ _G.sources += sources
+ if desc:
+ sys.stdout.write("%s\n" % outcome)
+ _G.log_file.write("--- Outcome: %s (%s=%d)\n" %
+                     (outcome, name if name else "(unnamed)", use_dep))
+
+ if required is not None and not use_dep:
+ print("Warning: %s" % required)
+
+ if force_opt and not use_dep:
+ die("This feature is required.")
+
+
+# Runs the process similar to execv() (except that args[0] is used both as the
+# command and as the first argument passed to the process).
+# Returns the process stdout output on success, or None on non-0 exit status.
+# In particular, this logs the command and its output/exit status to the log
+# file.
+def _run_process(args):
+ p = subprocess.Popen(args, stdout = subprocess.PIPE,
+ stderr = subprocess.PIPE,
+                         stdin = subprocess.PIPE)
+ (p_out, p_err) = p.communicate()
+ # We don't really want this. But Python 3 in particular makes it too much of
+ # a PITA (think power drill in anus) to consistently use byte strings, so
+ # we need to use "unicode" strings. Yes, a bad program could just blow up
+ # our butt here by outputting invalid UTF-8.
+ # Weakly support Python 2 too (gcc outputs UTF-8, which crashes Python 2).
+ if type(b"") != str:
+ p_out = p_out.decode("utf-8")
+ p_err = p_err.decode("utf-8")
+ status = p.wait()
+ _G.log_file.write("--- Command: %s\n" % " ".join(args))
+ if p_out:
+ _G.log_file.write("--- stdout:\n%s" % p_out)
+ if p_err:
+ _G.log_file.write("--- stderr:\n%s" % p_err)
+ _G.log_file.write("--- Exit status: %s\n" % status)
+ return p_out if status == 0 else None
+
+# Run the C compiler, possibly including linking. Return whether the compiler
+# exited with success status (0 exit code) as boolean. What exactly it does
+# depends on the arguments. Generally, it constructs a source file and tries
+# to compile it. A dummy main function is only emitted if "expr" is set or
+# linking is requested; with no arguments, it merely compiles (without linking)
+# an otherwise empty source file.
+# Note: these tests are cumulative.
+# Arguments:
+# include: String, array of strings, or None. For each string
+# "#include <$value>" is added to the top of the source file.
+# decl: String, array of strings, or None. Added to the top of the source
+# file, global scope, separated by newlines.
+# expr: String or None. Added to the body of the main function. Despite the
+# name, needs to be a full statement, needs to end with ";".
+# defined: String, array of strings, or None. For each string, adds code that
+#     fails to compile if the given preprocessor symbol is not defined.
+# flags: String, array of strings, or None. Each string is added to the
+# compiler command line.
+# Also, if the test succeeds, all arguments are added to the CFLAGS
+# (if language==c) written to config.mak.
+# link: String, array of strings, or None. Each string is added to the
+# compiler command line, and the compiler is made to link (not passing
+# "-c").
+# A value of [] triggers linking without further libraries.
+# A value of None disables the linking step.
+# Also, if the test succeeds, all link strings are added to the LDFLAGS
+# written to config.mak.
+# language: "c" for C, "m" for Objective-C.
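+# A rough usage sketch (hypothetical): check that pthreads compile and link:
+#   check_cc(include = "pthread.h",
+#            expr = "pthread_self();",
+#            flags = "-pthread",
+#            link = [])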
+def check_cc(include = None, decl = None, expr = None, defined = None,
+ flags = None, link = None, language = "c"):
+ assert language in ["c", "m"]
+
+ use_linking = link is not None
+
+ contents = ""
+ for inc in normalize_list_arg(include):
+ contents += "#include <%s>\n" % inc
+ for dec in normalize_list_arg(decl):
+ contents += "%s\n" % dec
+ for define in normalize_list_arg(defined):
+ contents += ("#ifndef %s\n" % define) + \
+ "#error failed\n" + \
+ "#endif\n"
+ if expr or use_linking:
+ contents += "int main(int argc, char **argv) {\n";
+ if expr:
+ contents += expr + "\n"
+ contents += "return 0; }\n"
+ source = os.path.join(_G.temp_path, "test." + language)
+ _G.log_file.write("--- Test file %s:\n%s" % (source, contents))
+ with open(source, "w") as f:
+ f.write(contents)
+
+ flags = normalize_list_arg(flags)
+ link = normalize_list_arg(link)
+
+ outfile = os.path.join(_G.temp_path, "test")
+ args = [get_program("CC"), source]
+ args += _G.cflags + flags
+ if use_linking:
+ args += _G.ldflags + link
+ args += ["-o%s" % outfile]
+ else:
+ args += ["-c", "-o%s.o" % outfile]
+ if _run_process(args) is None:
+ return False
+
+ _G.cflags += flags
+ _G.ldflags += link
+ return True
+
+# Run pkg-config with function arguments passed as command arguments. Typically,
+# you specify pkg-config version expressions, like "libass >= 0.14". Returns
+# success as boolean.
+# If this succeeds, the --cflags and --libs are added to CFLAGS and LDFLAGS.
+def check_pkg_config(*args):
+ args = list(args)
+ pkg_config_cmd = [get_program("PKG_CONFIG")]
+
+ cflags = _run_process(pkg_config_cmd + ["--cflags"] + args)
+ if cflags is None:
+ return False
+ ldflags = _run_process(pkg_config_cmd + ["--libs"] + args)
+ if ldflags is None:
+ return False
+
+ _G.cflags += cflags.split()
+ _G.ldflags += ldflags.split()
+ return True
+
+def get_pkg_config_variable(arg, varname):
+ typecheck(arg, str)
+ pkg_config_cmd = [get_program("PKG_CONFIG")]
+
+ res = _run_process(pkg_config_cmd + ["--variable=" + varname] + [arg])
+ if res is not None:
+ res = res.strip()
+ return res
+
+# Check for a specific build tool. You pass in a symbolic name (e.g. "CC"),
+# which is then resolved to a full name and added as variable to config.mak.
+# The function returns a bool for success. You're not supposed to use the
+# program from configure; instead you're supposed to have rules in the makefile
+# using the generated variables.
+# (Some configure checks use the program directly anyway with get_program().)
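+# A rough usage sketch (hypothetical):
+#   check(desc = "C compiler",
+#         fn = lambda: check_program("CC"),
+#         required = "No working C compiler found.")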
+def check_program(env_name):
+ for name, default in programs_info:
+ if name == env_name:
+ val = os.environ.get(env_name, None)
+ if val is None:
+ prefix = os.environ.get("TARGET", None)
+ if prefix is None:
+ prefix = os.environ.get("CROSS_COMPILE", "")
+ # Shitty hack: default to gcc if a prefix is given, as binutils
+ # toolchains generally provide only a -gcc wrapper.
+ if prefix and default == "cc":
+ default = "gcc"
+ val = prefix + default
+ # Interleave with output. Sort of unkosher, but dare to stop me.
+ sys.stdout.write("(%s) " % val)
+ _G.log_file.write("--- Trying '%s' for '%s'...\n" % (val, env_name))
+ try:
+ _run_process([val])
+ except O