Much of Metron’s work applies a Bayesian inference framework to problems and systems involving uncertainty. Statistical inference allows us to move beyond simply describing a set of sample data to finding patterns that help us estimate the properties of the sampled population. Bayesian inference improves on this process by incorporating prior information into estimates and future decisions. In Bayesian inference, uncertainty is modeled by probability distributions: a prior distribution captures what is known about the state of the system before an observation is made. Observations about the system are converted into likelihood functions, which are combined with the prior to compute a posterior probability distribution on the state of the system. This posterior distribution reflects how the observation has modified, and ideally reduced, the uncertainty in our knowledge of the state of the system.
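In standard notation (generic, not specific to any particular Metron application), this update is Bayes’ theorem:

$$
P(\text{state} \mid \text{observation}) \;=\; \frac{P(\text{observation} \mid \text{state})\, P(\text{state})}{P(\text{observation})},
$$

where $P(\text{state})$ is the prior, $P(\text{observation} \mid \text{state})$ is the likelihood, and $P(\text{state} \mid \text{observation})$ is the posterior.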

This process rests on a simple result called Bayes’ theorem, first proposed by Thomas Bayes (1701–1761). A version of his rule was published posthumously in 1763 and was developed more fully by Pierre-Simon Laplace in 1812. In spite of its simplicity, Bayesian inference has proven highly effective when applied to a wide array of complex problems, ranging from detection and tracking with low signal-to-noise ratios and high false-alarm rates to optimal decision-making in the face of ambiguity.
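As a minimal illustrative sketch of such an update, not a description of any specific Metron system, the Python snippet below assumes a target occupying one of five cells and a hypothetical staring sensor with an assumed detection probability P_D and false-alarm probability P_FA; the update function converts a single detection report into a likelihood and applies Bayes’ rule to turn a uniform prior into a posterior.

```python
import numpy as np

# Hypothetical single-scan Bayesian detection update (illustrative values only).
N_CELLS = 5
P_D = 0.8    # assumed probability of detection when the target is in the stared-at cell
P_FA = 0.1   # assumed false-alarm probability when it is not

def update(prior, cell, detected):
    """Bayes' rule: posterior is proportional to likelihood times prior."""
    # Likelihood of the sensor report under each hypothesis "target is in cell j".
    likelihood = np.where(np.arange(N_CELLS) == cell,
                          P_D if detected else 1.0 - P_D,
                          P_FA if detected else 1.0 - P_FA)
    unnormalized = likelihood * prior
    return unnormalized / unnormalized.sum()

prior = np.full(N_CELLS, 1.0 / N_CELLS)          # uniform prior over the five cells
posterior = update(prior, cell=2, detected=True)  # one detection while staring at cell 2
print(posterior.round(3))  # probability mass shifts toward cell 2, reducing uncertainty
```

Each new observation can be processed the same way, with the current posterior serving as the prior for the next update.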