When performing classic uncertainty reduction based on dynamic data, a large number of reservoir simulations need to be evaluated at high computational cost. As an alternative, we construct Bayesian emulators that mimic the dominant behaviour of the reservoir simulator, and which are several orders of magnitude faster to evaluate. We combine these emulators within an iterative procedure that involves substantial but appropriate dimensional reduction of the output space, enabling a more effective and efficient uncertainty reduction on the input space than traditional methods, and with a more comprehensive understanding of the associated uncertainties. This study uses a Bayesian statistical approach for uncertainty reduction of complex models which is designed to address problems with a high number of both input and output parameters. We detail how to efficiently choose sets of outputs that are suitable for emulation and highly informative for reducing the input parameter space, and we investigate different classes of outputs and objective functions. We use output emulators and implausibility analysis iteratively to perform input space reduction, and we discuss the strengths and weaknesses of certain popular classes of objective function in this context. We demonstrate our approach via an application to a benchmark synthetic model (built using public data from a Brazilian offshore field) in an early stage of development, using four years of historical data and four producers. This study investigates traditional simulation outputs (e.g. production data) and also novel classes of outputs, such as misfit indices and summaries of outputs.
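The implausibility analysis mentioned above is the standard rejection criterion in emulator-based history matching: an input configuration is ruled out when the emulator-predicted output is too far from the observed data relative to all sources of uncertainty. The following is a minimal sketch of that measure; the function name, the example numbers, and the 3-sigma cutoff are illustrative assumptions, not values taken from this study.

```python
import numpy as np

def implausibility(z_obs, em_mean, em_var, obs_var, disc_var):
    """Implausibility I(x) = |z - E[f(x)]| / sqrt(total variance).

    z_obs    : observed output value
    em_mean  : emulator expectation of the simulator output at input x
    em_var   : emulator (code) uncertainty variance at x
    obs_var  : observation-error variance
    disc_var : model-discrepancy variance
    """
    return abs(z_obs - em_mean) / np.sqrt(em_var + obs_var + disc_var)

# An input x is retained as "not yet implausible" when I(x) falls below
# a cutoff, commonly 3 by appeal to the three-sigma rule (illustrative).
keep = implausibility(z_obs=100.0, em_mean=95.0,
                      em_var=4.0, obs_var=1.0, disc_var=4.0) < 3.0
```

Here I(x) = 5 / 3 ≈ 1.67, so this candidate input survives; inputs with larger mismatch or smaller total variance would be discarded.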
We show that despite there being a large number (2,136) of possible outputs, only a very small number (16) were sufficient to represent the available information; these informative outputs were emulated with fast and efficient emulators at each iteration (or wave) of the history match, allowing the uncertainty reduction procedure to be performed successfully. Using this small set of outputs, we were able to substantially reduce the input space by removing 99.8% of the original volume. We found that a small set of physically meaningful individual production outputs were the most informative at early waves, which, once emulated, resulted in the highest space reduction, while more complex but popular objective functions that combine several outputs were only modestly useful at later waves. This is because objective functions such as misfit indices have complex surfaces that can lead to low-quality emulators and hence to non-informative outputs. We present an iterative emulator-based Bayesian uncertainty reduction process in which all possible input parameter configurations that lead to statistically acceptable matches between the simulated and observed data are identified. This methodology has four central characteristics: (1) incorporation of a powerful dimension reduction on the output space, resulting in significantly increased efficiency, (2) effective reduction of the input space, (3) computational efficiency, and (4) provision of a better understanding of the complex geometry of the input and output spaces.
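The wave-by-wave structure of the procedure can be sketched end to end on a toy problem: at each wave, a cheap emulator is fitted to a handful of simulator runs over the surviving input region, and implausibility is used to cull the candidate space. Everything below is an illustrative stand-in under stated assumptions; the one-dimensional "simulator", the polynomial emulator (in place of a Gaussian-process or Bayes linear emulator), and all variance values are invented for the sketch and are not the study's model.

```python
import numpy as np

def simulator(x):
    # Toy stand-in for the expensive reservoir simulator (illustrative).
    return np.sin(3 * x) + 0.5 * x

# Synthetic "observed history" plus assumed error/discrepancy variances.
z_obs = simulator(0.4)
obs_var, disc_var = 0.01, 0.01

# Dense grid standing in for the full input parameter space.
space = np.linspace(0.0, 1.0, 10001)
not_ruled_out = np.ones_like(space, dtype=bool)

for wave in range(3):
    # Run the simulator at a few design points in the surviving region.
    design = np.quantile(space[not_ruled_out], np.linspace(0, 1, 8))
    runs = simulator(design)

    # Cheap emulator: cubic polynomial fit to the runs, with a crude
    # constant code-uncertainty variance (both assumptions of the sketch).
    coef = np.polyfit(design, runs, 3)
    em_mean = np.polyval(coef, space)
    em_var = 0.005

    impl = np.abs(z_obs - em_mean) / np.sqrt(em_var + obs_var + disc_var)
    not_ruled_out &= impl < 3.0  # cull implausible inputs; shrinks monotonically
    print(f"wave {wave}: {not_ruled_out.mean():.1%} of input space retained")
```

Because the mask is only ever intersected with the new acceptance region, the retained volume is non-increasing across waves, mirroring the iterative input-space reduction described above.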