Simulation has long been integrated into anaesthesiology training, yet a comprehensive review of its effectiveness is presently lacking. Using meta-analysis and critical narrative analysis, we synthesized the evidence for the effectiveness of simulation-based anaesthesiology training. We searched MEDLINE, ERIC, and SCOPUS through May 2011 and included studies that used simulation to train health professional learners. Data were abstracted independently and in duplicate. We included 77 studies (6066 participants). Compared with no intervention (52 studies), simulation was associated with moderate to large pooled effect sizes (ESs) for all outcomes (ES range 0.60-1.05) except patient effects (ES -0.39). Compared with non-simulation instruction (11 studies), simulation was associated with moderate effects for satisfaction and skills (ES 0.39 and 0.42, respectively), a large effect for behaviours (ES 1.77), and small effects for time, knowledge, and patient effects (ES range -0.18 to 0.23). In 17 studies comparing alternative simulation interventions, training in non-technical skills (e.g. communication) plus medical management, compared with training in medical management alone, was associated with negligible effects for knowledge and skills (four studies, ES range 0.14-0.15). Debriefing using multiple vs single information sources was associated with negligible effects for time and skills (three studies, ES range -0.07 to 0.09). Our critical analysis showed inconsistency in the measurement of non-technical skills and consistency in the (ineffective) design of debriefing. Simulation in anaesthesiology appears more effective than no intervention (except for patient outcomes) and non-inferior to non-simulation instruction. Few studies have clarified the key instructional designs for simulation-based anaesthesiology training.