Objective: To compare an essay-style undergraduate medical assessment with modified essay, multiple-choice question (MCQ) and objective structured clinical examination (OSCE) undergraduate medical assessments in predicting students' clinical performance (predictive validity), and to determine the relative contributions of the written (modified essay and MCQ) assessments and the OSCE to predictive validity.

Design: Before-and-after cohort study.

Setting: One medical school running a 6-year undergraduate course.

Participants: 137 Year 5 medical students followed into their trainee intern year.

Main outcome measures: Aggregated global ratings by senior doctors, junior doctors and nurses, as well as comprehensive structured assessments of performance in the trainee intern year.

Results: Students' scores in the new examinations predicted performance significantly better than scores in the old examinations, with correlation coefficients increasing from 0.05-0.44 to 0.41-0.81. The OSCE was a stronger predictor of subsequent performance than the written assessments, but combining assessments had the strongest predictive validity.

Conclusions: Using more comprehensive, more reliable and more authentic undergraduate assessment methods substantially increases predictive validity.