A dynamical system is said to undergo rate-induced tipping when it fails to track its quasi-equilibrium state because a system parameter changes faster than a critical rate. We study a prototypical model for rate-induced tipping: the saddle-node normal form subject to time-varying equilibrium drift and noise. We find that both of the most commonly used early-warning indicators, increased variance and increased autocorrelation, occur not when the equilibrium drift is fastest but with a delay. We explain this delay by demonstrating that the most likely tipping trajectory also crosses the tipping threshold with a delay, so that tipping itself is delayed. We find solutions of the variational problem determining the most likely tipping path using numerical continuation techniques. The result is a systematic study of the tipping delay in the plane of two parameters: the distance from the tipping threshold and the noise intensity.
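The setting above can be illustrated with a minimal simulation sketch. It assumes one common choice of saddle-node normal form with parameter drift, dx = (a - (x - lam(t))^2) dt + sigma dW, a linear ramp lam(t) = r*t, and an Euler-Maruyama discretization; the ramp shape, the tipping-detection threshold, and all parameter values are illustrative assumptions, not the paper's exact setup. In the deterministic limit (sigma = 0), slow ramps (r below a) are tracked while fast ramps tip, which is the rate-induced mechanism described in the abstract.

```python
import numpy as np

def simulate(r, a=1.0, sigma=0.0, dt=1e-3, T=20.0, seed=0):
    """Euler-Maruyama for dx = (a - (x - lam(t))^2) dt + sigma dW
    with an illustrative linear drift lam(t) = r*t.
    Returns True if the trajectory tips (falls past the unstable branch)."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    x = np.sqrt(a)  # start on the stable quasi-equilibrium lam(0) + sqrt(a)
    for i in range(n):
        lam = r * i * dt  # drifting parameter
        x += (a - (x - lam)**2) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        # the unstable branch sits at lam - sqrt(a); well past it counts as tipped
        if x < lam - np.sqrt(a) - 1.0:
            return True
    return False
```

With sigma = 0 this reduces to the deterministic rate-induced scenario: `simulate(0.5)` tracks the moving equilibrium, while `simulate(2.0)` tips because the drift rate exceeds the critical rate (here equal to a). Turning on sigma adds the noise-induced pathway whose most likely tipping path the abstract refers to.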