A complex interplay of single-neuron properties and recurrent network structure shapes the activity of individual cortical neurons, which in general differs from the respective population activity. We develop a theory that makes it possible to investigate the influence of both network structure and single-neuron properties on the single-neuron statistics in block-structured sparse random networks of spiking neurons. In particular, the theory predicts the neuron-level autocorrelation times, also known as intrinsic timescales, of the neuronal activity. The theory is based on a postulated extension of dynamic mean-field theory from rate networks to spiking networks, which is validated via simulations. It accounts for both static variability, e.g., due to a distributed number of incoming synapses per neuron, and dynamical fluctuations of the input. To illustrate the theory, we apply it to a balanced random network of leaky integrate-and-fire neurons, a balanced random network of generalized linear model neurons, and a biologically constrained network of leaky integrate-and-fire neurons. For the generalized linear model network, an analytical solution to the colored-noise problem allows us to obtain self-consistent firing rate distributions, single-neuron power spectra, and intrinsic timescales. For the leaky integrate-and-fire networks, we obtain the same quantities by means of a novel analytical approximation of the colored-noise problem that is valid in the fluctuation-driven regime. Our results provide a further step towards an understanding of the dynamics in recurrent spiking cortical networks.
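To make the central predicted quantity concrete: an intrinsic timescale is commonly defined as the decay constant of a neuron's activity autocorrelation function. The following minimal Python sketch (not the paper's method; all parameter values are illustrative assumptions) estimates such a timescale from a surrogate activity trace with a known ground-truth autocorrelation time, generated as an AR(1) process.

```python
import numpy as np

# Illustrative sketch: estimate an intrinsic timescale as the exponential
# decay constant of the autocorrelation of a binned activity trace.
# An AR(1) surrogate with known timescale lets us check the estimate.

rng = np.random.default_rng(0)

dt = 1.0          # bin width in ms (assumed)
tau_true = 20.0   # ground-truth intrinsic timescale in ms (assumed)
alpha = np.exp(-dt / tau_true)
n = 200_000       # number of time bins

# AR(1) process: x[t] = alpha * x[t-1] + noise, with autocorrelation
# acf(lag) = exp(-lag * dt / tau_true)
x = np.empty(n)
x[0] = 0.0
noise = rng.standard_normal(n) * np.sqrt(1.0 - alpha**2)
for t in range(1, n):
    x[t] = alpha * x[t - 1] + noise[t]

# Normalized autocorrelation at lags 0 .. max_lag - 1
max_lag = 100
x = x - x.mean()
acf = np.array([np.dot(x[:n - k], x[k:]) / np.dot(x, x)
                for k in range(max_lag)])

# Fit a line to log(acf) over lags where the acf is clearly positive;
# the intrinsic timescale is minus the inverse slope.
lags = np.arange(max_lag) * dt
mask = acf > 0.05
slope, _ = np.polyfit(lags[mask], np.log(acf[mask]), 1)
tau_est = -1.0 / slope
print(f"true tau = {tau_true:.1f} ms, estimated tau = {tau_est:.1f} ms")
```

In the paper's setting, the trace would be a (binned) spike train of a single model neuron, and the theory predicts the shape of its autocorrelation, and hence the timescale, from the network structure and single-neuron properties rather than from a fit to simulated data.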