We propose a semismooth Newton coordinate descent (SNCD) algorithm for elastic-net penalized robust regression with the Huber loss and for quantile regression. The SNCD is a novel combination of the semismooth Newton and coordinate descent algorithms. It is designed for loss functions with only first-order derivatives and scales to high-dimensional models. Unlike the standard coordinate descent method, the SNCD updates each regression coefficient together with the corresponding element of the subdifferential, using the concept of Newton derivatives. In addition, an adaptive version of the "strong rule" for screening predictors is incorporated to gain extra efficiency. As an important application of the proposed algorithm, we show that the SNCD can be used to compute the solution paths for penalized quantile regression. We establish the convergence properties of the algorithm. Through numerical experiments, we demonstrate that the proposed algorithm works well for high-dimensional data with heavy-tailed errors, and that for quantile regression the SNCD is considerably faster than the existing method and achieves better optimization accuracy. A breast cancer gene expression data set is used to illustrate the proposed algorithm.
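For concreteness, the objectives referred to above can be sketched as follows. This is a hedged illustration using common notation (a glmnet-style elastic-net penalty with mixing parameter $\alpha$, a Huber transition parameter $\gamma$, and quantile level $\tau$); the paper's exact parameterization may differ.

```latex
% Elastic-net penalized Huber regression (Huber parameter \gamma > 0):
\min_{\beta_0,\,\beta}\; \frac{1}{n}\sum_{i=1}^{n}
  h_\gamma\!\left(y_i - \beta_0 - x_i^{\top}\beta\right)
  + \lambda\left(\alpha\|\beta\|_1 + \frac{1-\alpha}{2}\|\beta\|_2^2\right),
\qquad
h_\gamma(t) =
\begin{cases}
  \dfrac{t^2}{2\gamma}, & |t| \le \gamma,\\[4pt]
  |t| - \dfrac{\gamma}{2}, & |t| > \gamma.
\end{cases}

% Penalized quantile regression replaces h_\gamma with the check loss
%   \rho_\tau(t) = t\,\bigl(\tau - \mathbf{1}\{t < 0\}\bigr), \quad \tau \in (0,1).
```

Both losses are convex but lack second derivatives everywhere ($h_\gamma$ is only once differentiable at $|t| = \gamma$, and $\rho_\tau$ is nondifferentiable at $0$), which is why an algorithm requiring only first-order information, such as the SNCD, is needed.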