This article presents matrix backpropagation algorithms for the QR decomposition of matrices \(A_{m,n}\) that are square (\(m = n\)), wide (\(m < n\)), or deep (\(m > n\)), with rank \(k = \min(m, n)\). Furthermore, we derive a novel matrix backpropagation result for the LQ decomposition of deep input matrices. Differentiable QR decomposition offers a numerically stable and computationally efficient method for solving least squares problems, which are frequently encountered in machine learning and computer vision. Software implementations across popular deep learning frameworks (PyTorch, TensorFlow, MXNet) make the methods available for general use within the deep learning community. Finally, this article aids practitioners in understanding the matrix backpropagation methodology as part of larger computational graphs and, we hope, opens new lines of research.
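To illustrate the use case described above, here is a minimal sketch of backpropagating through a QR-based least squares solve, assuming PyTorch with `torch.linalg.qr` and `torch.linalg.solve_triangular` available (this is an illustrative example, not the paper's reference implementation):

```python
import torch

# Deep (m > n) input matrix A and target vector b for a least squares problem.
torch.manual_seed(0)
A = torch.randn(5, 3, requires_grad=True)  # m = 5, n = 3, rank k = min(m, n) = 3
b = torch.randn(5, 1)

# Solve min_x ||Ax - b|| via the reduced QR decomposition A = QR:
# the normal equations reduce to the triangular system R x = Q^T b.
Q, R = torch.linalg.qr(A, mode="reduced")
x = torch.linalg.solve_triangular(R, Q.T @ b, upper=True)

# Gradients flow back through Q and R to the decomposed matrix A.
loss = (A @ x - b).pow(2).sum()
loss.backward()
print(A.grad.shape)  # same shape as A
```

Because the QR factors are differentiable, the solve can sit anywhere inside a larger computational graph, with gradients propagated to `A` (and any parameters producing it) by autograd.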