Machine learning algorithms often take inspiration from established results and knowledge from statistical physics. A prototypical example is the Boltzmann machine algorithm for supervised learning, which utilizes knowledge of classical thermal partition functions and the Boltzmann distribution. Recent advances in the study of non-equilibrium quantum integrable systems, which never thermalize, have led to the exploration of a wider class of statistical ensembles. These systems may be described by the so-called generalized Gibbs ensemble, which incorporates a number of "effective temperatures". We propose that these generalized Gibbs ensembles can be successfully applied as the basis of a Boltzmann-machine-like learning algorithm, which operates by learning the optimal values of the effective temperatures. We apply our algorithm to the classification of handwritten digits in the MNIST database. While lower error rates can be found with other state-of-the-art algorithms, we find that our algorithm reaches relatively low error rates while learning a much smaller number of parameters than would be needed in a traditional Boltzmann machine, thereby reducing computational cost.
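As a rough illustration of the underlying idea (a toy sketch, not the paper's actual model), a generalized-Gibbs-style distribution p(s) ∝ exp(−Σ_k β_k Q_k(s)) over binary configurations can be fit by learning the effective inverse temperatures β_k via maximum likelihood, where the gradient of the negative log-likelihood is the difference between data and model averages of the charges. The specific "conserved charges" below (magnetization and nearest-neighbour correlation) are hypothetical placeholders chosen only to make the example self-contained.

```python
import numpy as np
from itertools import product

# Toy "conserved charges" (hypothetical placeholders, not the paper's choice):
# total magnetization and nearest-neighbour correlation of a binary string.
def charges(s):
    return np.array([s.sum(), (s[:-1] * s[1:]).sum()], dtype=float)

n = 4                                       # small chain: all 2^n states enumerable
states = np.array(list(product([0.0, 1.0], repeat=n)))
Q = np.array([charges(s) for s in states])  # shape (2^n, num_charges)

# Synthetic data drawn from a known generalized-Gibbs-style distribution
# p(s) ∝ exp(-sum_k beta_k Q_k(s)).
rng = np.random.default_rng(0)
beta_true = np.array([-1.0, 0.5])
p_true = np.exp(-Q @ beta_true)
p_true /= p_true.sum()
samples = rng.choice(len(states), size=2000, p=p_true)
Q_data = Q[samples].mean(axis=0)            # empirical charge averages

# Maximum-likelihood fit of the effective inverse temperatures beta_k:
# grad of the negative log-likelihood w.r.t. beta is <Q>_data - <Q>_model.
beta = np.zeros_like(beta_true)
for _ in range(500):
    p_model = np.exp(-Q @ beta)
    p_model /= p_model.sum()
    Q_model = p_model @ Q                   # model averages of the charges
    beta -= 0.1 * (Q_data - Q_model)        # gradient-descent step
```

At convergence the model's charge averages match the empirical ones, so only one parameter per charge is learned, mirroring the small parameter count highlighted in the abstract.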