In this paper, we consider estimators for an additive functional of \(\phi\), defined as \(\theta(P;\phi)=\sum_{i=1}^k\phi(p_i)\), from \(n\) i.i.d. random samples drawn from a discrete distribution \(P=(p_1,\ldots,p_k)\) with alphabet size \(k\). We propose a minimax optimal estimator for this estimation problem. We reveal that the minimax optimal rate is essentially characterized by the divergence speed of the fourth derivative of \(\phi\) as \(p \to 0\). As a result, we show that no consistent estimator exists if the fourth derivative of \(\phi\) diverges faster than \(p^{-4}\). Furthermore, if the divergence speed of the fourth derivative of \(\phi\) is \(p^{\alpha-4}\) for \(\alpha \in (0,1)\), the minimax optimal rate is obtained, within a universal multiplicative constant, as \(\frac{k^2}{(n\ln n)^{2\alpha}} + \frac{k^{2-2\alpha}}{n}\).
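As a concrete illustration of the divergence-speed condition (an example we supply here, not a claim beyond the statement above), take the power sum functional with \(\phi(p)=p^\alpha\) for \(\alpha \in (0,1)\). Direct differentiation gives
\[
\phi^{(4)}(p) \;=\; \alpha(\alpha-1)(\alpha-2)(\alpha-3)\,p^{\alpha-4},
\qquad \bigl|\phi^{(4)}(p)\bigr| \;\asymp\; p^{\alpha-4} \ \text{ as } p \to 0^{+},
\]
so the divergence speed of the fourth derivative is \(p^{\alpha-4}\), and the rate \(\frac{k^2}{(n\ln n)^{2\alpha}} + \frac{k^{2-2\alpha}}{n}\) stated above would apply to estimating \(\theta(P;\phi)=\sum_{i=1}^k p_i^\alpha\).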