Recognizing emotional facial expressions is part of perceptual decision-making in the brain. Arriving at a decision becomes more difficult for the brain when the available sensory information is limited or ambiguous. We presented clear and noisy pictures of happy and angry facial expressions to 32 participants and asked them to categorize the pictures by emotion. Behavioral accuracy and reaction time differed significantly between decisions on clear and noisy images. Functional magnetic resonance imaging showed that the inferior occipital gyrus (IOG), fusiform gyrus (FG), amygdala (AMG), and ventrolateral prefrontal cortex (VPFC), along with other regions, were active during the perceptual decision-making process. Dynamic causal modeling analysis yielded three important results. First, using a Bayesian model selection (BMS) approach, we found that feed-forward network activity was enhanced more during the processing of clear and noisy happy faces than during the processing of clear angry faces. The AMG mediated this feed-forward connectivity in the processing of clear and noisy happy faces, whereas AMG mediation was absent for clear angry faces. However, this network activity was enhanced for noisy angry faces. Second, connectivity parameters obtained from Bayesian model averaging (BMA) suggested that forward connectivity dominated over backward connectivity during these processes. Third, based on the BMA parameters, we found that easier tasks modulated effective connectivity from the IOG to the FG, AMG, and VPFC more than difficult tasks did. These findings suggest that both parallel and hierarchical brain processes operate during perceptual decision-making about negative, positive, unambiguous, and ambiguous emotional expressions, but that the AMG-mediated feed-forward network plays a dominant role in such decisions.
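The model comparison logic underlying BMS and BMA can be illustrated with a minimal sketch. In fixed-effects BMS, each candidate network model is scored by an approximate log model evidence (the variational free energy), and posterior model probabilities follow from a softmax over those scores under a flat model prior; BMA then averages a connectivity parameter across models, weighted by those posteriors. The numbers below are purely illustrative assumptions, not values from the study.

```python
import numpy as np

# Hypothetical log model evidences (variational free energies) for three
# candidate DCM network architectures -- illustrative values only.
log_evidence = np.array([-1230.5, -1225.1, -1228.7])

# Fixed-effects BMS: posterior model probabilities are a softmax of the
# log evidences, assuming a flat prior over models. Subtracting the max
# before exponentiating keeps the computation numerically stable.
shifted = log_evidence - log_evidence.max()
posterior = np.exp(shifted) / np.exp(shifted).sum()

# Hypothetical estimates of one forward connection strength (e.g. IOG -> FG)
# under each model; BMA weights them by the posterior model probabilities.
forward_strength = np.array([0.12, 0.35, 0.20])
bma_estimate = float(np.dot(posterior, forward_strength))

for i, p in enumerate(posterior):
    print(f"model {i}: posterior probability = {p:.3f}")
print(f"BMA forward-connection estimate = {bma_estimate:.3f}")
```

In this sketch the model with the highest (least negative) free energy dominates the posterior, so the BMA estimate is pulled toward that model's parameter value.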