Natural language generation (NLG) plays an important role in a spoken dialogue system. Recurrent Neural Network (RNN)-based approaches have shown promise in tackling NLG tasks. This paper presents approaches to enhance the gating mechanism of an RNN-based natural language generator, in which an attentive dialogue act representation is introduced and two gating mechanisms are proposed to semantically gate the input sequences before the RNN computation. The proposed RNN-based generators can be learned from unaligned data by jointly training both sentence planning and surface realization to produce natural language responses. The models were extensively evaluated on four different NLG domains. The results show that the proposed generators achieved better performance than previous generators on all four NLG domains.
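To illustrate the idea of semantically gating the input before the recurrent update, the following is a minimal sketch, not the paper's actual formulation: a gate vector is computed from a dialogue-act representation and applied element-wise to the word embedding before a vanilla RNN step. All names (W_g, gated_rnn_step), dimensions, and the use of a plain tanh cell are hypothetical choices for illustration only.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)

    emb_dim, da_dim, hid_dim = 8, 6, 16                    # illustrative sizes
    W_g = rng.normal(scale=0.1, size=(emb_dim, da_dim))    # gate weights (hypothetical)
    W_h = rng.normal(scale=0.1, size=(hid_dim, emb_dim))   # input-to-hidden weights
    U_h = rng.normal(scale=0.1, size=(hid_dim, hid_dim))   # hidden-to-hidden weights

    def gated_rnn_step(x_t, d_t, h_prev):
        """One RNN step in which the word embedding x_t is element-wise
        gated by a signal derived from the dialogue-act vector d_t
        before the recurrent update."""
        g_t = sigmoid(W_g @ d_t)      # semantic gate from the DA representation
        x_gated = g_t * x_t           # suppress or keep embedding dimensions
        h_t = np.tanh(W_h @ x_gated + U_h @ h_prev)
        return h_t

    # Toy usage: one step with a random embedding, DA vector, and zero state.
    x_t = rng.normal(size=emb_dim)
    d_t = rng.normal(size=da_dim)
    h_t = gated_rnn_step(x_t, d_t, np.zeros(hid_dim))
    print(h_t.shape)  # (16,)

The design point the sketch captures is that the dialogue-act information filters the input sequence before it reaches the recurrent computation, rather than being injected only into the hidden state afterwards.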