
Title

Improving the Performance of a GRU Model Based on Batch Normalization for Sentence Classification

Author

Muhammad Zulqarnain, Rozaida Ghazali, Shihab Hamad Khaleefah, Ayesha Rehan, Muhammad Shehzad, Yana Mazwin Mohmad Hassim

Citation

Vol. 19  No. 9  pp. 176-186

Abstract

Sentiment classification is a popular task for identifying user opinions and has been extensively applied in Natural Language Processing (NLP). The Gated Recurrent Unit (GRU) has been successfully applied to NLP tasks with outstanding results. GRU networks perform well on sequential learning tasks and overcome the vanishing and exploding gradient problems of standard recurrent neural networks (RNNs). In this paper, we describe how to improve the efficiency of the GRU framework using batch normalization and by replacing the traditional tanh activation function with Leaky ReLU (LReLU). Empirically, we show that our model, with only slight hyperparameter tuning of the statistic vectors, obtains excellent results on benchmark datasets for sentiment classification. The proposed BN-GRU model performs well compared to various existing approaches in terms of accuracy and loss. The experimental results show that the proposed model achieves better performance than several state-of-the-art approaches on two benchmark datasets: 82.4% accuracy on the IMDB dataset, and 88.1% binary classification accuracy and 49.9% fine-grained accuracy on the SSTb dataset. These results indicate that the proposed model is capable of minimizing the loss function and extracting long-term dependencies with a compact architecture, achieving superior performance with significantly fewer parameters.
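The abstract's core idea, batch-normalizing the GRU's input pre-activations and swapping tanh for Leaky ReLU in the candidate state, can be sketched in plain Python. This is a minimal, illustrative scalar-weight version assuming one BN pass per gate over the batch dimension; the weight names (`wz`, `uz`, etc.) and the exact placement of normalization are assumptions, not the authors' actual parameterization.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def leaky_relu(x, alpha=0.01):
    # LReLU replaces tanh as the candidate-state activation
    return x if x > 0 else alpha * x

def batch_norm(batch, eps=1e-5):
    """Normalize a batch of scalars to zero mean, unit variance."""
    mean = sum(batch) / len(batch)
    var = sum((v - mean) ** 2 for v in batch) / len(batch)
    return [(v - mean) / math.sqrt(var + eps) for v in batch]

def bn_gru_step(x_batch, h_batch, w):
    """One BN-GRU step for a batch of scalar inputs and hidden states.

    w holds scalar weights for the update gate (wz, uz), the reset
    gate (wr, ur), and the candidate state (wh, uh) -- all names are
    illustrative, not taken from the paper.
    """
    # Batch-normalize the input-to-hidden pre-activations before gating
    xz = batch_norm([w['wz'] * x for x in x_batch])
    xr = batch_norm([w['wr'] * x for x in x_batch])
    xh = batch_norm([w['wh'] * x for x in x_batch])
    new_h = []
    for i, h in enumerate(h_batch):
        z = sigmoid(xz[i] + w['uz'] * h)          # update gate
        r = sigmoid(xr[i] + w['ur'] * h)          # reset gate
        h_tilde = leaky_relu(xh[i] + w['uh'] * (r * h))  # candidate state
        new_h.append((1.0 - z) * h + z * h_tilde)  # interpolated update
    return new_h
```

Normalizing the pre-activations keeps their distribution stable across time steps, which is the mechanism the abstract credits for the reduced loss and faster convergence; the LReLU's non-saturating positive branch likewise avoids the gradient attenuation of tanh.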

Keywords

RNN, GRU, Batch Normalization, Long-term dependencies, Sentence Classification.

URL

http://paper.ijcsns.org/07_book/201909/20190920.pdf