Abstractive text summarization is more challenging than extractive summarization because it paraphrases the entire content of the text rather than selecting existing sentences, which makes it more difficult. However, it produces a more natural summary with higher inter-sentence cohesion. Recurrent Neural Networks (RNNs) have been successful in abstractive summarization of English and Chinese texts. The Bidirectional Gated Recurrent Unit (BiGRU) RNN architecture is used so that the resulting summaries are influenced by the words surrounding each position. In this research, such a method is applied to Bahasa Indonesia to improve on text summarization systems that are commonly built with extractive methods and exhibit low inter-sentence cohesion. An evaluation on a dataset of Indonesian journal documents shows that the proposed model is capable of summarizing the overall content of the test documents into summaries with high similarity to the provided abstracts. The results indicate that the proposed model succeeds in understanding the source text when generating summaries.
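
To make the encoder side of such an architecture concrete, the following is a minimal sketch, not the authors' implementation, of a BiGRU encoder of the kind described above, written with PyTorch; the class name, vocabulary size, and dimensions are illustrative assumptions.

# A minimal sketch (illustrative, not the paper's implementation) of a BiGRU
# encoder for sequence-to-sequence abstractive summarization, using PyTorch.
# Hyperparameters and names here are assumptions for demonstration only.
import torch
import torch.nn as nn

class BiGRUEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # bidirectional=True lets each position's representation be influenced
        # by words on both sides, as described in the abstract.
        self.gru = nn.GRU(emb_dim, hidden_dim,
                          batch_first=True, bidirectional=True)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)   # (batch, seq_len, emb_dim)
        outputs, hidden = self.gru(embedded)   # outputs: (batch, seq_len, 2*hidden_dim)
        return outputs, hidden

# Usage: encode a batch of two token-id sequences of length 5.
encoder = BiGRUEncoder(vocab_size=10000)
dummy_input = torch.randint(0, 10000, (2, 5))
outputs, hidden = encoder(dummy_input)
print(outputs.shape)  # torch.Size([2, 5, 512])

In a full sequence-to-sequence summarizer, a decoder (typically a unidirectional GRU, often with attention) would consume these encoder outputs to generate the summary tokens one by one.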