Hyperparameter Setting of LSTM-based Language Model using Grey Wolf Optimizer

BILAL ZAHRAN AUFA

Basic Information

20.04.4298
006.31
Scientific Work - Undergraduate Thesis (S1) - Reference

Hyperparameters are an essential part of a deep learning model and have a large impact on its performance. Recent work shows that when the hyperparameters of a Long Short-Term Memory (LSTM) network are carefully tuned, it can match the performance of more complex LSTM models. This opens an opportunity for Swarm Intelligence (SI) algorithms with promising performance on optimization problems, such as the Grey Wolf Optimizer (GWO), to improve LSTM performance by searching for the best combination of hyperparameters. In this paper, GWO is exploited to optimize the LSTM hyperparameters for a language modeling task. Evaluation on the Penn Treebank dataset shows that GWO is capable of finding optimal hyperparameters for the LSTM.
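To illustrate the kind of search the abstract describes, the following is a minimal sketch of the standard Grey Wolf Optimizer (not the thesis implementation): a population of candidate solutions is ranked each generation, and every wolf moves toward the three best solutions (alpha, beta, delta) under the usual encircling equations. The objective, bounds, and parameter values below are illustrative assumptions; in the paper's setting, each position would be decoded into LSTM hyperparameters and the objective would be validation perplexity on Penn Treebank.

```python
import random

def gwo_minimize(objective, bounds, n_wolves=10, n_iters=50, seed=0):
    """Minimal continuous GWO sketch (illustrative, not the thesis code)."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize wolf positions uniformly within the search bounds.
    wolves = [[rng.uniform(lo, hi) for lo, hi in bounds]
              for _ in range(n_wolves)]

    for t in range(n_iters):
        # Rank wolves: alpha, beta, delta are the three best solutions.
        wolves.sort(key=objective)
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2.0 - 2.0 * t / n_iters  # 'a' decreases linearly from 2 to 0

        for i in range(n_wolves):
            new_pos = []
            for d in range(dim):
                x_d = 0.0
                for leader in (alpha, beta, delta):
                    # Standard GWO encircling equations:
                    # A = 2*a*r1 - a, C = 2*r2, D = |C*X_leader - X|
                    A = 2.0 * a * rng.random() - a
                    C = 2.0 * rng.random()
                    D = abs(C * leader[d] - wolves[i][d])
                    x_d += leader[d] - A * D
                lo, hi = bounds[d]
                # New position is the average of the three pulls, clipped.
                new_pos.append(min(max(x_d / 3.0, lo), hi))
            wolves[i] = new_pos

    return min(wolves, key=objective)

# Toy usage: minimize the 2-D sphere function. An LSTM tuner would
# instead decode each position into hyperparameters (e.g. hidden size,
# dropout, learning rate) and train/evaluate the model as the objective.
best = gwo_minimize(lambda x: sum(v * v for v in x), [(-5, 5), (-5, 5)])
```

For hyperparameter tuning, integer-valued dimensions (such as the number of layers) would typically be rounded when decoding a position, and each objective evaluation involves a full model training run, so small populations and few iterations are common.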

Subject

Machine Learning
 

Catalog

Hyperparameter Setting of LSTM-based Language Model using Grey Wolf Optimizer
 
-
Indonesia

Circulation

Rp. 0
Rp. 0
No

Author

BILAL ZAHRAN AUFA
Individual
SUYANTO, ANDITYA ARIFIANTO
 

Publisher

Universitas Telkom, S1 Informatika
Bandung
2020

Collection

Competency

 

