Research Article
Review of Optimization in Improving Extreme Learning Machine
@ARTICLE{10.4108/eai.17-9-2021.170960, author={Nilesh Rathod and Sunil Wankhade}, title={Review of Optimization in Improving Extreme Learning Machine}, journal={EAI Endorsed Transactions on Industrial Networks and Intelligent Systems}, volume={8}, number={28}, publisher={EAI}, journal_a={INIS}, year={2021}, month={9}, keywords={Extreme learning machine (ELM), Single-feedforward neural networks, Kernel functions, Sensitivity, Input weights and Activation bias}, doi={10.4108/eai.17-9-2021.170960} }
Abstract
Nowadays, the Extreme Learning Machine (ELM) has gained considerable interest because of its noteworthy qualities compared with conventional single hidden-layer feedforward neural networks and kernel methods. Although ELM has many advantages, it also has potential shortcomings, such as performance sensitivity to the underlying state of the hidden neurons, the input weights, and the choice of activation function. To overcome the limitations of the traditional ELM, researchers have devised optimization methods that tune specific parts of ELM in order to enhance its performance on a variety of complex problems and applications. Through this study, we therefore review the different algorithms developed for optimizing ELM, comparing them against survey criteria such as dataset, algorithm, objective, training time, accuracy, error rate, and number of hidden neurons. This study will help other researchers identify the research issues that lower the performance of ELM.
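The ELM training procedure referred to in the abstract can be summarized in a few lines: the input weights and hidden biases are assigned at random, and only the output weights are obtained analytically by a least-squares solve. The following minimal sketch (plain NumPy, with illustrative function and variable names that are not taken from the surveyed works) is intended only to show where the sensitivity to the input weights, the number of hidden neurons, and the activation function arises.

```python
# Minimal single-hidden-layer ELM sketch (illustrative only; not code from the review).
# W and b are random -- the source of the sensitivity described in the abstract;
# only the output weights beta are learned, via a Moore-Penrose pseudoinverse.
import numpy as np

def elm_fit(X, T, n_hidden=100, activation=np.tanh, seed=0):
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.standard_normal((n_features, n_hidden))   # random input weights
    b = rng.standard_normal(n_hidden)                 # random hidden biases
    H = activation(X @ W + b)                         # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                      # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta, activation=np.tanh):
    return activation(X @ W + b) @ beta
```

Because beta already has a closed-form solution, optimization approaches of the kind reviewed here generally act on the randomly assigned quantities (W, b), the number of hidden neurons, or the activation function, rather than on the output-weight solve itself.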
Copyright © 2021 Nilesh Rathod et al., licensed to EAI. This is an open access article distributed under the terms of the Creative Commons Attribution license, which permits unlimited use, distribution and reproduction in any medium so long as the original work is properly cited.