CS 18: e2

Research Article

Hyperparameter optimisation for Capsule Networks

  • @ARTICLE{10.4108/eai.13-7-2018.158416,
        author={Gagana B and S Natarajan},
        title={Hyperparameter optimisation for Capsule Networks},
        journal={EAI Endorsed Transactions on Cloud Systems: Online First},
        volume={},
        number={},
        publisher={EAI},
        journal_a={CS},
        year={2019},
        month={4},
        keywords={hyperparameter optimisation, Stochastic numeric healthcare data, Capsule Networks, ReLU, performance benchmarks},
        doi={10.4108/eai.13-7-2018.158416}
    }
    
Gagana B1,*, S Natarajan1
  • 1: PES University, Bangalore
*Contact email: gaganab20496@gmail.co

Abstract

Convolutional Neural Networks and their contemporary variants have proven to be ruling benchmarks for most image processing tasks, but they resort to pooling techniques that discard spatial relationship information between the data points involved and thereby limit classification accuracy. Hence, Hinton et al. proposed a layered architecture called Capsule Networks (CapsNets), which outperforms traditional systems by replacing pooling with a dynamic routing mechanism. CapsNets are thus en route to establishing themselves as prospective future benchmarks in visual imagery tasks, having surpassed existing state-of-the-art results on the MNIST dataset. This paper inspects two novel aspects: enhancing this performance on CIFAR-10 through regularization and hyperparameter optimization, and extending applicability to stochastic numeric healthcare data, helping uncover new challenges for predictive neural networks.
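The dynamic routing mentioned in the abstract follows the routing-by-agreement scheme of Sabour, Frosst and Hinton (2017). A minimal NumPy sketch of that step is shown below; the shapes, function names, and the choice of three routing iterations are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # CapsNet "squash" nonlinearity: shrinks short vectors toward zero
    # and long vectors toward (but below) unit length, keeping direction.
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, n_iters=3):
    # u_hat: prediction vectors of shape (n_in, n_out, dim), i.e. what each
    # lower-level capsule predicts for each higher-level capsule.
    # Routing-by-agreement iteratively raises the coupling to parents
    # whose output agrees with the child's prediction.
    n_in, n_out, dim = u_hat.shape
    b = np.zeros((n_in, n_out))                               # routing logits
    v = squash(u_hat.sum(axis=0))                             # initial parent outputs
    for _ in range(n_iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # softmax over parents
        s = (c[..., None] * u_hat).sum(axis=0)                # weighted sum -> (n_out, dim)
        v = squash(s)                                         # parent output vectors
        b += np.einsum('iod,od->io', u_hat, v)                # agreement update
    return v
```

Because of the squash nonlinearity, every output capsule's length stays strictly below 1, which is what lets the length be read as an existence probability in place of a pooled scalar activation.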