Sensitivity analysis quantifies the sensitivity (influence) of a model's parameters on its output. This work applies global sensitivity analysis to deep learning and optimization algorithms to analyze the influence of their hyperparameters. For deep learning, we analyzed hyperparameters such as the optimizer type, learning rate, and batch size for deep neural networks including ResNet18, AlexNet, and GoogLeNet. For the optimization algorithms, we analyzed the hyperparameters of two single-objective and two multi-objective state-of-the-art evolutionary algorithms for global optimization, framed as an algorithm configuration problem. We investigated how hyperparameters influence algorithm performance, both through their direct effects and through their interactions with other hyperparameters. Using three sensitivity analysis methods, Morris LHS, Morris, and Sobol, to systematically analyze the tunable hyperparameters, the framework reveals how hyperparameters respond to different sampling methods and performance metrics. That is, it answers questions such as which hyperparameters are influential, how and how much they interact, and how strong their direct influence is. Consequently, the ranking of hyperparameters suggests the order in which to tune them, and the pattern of influence reveals the stability of the algorithms.
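
As a concrete illustration, the sketch below ranks three deep-learning hyperparameters with the Morris and Sobol methods from the SALib Python library. The train_and_score function is a hypothetical stand-in for training a network (e.g., ResNet18) with the sampled hyperparameters and returning a validation error; the hyperparameter names and bounds are illustrative assumptions, not the exact search spaces used in the papers below.

import numpy as np
from SALib.sample import saltelli
from SALib.sample import morris as morris_sample
from SALib.analyze import sobol
from SALib.analyze import morris as morris_analyze

# Continuous encoding of three tunable hyperparameters (bounds are illustrative).
problem = {
    "num_vars": 3,
    "names": ["learning_rate", "batch_size", "momentum"],
    "bounds": [[1e-4, 1e-1], [16, 256], [0.0, 0.99]],
}

def train_and_score(x):
    # Hypothetical surrogate: in practice, train the network with these
    # hyperparameters (rounding batch_size to an integer) and return the
    # validation error.
    lr, batch_size, momentum = x
    return (np.log10(lr) + 2.5) ** 2 + 0.01 * round(batch_size) / (1.0 + momentum)

# Morris screening: mu_star ranks direct influence; sigma flags interactions
# and non-linearity.
X_m = morris_sample.sample(problem, N=100, num_levels=4)
Y_m = np.array([train_and_score(x) for x in X_m])
res_m = morris_analyze.analyze(problem, X_m, Y_m, num_levels=4)
print("Morris mu*:", dict(zip(problem["names"], res_m["mu_star"])))

# Sobol indices: S1 is the direct (first-order) effect; ST - S1 is the share
# of influence a hyperparameter exerts through interactions with the others.
X_s = saltelli.sample(problem, 1024)
Y_s = np.array([train_and_score(x) for x in X_s])
res_s = sobol.analyze(problem, Y_s)
print("Sobol S1:  ", dict(zip(problem["names"], res_s["S1"])))
print("Sobol ST:  ", dict(zip(problem["names"], res_s["ST"])))

Sorting hyperparameters by mu* or ST gives the tuning order discussed above; a large gap between ST and S1 indicates that a hyperparameter matters mainly through interactions and is best tuned jointly with the others.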
Resources:
- Assessing Ranking and Effectiveness of Evolutionary Algorithm Hyperparameters Using Global Sensitivity Analysis Methodologies, Swarm and Evolutionary Computation: https://arxiv.org/pdf/2207.04820.pdf
- Sensitivity Analysis for Deep Learning: Ranking Hyper-parameter Influence: https://ieeexplore.ieee.org/document/9643336