File(s) under permanent embargo
Hyper-parameter optimization in classification: to-do or not-to-do
journal contribution
posted on 2020-07-01, 00:00, authored by N Tran, Jean-Guy Schneider, I Weber, A K Qin

Hyper-parameter optimization is a process to find suitable hyper-parameters for predictive models. It typically incurs high computational costs because time-consuming model training is needed to determine the effectiveness of each set of candidate hyper-parameter values. A priori, there is no guarantee that hyper-parameter optimization leads to improved performance. In this work, we propose a framework to address the problem of whether one should apply hyper-parameter optimization or use the default hyper-parameter settings for traditional classification algorithms. We implemented a prototype of the framework, which we use as a basis for a three-fold evaluation with 486 datasets and 4 algorithms. The results indicate that our framework is effective at supporting modeling tasks in avoiding the adverse effects of ineffective optimizations. The results also demonstrate that incrementally adding training datasets improves the predictive performance of framework instantiations and hence enables “life-long learning.”
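The trade-off the abstract describes — tune, or keep the defaults — can be sketched with a toy example. Everything below is an illustrative assumption, not the paper's framework: a synthetic validation-error surface stands in for model training and scoring, `DEFAULT` is an assumed default hyper-parameter value, and a simple random search stands in for the optimizer.

```python
import random

# Hypothetical validation-error surface for one hyper-parameter
# (e.g. a regularization strength); a stand-in for the expensive
# train-and-evaluate loop that makes HPO costly in practice.
def validation_error(hyperparam):
    return (hyperparam - 0.3) ** 2 + 0.05

DEFAULT = 1.0  # assumed library default for the hyper-parameter

def random_search(n_trials=50, seed=0):
    """Random-search HPO: sample candidates, keep the best seen."""
    rng = random.Random(seed)
    best, best_err = DEFAULT, validation_error(DEFAULT)
    for _ in range(n_trials):
        candidate = rng.uniform(0.0, 2.0)
        err = validation_error(candidate)
        if err < best_err:
            best, best_err = candidate, err
    return best, best_err

default_err = validation_error(DEFAULT)
tuned, tuned_err = random_search()
# The "to-do or not-to-do" decision: adopt the tuned setting only
# if it actually beats the default on validation error.
use_tuned = tuned_err < default_err
```

On this smooth one-dimensional surface tuning pays off cheaply; the paper's point is that on real classifiers each `validation_error` call is a full training run, so predicting in advance whether `use_tuned` will end up true is itself valuable.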
History
Journal: Pattern Recognition
Volume: 103
Article number: 107245
Pagination: 1 - 12
Publisher: Elsevier
Location: Amsterdam, The Netherlands
Publisher DOI:
ISSN: 0031-3203
Language: eng
Publication classification: C1 Refereed article in a scholarly journal
Keywords: Hyper-parameter optimization; Framework; Bayesian optimization; Machine learning; Incremental learning; Science & Technology; Technology; Computer Science, Artificial Intelligence; Engineering, Electrical & Electronic; Computer Science; Engineering; ALGORITHM; Information Systems; Artificial Intelligence and Image Processing