OPTION: OPTImization Algorithm Benchmarking ONtology
Many platforms for benchmarking optimization algorithms offer users the possibility of sharing their experimental data with the purpose of promoting reproducible and reusable research. However, different platforms use different data models and formats, which drastically inhibits the identification of relevant data sets, their interpretation, and their interoperability. Consequently, a semantically rich, ontology-based, machine-readable data model is highly desirable. We report in this paper on the development of such an ontology, which we name OPTION (OPTImization algorithm benchmarking ONtology). Our ontology provides the vocabulary needed for semantic annotation of the core entities involved in the benchmarking process, such as algorithms, problems, and evaluation measures. It also provides means for automated data integration, improved interoperability, and powerful querying and reasoning capabilities, thereby enriching the value of the benchmark data. We demonstrate the utility of OPTION by annotating and querying a corpus of benchmark performance data from the BBOB workshops, a use case that can be easily extended to cover other benchmarking data collections.
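To give a flavor of the annotation-and-querying workflow described above, the sketch below builds a tiny RDF graph for a single benchmark run and retrieves it with a SPARQL query using rdflib. The namespace IRI and all term names (e.g., BenchmarkRun, usesAlgorithm, evaluatedOn) are hypothetical placeholders chosen for illustration only; the actual vocabulary is defined by the OPTION ontology itself.

```python
from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import XSD

# Hypothetical namespace and term names for illustration only;
# the real OPTION IRIs and vocabulary are defined in the ontology.
OPT = Namespace("http://example.org/option#")

g = Graph()
g.bind("opt", OPT)

# Annotate one benchmark run: an algorithm evaluated on a BBOB-style problem.
run = OPT["run_001"]
g.add((run, RDF.type, OPT.BenchmarkRun))
g.add((run, OPT.usesAlgorithm, OPT["CMA-ES"]))
g.add((run, OPT.evaluatedOn, OPT["bbob_f1_sphere"]))
g.add((run, OPT.hasDimension, Literal(10, datatype=XSD.integer)))
g.add((run, OPT.bestFitness, Literal(1.2e-8, datatype=XSD.double)))

# Query: which algorithms were run on the sphere function, and with what result?
q = """
PREFIX opt: <http://example.org/option#>
SELECT ?algorithm ?fitness WHERE {
    ?run a opt:BenchmarkRun ;
         opt:usesAlgorithm ?algorithm ;
         opt:evaluatedOn opt:bbob_f1_sphere ;
         opt:bestFitness ?fitness .
}
"""
for algorithm, fitness in g.query(q):
    print(algorithm, fitness)
```

Once benchmark data from different platforms is expressed against a shared vocabulary like this, the same query runs unchanged across all of them, which is the interoperability benefit the abstract refers to.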