Hyperparameter optimization in recommender systems: Bayesian optimization vs. Nelder-Mead

Bujari, Diedon
2020/2021

Abstract

In recent years, with advances in machine learning and data mining, recommender systems have gained increasing popularity. They are technologies that provide recommendations in many everyday domains, from food to movies and beyond. A recommender system predicts whether an item will interest a user by examining collected data on user preferences and previous ratings, together with information about the products. To produce high-quality recommendations, and thereby improve the user experience, it is essential to optimize the underlying recommender algorithms. Since most of these systems are highly sensitive to their hyperparameter settings, a common tool for this purpose in machine learning is hyperparameter optimization, also known as hyperparameter tuning. In this study, we present two optimization techniques: Bayesian optimization, which is designed to optimize expensive-to-evaluate objective functions, and Nelder-Mead, a downhill simplex method mainly used in other machine learning applications. We then provide a realistic evaluation and comparison of these two techniques across four collaborative filtering algorithms, three real-world datasets, and three evaluation metrics. Lastly, we discuss the results of our experiments and provide recommendations on which tuning method to use in different application scenarios.
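As an illustration of the second technique mentioned in the abstract, the Nelder-Mead downhill simplex method can be sketched in a few dozen lines of pure Python. The objective function below is a hypothetical stand-in for a model's validation error as a function of two hyperparameters; the thesis itself applies the method to real collaborative filtering models, not to this toy surface.

```python
def nelder_mead(f, x0, step=0.5, iters=200,
                alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Minimize f over a point x0 (list of floats) with the standard
    Nelder-Mead coefficients: reflection alpha, expansion gamma,
    contraction rho, shrink sigma."""
    n = len(x0)
    # Initial simplex: x0 plus one vertex perturbed along each dimension.
    simplex = [list(x0)]
    for i in range(n):
        v = list(x0)
        v[i] += step
        simplex.append(v)
    for _ in range(iters):
        simplex.sort(key=f)                      # best vertex first
        best, worst = simplex[0], simplex[-1]
        # Centroid of all vertices except the worst.
        cen = [sum(v[i] for v in simplex[:-1]) / n for i in range(n)]
        # Reflect the worst vertex through the centroid.
        refl = [cen[i] + alpha * (cen[i] - worst[i]) for i in range(n)]
        if f(refl) < f(best):
            # Reflection is the new best: try expanding further.
            exp = [cen[i] + gamma * (refl[i] - cen[i]) for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            # Contract toward the centroid.
            con = [cen[i] + rho * (worst[i] - cen[i]) for i in range(n)]
            if f(con) < f(worst):
                simplex[-1] = con
            else:
                # Shrink the whole simplex toward the best vertex.
                simplex = [best] + [
                    [best[i] + sigma * (v[i] - best[i]) for i in range(n)]
                    for v in simplex[1:]
                ]
    simplex.sort(key=f)
    return simplex[0]

# Toy stand-in for "validation error as a function of two hyperparameters",
# minimized at (3.0, -1.0).
err = lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2
opt = nelder_mead(err, [0.0, 0.0])
```

Note that the method needs no gradients, only function evaluations; this is what makes it applicable when the "objective" is a full train-and-validate cycle of a recommender. In practice one would use a maintained implementation such as `scipy.optimize.minimize(..., method="Nelder-Mead")` rather than this sketch.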
BERNARDIS, CESARE
ING - School of Industrial and Information Engineering
28-Apr-2021
2020/2021
Attached files
2021_04_Bujari.pdf (Thesis, 3.92 MB, Adobe PDF) - accessible online only to authorized users

Documents in POLITesi are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10589/175652