Content-based artwork recommendation: integrating painting metadata with neural and manually-engineered visual features. Messina, P., Dominguez, V., Parra, D., Trattner, C., & Soto, A. User Modeling and User-Adapted Interaction, 2019.
Recommender Systems help us deal with information overload by suggesting relevant items based on our personal preferences. Although there is a large body of research in areas such as movies or music, artwork recommendation has received comparatively little attention, despite the continuous growth of the artwork market. Most previous research has relied on ratings and metadata, and a few recent works have exploited visual features extracted with deep neural networks (DNN) to recommend digital art. In this work, we contribute to the area of content-based artwork recommendation of physical paintings by studying the impact of the aforementioned features (artwork metadata, neural visual features), as well as manually-engineered visual features, such as naturalness, brightness and contrast. We implement and evaluate our method using transactional data from UGallery.com, an online artwork store. Our results show that artwork recommendations based on a hybrid combination of artist preference, curated attributes, deep neural visual features and manually-engineered visual features produce the best performance. Moreover, we discuss the trade-off between automatically obtained DNN features and manually-engineered visual features for the purpose of explainability, as well as the impact of user profile size on predictions. Our research informs the development of next-generation content-based artwork recommenders which rely on different types of data, from text to multimedia.
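The hybrid scoring idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the feature names, weights, profile structure, and the choice of max-similarity aggregation are all illustrative assumptions; only the four feature families (artist preference, curated attributes, DNN visual embeddings, hand-crafted visual features such as brightness and contrast) come from the abstract.

    # Illustrative sketch only; names, weights, and aggregation are assumptions.
    import numpy as np

    def brightness_contrast(image_gray: np.ndarray) -> np.ndarray:
        # Two simple hand-crafted visual features from a grayscale image in [0, 1]:
        # mean intensity (brightness) and intensity spread (contrast).
        return np.array([image_gray.mean(), image_gray.std()])

    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def hybrid_score(candidate: dict, profile: list, w=(0.4, 0.2, 0.3, 0.1)) -> float:
        # candidate / profile items: dicts with 'artist' (str), 'attrs', 'dnn',
        # 'visual' (each a np.ndarray). Score against the user's purchase profile.
        artist_pref = float(any(candidate['artist'] == p['artist'] for p in profile))
        attrs_sim   = max(cosine(candidate['attrs'],  p['attrs'])  for p in profile)
        dnn_sim     = max(cosine(candidate['dnn'],    p['dnn'])    for p in profile)
        visual_sim  = max(cosine(candidate['visual'], p['visual']) for p in profile)
        return w[0]*artist_pref + w[1]*attrs_sim + w[2]*dnn_sim + w[3]*visual_sim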
@article{messina:etal:2019,
  author   = {Pablo Messina and Vicente Dominguez and Denis Parra and
              Christoph Trattner and Alvaro Soto},
  title    = {Content-based artwork recommendation: integrating painting
              metadata with neural and manually-engineered visual features},
  journal  = {User Modeling and User-Adapted Interaction},
  volume   = {29},
  number   = {2},
  year     = {2019},
  abstract = {Recommender Systems help us deal with information overload by
              suggesting relevant items based on our personal preferences.
              Although there is a large body of research in areas such as
              movies or music, artwork recommendation has received
              comparatively little attention, despite the continuous growth
              of the artwork market. Most previous research has relied on
              ratings and metadata, and a few recent works have exploited
              visual features extracted with deep neural networks (DNN) to
              recommend digital art. In this work, we contribute to the area
              of content-based artwork recommendation of physical paintings
              by studying the impact of the aforementioned features (artwork
              metadata, neural visual features), as well as
              manually-engineered visual features, such as naturalness,
              brightness and contrast. We implement and evaluate our method
              using transactional data from UGallery.com, an online artwork
              store. Our results show that artwork recommendations based on
              a hybrid combination of artist preference, curated attributes,
              deep neural visual features and manually-engineered visual
              features produce the best performance. Moreover, we discuss
              the trade-off between automatically obtained DNN features and
              manually-engineered visual features for the purpose of
              explainability, as well as the impact of user profile size on
              predictions. Our research informs the development of
              next-generation content-based artwork recommenders which rely
              on different types of data, from text to multimedia.},
  doi      = {10.1007/s11257-018-9206-9},
  url      = {https://link.springer.com/article/10.1007/s11257-018-9206-9}
}
