ExplaiNing SeqUences in REcommendations

The ENSURE project investigates ways of improving transparency and decision support in recommender systems (such as those used by Amazon and Spotify), in recommendation scenarios that involve both surprising recommendations and trade-offs. The research agenda involves:

  • Gaining an understanding of people's concerns regarding personalisation for sequences of recommended items.
  • Gaining an understanding of people's views on the kinds of explanations that alleviate their concerns and help them to make good decisions.
  • Producing guidelines for algorithms that construct explainable sequences of recommendations.
  • Developing algorithms that effectively explain sequences containing both novelty and trade-offs, while taking privacy concerns into account. This includes investigating the role of context and personal characteristics.
  • Facilitating a dialogue between policy makers, researchers, and the general public regarding the findings above.


17/7/17: A case study with Blendle on personalised (and diverse) news selection was accepted to NWO ICT with Industry. Seeking PhD students and postdocs. November 27 - December 1, 2017.

10/7/17: Paper with Christoph Lofi accepted at the Complex-Rec workshop: `Towards Analogy-based Recommendation: Benchmarking of Perceived Analogy Semantics'.

5/4/17: `Sequences of Diverse Song Recommendations: An exploratory study in a commercial system' was accepted as an extended abstract and poster to UMAP'17.

10/3/17: We are hiring a two-year postdoc to work on explanations in recommender systems!