Lunch colloquium: Clara Menzen (PhD candidate at DCSC)
05 April 2023, 12:30 to 13:30 - Location: Lecture Room F (Simon Stevin), 3mE - By: DCSC
"Projecting basis functions with tensor networks for Gaussian process regression – and what this has to do with Green AI"
When it comes to the design of Bayesian machine learning algorithms, Gaussian processes (GPs) are a popular method of choice. Being flexible function approximators, GPs are capable of using the information in data to learn rich representations and complex structures. Unfortunately, these appealing features come at the cost of poor scalability: the computational complexity grows cubically with the number of data points, making GPs prohibitively expensive for large-scale data. In the literature, there have been many efforts to scale up GPs, among them a parametric approximation that uses a linear combination of basis functions, where the accuracy of the approximation depends on the total number of basis functions. We present an approach that allows us to use an exponential number of basis functions without the corresponding exponential computational complexity. The key idea that enables this is the use of low-rank tensor networks (TNs). TNs are networks of multidimensional arrays, also called tensors, that can approximate data or models in a compressed format while preserving the essence of the information. By exploiting the properties of TNs, we are limited neither to data sets with a small number of data points nor to a small input dimensionality. Because TNs reduce compute while retaining good performance, they can be considered a promising tool for Green AI, which, among other things, treats efficiency and accuracy as equally important criteria in algorithmic development.
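To make the scaling argument concrete, below is a minimal NumPy sketch of a generic basis-function (reduced-rank) GP approximation using random Fourier features for an RBF kernel. This is not the speaker's tensor-network method; the data, hyperparameters (ell, sf2, sn2), and the choice of features are illustrative assumptions. It shows why inference costs O(nm^2 + m^3) for m basis functions rather than the exact GP's O(n^3), and hence why one would like exponentially many basis functions without paying the exponential cost.

```python
import numpy as np

# Hypothetical toy data (not from the talk): 1-D regression with noise.
rng = np.random.default_rng(0)
n = 2000
x = rng.uniform(-3.0, 3.0, size=(n, 1))
y = np.sin(2.0 * x[:, 0]) + 0.1 * rng.standard_normal(n)

# Random Fourier features approximating an RBF kernel (illustrative choice).
m = 200                        # number of basis functions; accuracy grows with m
ell, sf2, sn2 = 0.5, 1.0, 0.01  # lengthscale, signal variance, noise variance
omega = rng.standard_normal((1, m)) / ell
phase = rng.uniform(0.0, 2.0 * np.pi, size=m)
Phi = np.sqrt(2.0 * sf2 / m) * np.cos(x @ omega + phase)   # n x m feature matrix

# Weight-space posterior: cost O(n m^2 + m^3) instead of the exact GP's O(n^3).
A = Phi.T @ Phi + sn2 * np.eye(m)          # m x m system
w_mean = np.linalg.solve(A, Phi.T @ y)     # posterior mean of the weights

# Predictive mean and variance at a few test inputs.
x_test = np.linspace(-3.0, 3.0, 5).reshape(-1, 1)
Phi_test = np.sqrt(2.0 * sf2 / m) * np.cos(x_test @ omega + phase)
f_mean = Phi_test @ w_mean
f_var = sn2 * np.sum(Phi_test * np.linalg.solve(A, Phi_test.T).T, axis=1)
print(f_mean)
print(f_var)
```

In this weight-space view, every extra input dimension multiplies the number of basis functions needed for the same resolution, which is exactly the exponential blow-up that the talk proposes to sidestep by representing the weights in a compressed tensor-network format.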