Key Challenges in Foundation Models (… and some solutions!)

Prof. Volkan Cevher

Laboratory for Information and Inference Systems, EPFL


Thanks to neural networks (NNs), faster computation, and massive datasets, machine learning is under increasing pressure to provide automated solutions to ever harder real-world tasks, beyond human performance and with ever faster response times, given the potentially huge technological and societal benefits. Unsurprisingly, NN learning formulations present fundamental challenges to the back-end learning algorithms despite their scalability. In this talk, we will work backwards from the "customer's" perspective and highlight these challenges specifically for NN-based Foundation Models. We will then explain our solutions to some of these challenges, focusing mostly on robustness. In particular, we will show how the existing theory and methodology for robust training miss the mark and how we can bridge theory and practice.

Volkan Cevher received the B.Sc. (valedictorian) in electrical engineering from Bilkent University in Ankara, Turkey, in 1999, and the Ph.D. in electrical and computer engineering from the Georgia Institute of Technology in Atlanta, GA, in 2005. He was a Research Scientist with the University of Maryland, College Park, from 2006 to 2007, and with Rice University in Houston, TX, from 2008 to 2009. Currently, he is an Associate Professor at the Swiss Federal Institute of Technology Lausanne (EPFL) and a Faculty Fellow in the Electrical and Computer Engineering Department at Rice University. His research interests include machine learning, signal processing theory, optimization theory and methods, and information theory. Dr. Cevher is an ELLIS fellow and was the recipient of the ICML AdvML Best Paper Award in 2023, a Google Faculty Research Award in 2018, the IEEE Signal Processing Society Best Paper Award in 2016, a Best Paper Award at CAMSAP in 2015, a Best Paper Award at SPARS in 2009, and an ERC Consolidator Grant in 2016 as well as an ERC Starting Grant in 2011.


Oct 10 2023


14:30 - 15:00