I am a Professor (Professeur des Universités) in the Statistics & Optimization team of the Institut de Mathématiques de Toulouse, teaching in the Department of Mathematics at Université Toulouse III - Paul Sabatier.
Numerical optimization
Statistical learning
Stochastic programming
Optimal transport
Distributed algorithms
One post-doc position is currently available on statistical fairness using optimal transport theory, co-mentored with J.-M. Loubes; the official offer is coming soon, so send me an email for more information.
Several internship/Ph.D. positions on related topics are likely to open in Spring 2025; potential candidates can contact me directly by email.
Sep. 2024 : I am a co-PI of an ANITI (Toulouse's AI Cluster) chair on Trust and Responsibility in Artificial Intelligence led by J.-M. Loubes
Sep. 2024 : I am now in charge of the 2nd year of the Master SID in Data Science & Engineering
Aug. 2024 : Our library skwdro has reached v1! We provide efficient code for machine learning with Wasserstein Distributional Robustness. The project is still very much under development, but this version marks a first step toward our goal of popularizing this approach. Available on GitHub and pip; paper coming soon!
Jul. 2024 : Our project MAD (Maths of Automatic Differentiation), led by S. Vaiter (CNRS & U. Nice), is funded by the ANR!
June 2024 : Yu-Guan's PhD thesis has been awarded the Université Grenoble Alpes PhD award!
Apr. 2024 : I am the happy father of Louise! This also means I will be travelling less frequently in the coming months (maybe years?).
Oct. 2023 : Congratulations to Yu-Guan Hsieh for his fantastic Ph.D.! The jury included Constantinos Daskalakis, Sylvain Sorin, Nicolò Cesa-Bianchi, Alexandre d’Aspremont, Anatoli Juditsky, Maryam Kamgarpour, and his co-advisors: J. Malick, P. Mertikopoulos, and myself.
Sep. 2023 : After seven great years in Grenoble, I have moved to Toulouse!
Dec. 2022 : Congratulations to Gilles Bareilles for passing his Ph.D. with flying colors! The jury included Nadia Brauner, Jalal Fadili, Claudia Sagastizábal, Jean-Charles Gilbert, Mathurin Massias, Claude Lemaréchal, and his co-advisors: J. Malick and myself.
Oct. 2022 : Waïss Azizian started his PhD. He is co-supervised by Panayotis Mertikopoulos and Jérôme Malick (CNRS & LJK, Grenoble) and will work on non-convex optimization for learning problems.
Mar. 2022 : Two interns have joined the team: Waïss Azizian (student at ENS Paris for his third internship with us!) and Julien Prando (student at UGA). They will both work on regularization aspects of distributionally robust methods.
Mar. 2022 : The second "Journée SMAI MAS-MODE" was a success: 7 talks and around 40 participants (in person!) at the intersection between optimization, probability, and statistics. website
Feb. 2022 : Victor Mercklé started his PhD. He is co-supervised by Ievgen Redko (Univ. St-Etienne) and will work on congestion games in networks.
Dec. 2021 : I defended my Habilitation à Diriger des Recherches (HDR) on Wed., December 15th! The jury was composed of Alexandre d'Aspremont (reviewer), Jérôme Bolte (reviewer), Adrian Lewis (reviewer), Jalal Fadili, Adeline Leclercq-Samson (president), and Julien Mairal. manuscript defense slides
June 2021 : Waïss Azizian (Student at ENS Paris) visited our team again for six weeks as part of his (previously online) internship to work on distributionally robust methods.
Nov. 2020 : Congrats to Dmitry “Mitya” Grishchenko for successfully defending his Ph.D.! The jury included Pascal Bianchi, Peter Richtarik, Julien Mairal, Samuel Vaiter, Alexander Gasnikov, and his co-advisors: J. Malick, M. Amini, and myself.
June 2020 : Waïss Azizian (Student at ENS Paris) visited our team for six weeks as part of his (previously online) internship to work on extragradient methods.
July 2019 : I was awarded an ANR JCJC grant (Young Research grant from the French Research Agency). The project is named STROLL: Harnessing Structure in Optimization for Large-scale Learning.
June 2019 : I am part of the chair on Optimisation and Learning, led by Jérôme Malick and Yurii Nesterov, funded by the Grenoble AI Institute.
June 2019 : Elnur Gasanov (Student at KAUST with P. Richtarik) visited our team for five weeks to work on distributed optimization.
Mar. 2019 : Alexander Rogozin (Student at MIPT Moscow with A. Gasnikov) visited our team for two weeks to work on distributed structured optimization.
Jan. 2019 : Mathias Chastan began his PhD between STMicroelectronics and our team. He is co-supervised by A. Lam, J. Malick, and myself, and will work on machine learning for industrial defect detection.
Right now, I am very interested in the interplay between optimization, statistics, and optimal transport theory in order to produce more robust data-driven models (a generic formulation is sketched after the list below):
W. Azizian, F. Iutzeler, J. Malick : Exact Generalization Guarantees for (Regularized) Wasserstein Distributionally Robust Models, NeurIPS, Dec. 2023. PDF
C. Dapogny, F. Iutzeler, A. Meda, B. Thibert : Entropy-regularized Wasserstein distributionally robust shape and topology optimization, Structural and Multidisciplinary Optimization, vol. 66, art. 42, 2023. PDF
W. Azizian, F. Iutzeler, J. Malick : Regularization for Wasserstein Distributionally Robust Optimization, ESAIM: Control, Optimisation, Calculus of Variations, 2023. PDF
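As a rough orientation (the notation here is mine, and the exact regularization differs across the papers above), these works study Wasserstein distributionally robust problems of the generic form

```latex
\min_{\theta}\ \sup_{Q \,:\, W_c(Q,\widehat{P}_n)\le \rho}\ \mathbb{E}_{\xi\sim Q}\big[\ell_\theta(\xi)\big]
```

where \widehat{P}_n is the empirical distribution of the training samples, W_c is the Wasserstein distance induced by a ground cost c, \rho is the radius of the ambiguity ball, and \ell_\theta is the loss of the model with parameters \theta.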
We are developing a Python package for Wasserstein Distributionally Robust Optimization, check it out (a usage sketch follows below):
skwdro: Distributionally robust machine learning with Pytorch and Scikit-learn wrappers GitHub
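Below is a minimal usage sketch; the module path, estimator name, and the rho keyword are my assumptions about the scikit-learn-style interface, so please refer to the GitHub repository and documentation for the actual API.

```python
# Sketch only: `skwdro.linear_models.LogisticRegression` and the `rho`
# keyword (radius of the Wasserstein ambiguity ball) are assumed names;
# check the skwdro repository for the real interface.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

from skwdro.linear_models import LogisticRegression  # assumed import path

# Small synthetic binary classification task.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The estimator follows the usual scikit-learn fit/predict/score pattern,
# with `rho` controlling the level of distributional robustness.
clf = LogisticRegression(rho=0.1)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

The idea behind the wrappers is that a robust estimator can be dropped into an existing scikit-learn or Pytorch pipeline with the robustness radius as the main extra knob.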
Also, I like to investigate, both theoretically and practically, what happens when optimization algorithms approach points of non-differentiability or the boundary of the feasible set (a toy illustration follows the list below):
W. Azizian, F. Iutzeler, J. Malick, P. Mertikopoulos : On the rate of convergence of Bregman proximal methods in constrained variational inequalities, to appear in SIAM Journal on Optimization, 2024. PDF
G. Bareilles, F. Iutzeler, J. Malick : Newton acceleration on manifolds identified by proximal-gradient methods, to appear in Mathematical Programming, 2023. PDF
F. Iutzeler, J. Malick: Nonsmoothness in Machine Learning: specific structure, proximal identification, and applications, Set-Valued and Variational Analysis, vol. 28, no. 4, pp. 661-678, 2020. PDF
W. Azizian, F. Iutzeler, J. Malick, and P. Mertikopoulos: The last-iterate convergence rate of optimistic mirror descent in stochastic variational inequalities, 34th Annual Conference on Learning Theory (COLT), 2021. PDF
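To illustrate the identification phenomenon underlying several of the papers above, here is a self-contained toy sketch (mine, not code from these works): running the proximal gradient method on a lasso problem, the support of the iterates typically stabilizes after finitely many iterations, i.e., the algorithm identifies the active structure.

```python
# Toy sketch (not from the papers above): proximal gradient (ISTA) on a
# lasso problem; the printed support stabilizes, illustrating identification.
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 20
A = rng.standard_normal((n, d))
x_true = np.zeros(d)
x_true[:3] = [1.0, -2.0, 0.5]                 # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(n)

lam = 0.5                                      # l1 regularization weight
step = 1.0 / np.linalg.norm(A, 2) ** 2         # 1/L with L = ||A||_2^2

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

x = np.zeros(d)
for k in range(201):
    grad = A.T @ (A @ x - b)                   # gradient of the smooth part
    x = soft_threshold(x - step * grad, step * lam)
    if k % 50 == 0:
        print(f"iter {k:3d}, support: {np.flatnonzero(x)}")
```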
June 2024 : Distributionally Robust Optimization and Statistical learning, Journée optimisation et contrôle, Pau (France)
June 2024 : Distributionally Robust Optimization and Statistical learning, ENAC, Toulouse (France)
Oct. 2023 : Distributionally Robust Optimization and Statistical learning, MADSTAT, Toulouse School of Economics (France) Slides
Oct. 2023 : Distributionally Robust Optimization and Statistical learning, SPOT, Toulouse (France)
Jan. 2023 : on Wasserstein Distributionally Robust Optimization, Séminaire Proba. Stat., Nice (France)
Jan. 2023 : on Wasserstein Distributionally Robust Optimization, Séminaire Optimisation et Statistique, Toulouse (France)
Jun. 2022 : Identification and exploitation of structure in regularized learning, Journées MODE, Limoges (France)
May. 2022 : Identifying & Using Structure in Machine Learning, Rencontres Statistiques Lyonnaises, Lyon (France)
Apr. 2022 : Identifying & Using Structure in Machine Learning, INRIA Maasai, Nice (France)
Aug. 2021 : Harnessing Structure in Composite Optimization problems, IFIP TC7, Quito (Ecuador) virtual Slides
June 2021 : Harnessing Structure in Regularized Empirical Risk Minimization, CAp virtual Slides
Nov. 2020 : Nonsmooth regularizations in Machine Learning: structure of the solutions, identification, and applications, IMAG Montpellier virtual Slides
Sep. 2020 : a Randomized Proximal Gradient Method with Structure-Adapted Sampling, Journées SMAI MODE virtual Slides
Mar. 2020 : Harnessing Structure in Optimization for Machine Learning, Optimization for Machine Learning, CIRM (France) Slides
Oct. 2018 : Distributed Learning with Sparse Communications and Structure Identification, Séminaire INRIA Magnet, Lille (France)
Jul. 2018 : Distributed Learning with Sparse Communications and Structure Identification, International Symposium on Mathematical Programming (ISMP), Bordeaux (France)
June 2018 : Distributed Learning with Sparse Communications and Structure Identification, Séminaire Polaris and D.A.T.A., Grenoble (France)
May. 2018 : Distributed Learning with Sparse Communications and Structure Identification, Journées de Statistique, Saclay (France)
Apr. 2017 : Monotonicity, Acceleration, Inertia, and the proximal gradient algorithm, Optimization and Statistical Learning, Les Houches (France) Slides
Nov. 2016 : Gossip Algorithms: Tutorial and Recent advances, SMILE in Paris, Paris (France) Slides Part I Slides Part II
Oct. 2016 : Modified fixed-point iterations and applications to randomized and accelerated optimization algorithms, Workshop Cavalieri, Paris (France)
Sep. 2016 : Practical acceleration for some optimization methods using relaxation and inertia, Séminaire d'Analyse non linéaire et Optimisation, Avignon (France)
June 2016 : Practical acceleration for some optimization methods using relaxation and inertia, Séminaire Signal-Image de l'Institut de Mathématiques de Bordeaux, Bordeaux (France)
June 2016 : Practical accelerations for the alternating direction method of multipliers, PICOF Workshop, Autrans (France)
May 2016 : Stochastic coordinate descent in fixed-point algorithms and application to optimization methods, Congrès d'Analyse Numérique (CANUM), Obernai (France)
Nov. 2015 : Relaxation and Inertia on the Proximal Point Algorithm, Titan Workshop, Grenoble (France)
Nov. 2015 : Relaxation and Inertia on Fixed point algorithms, Journées EDP Rhône-Alpes-Auvergne (JERAA), Clermont-Ferrand (France)
Mar. 2015 : Online Relaxation Method for Improving Linear Convergence Rates of the ADMM, Benelux meeting on Systems and Control, Lommel (Belgium)
Aug. 2014 : Asynchronous Distributed Optimization, Journées MAS, Toulouse (France)
May. 2014 : Distributed Optimization Techniques for Learning over Big Data, 2014 ESSEC/Centrale-Supélec Conference Bridging Worlds in Big Data, ESSEC CNIT Campus, La Défense, Paris (France)
Apr. 2014 : Distributed Asynchronous Optimization using the ADMM, Large graphs and networks seminar, Université catholique de Louvain, ICTEAM institute, Louvain-la-Neuve (Belgium) Slides
Jul. 2013 : Distributed Optimization using a Randomized Alternating Direction Method of Multipliers, Digicosme Research Day, Digiteo, Gif-sur-Yvette
Nov. 2012 : Distributed Estimation of the Average Value in Wireless Sensor Networks , Alcatel-Lucent Chair Seminar, Supélec, Gif-sur-Yvette
Apr. 2012 : Some useful results on Matrix Products for Signal Processing, Ph.D. Candidates Seminar, Telecom ParisTech, Paris
Oct. 2011 : Distributed Maximal Value Estimation, Ph.D. Candidates Seminar, Telecom ParisTech, Paris
Sep. 2011 : Distributed estimation of the maximum in a sensor network, Colloque GRETSI, Bordeaux (France)
Since 09/2023 : Professor at Univ. Toulouse III
2021 : Habilitation degree from Univ. Grenoble Alpes
09/2015-08/2023 : Assistant Professor at Univ. Grenoble Alpes
01/2015-08/2015 : Post-doc at Université catholique de Louvain (Louvain-la-Neuve)
01/2014-12/2014 : Post-doc at Centrale-Supélec
2013 : Ph.D. degree from Telecom Paris
2010 : Engineering degree from Telecom Paris and M.Sc. from Sorbonne Université.
Bureau 1R1-225
Institut de Mathématiques de Toulouse
118, route de Narbonne
31062 Toulouse Cedex 9
FRANCE