Dr. Leonard Papenmeier
Machine Learning Researcher, focused on Bayesian Optimization and Gaussian Processes
Publications
2025 | Leonard Papenmeier, Luigi Nardi. Bencher: Simple and Reproducible Benchmarking for Black-Box Optimization. Accepted at the CODEML Workshop at the Forty-Second International Conference on Machine Learning.
2025 | Leonard Papenmeier, Matthias Poloczek, Luigi Nardi. Understanding High-Dimensional Bayesian Optimization. Accepted at the Forty-Second International Conference on Machine Learning.
2025 | Leonard Papenmeier*, Nuojin Cheng*, Stephen Becker, Luigi Nardi. Exploring Exploration in Bayesian Optimization. Accepted at the Forty-First Conference on Uncertainty in Artificial Intelligence. * Equal contribution.
2025 | Nuojin Cheng*, Leonard Papenmeier*, Stephen Becker, Luigi Nardi. A Unified Framework for Entropy Search and Expected Improvement in Bayesian Optimization. Accepted at the Forty-Second International Conference on Machine Learning. * Equal contribution.
2023 | Leonard Papenmeier, Luigi Nardi, Matthias Poloczek. Bounce: Reliable High-Dimensional Bayesian Optimization for Combinatorial and Mixed Spaces. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), New Orleans.
2023 | Erik Hellsten*, Carl Hvarfner*, Leonard Papenmeier*, Luigi Nardi. High-dimensional Bayesian Optimization with Group Testing. Preprint. * Equal contribution.
2022 | Leonard Papenmeier, Luigi Nardi, Matthias Poloczek. Increasing the Scope as You Learn: Adaptive Bayesian Optimization in Nested Subspaces. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), New Orleans.
2017 | Leonard Papenmeier (then: Hövelmann), Christoph M. Friedrich. Fasttext and Gradient Boosted Trees at GermEval-2017 on Relevance Classification and Document-level Polarity. GermEval Shared Task on Aspect-based Sentiment in Social Media Customer Feedback (GSCL), Hamburg, 2017. This submission resulted from my Bachelor's thesis.
Education
Since 09/2020 | PhD Candidate in Machine Learning at Lund University and the Wallenberg AI, Autonomous Systems and Software Program (WASP), Sweden
2019 | Exchange Semester (Data Science), Norwegian University of Life Sciences (NMBU), Ås, Norway
2017-2020 | Master: Applied Computer Science, Ruhr-University Bochum, Germany. Final grade: 95% (excellent)
2013-2017 | Bachelor: Software Engineering, Dortmund University of Applied Sciences and Arts, Germany. Final grade: 1.6 (good)
Work Experience
2018-2020 | Working student: Deep Learning and Computer Vision, img.ly GmbH, Bochum, Germany
2016-2018 | Working student: Full-stack software development, adesso AG, Cologne, Germany
2013-2016 | Apprenticeship: IT specialist, adesso AG & Chamber of Industry and Commerce (IHK), Germany
Teaching
2021-2024 | Teaching Assistant: Artificial Intelligence (EDAP01), Lund University, Sweden
2022-2023 | Teaching Assistant: Applied Machine Learning (EDAN96), Lund University, Sweden
2023 | Teaching Assistant: Advanced Applied Machine Learning (EDAP30), Lund University, Sweden
2020-2021 | Teaching Assistant: Applied Machine Learning (EDAN95), Lund University, Sweden
Reviewing
Theses
2025 | Leonard Papenmeier. Bayesian Optimization in High Dimensions: A Journey Through Subspaces and Challenges. PhD Thesis. This thesis explores the challenges and advances in high-dimensional Bayesian optimization (HDBO), focusing on understanding, quantifying, and improving optimization techniques in high-dimensional spaces. Bayesian optimization (BO) is a powerful method for optimizing expensive black-box functions, but its effectiveness diminishes as the dimensionality of the search space increases due to the curse of dimensionality. The thesis introduces novel algorithms and methodologies that make HDBO more practical. Key contributions include the BAxUS algorithm, which leverages nested subspaces to optimize high-dimensional problems without estimating the dimensionality of the effective subspace. The Bounce algorithm extends these techniques to combinatorial and mixed spaces, providing robust solutions for real-world applications. The thesis also studies exploration in acquisition functions, proposing new ways to quantify exploration and strategies for designing more effective optimization approaches. Furthermore, it analyzes why simple BO setups have recently shown promising performance in high-dimensional spaces, challenging the conventional belief that BO is limited to low-dimensional problems. By identifying and addressing failure modes such as vanishing gradients and biases in model fitting, the thesis offers insights and recommendations for designing more efficient HDBO algorithms. Through a combination of theoretical analysis, empirical evaluations, and practical implementations, it advances our understanding of high-dimensional optimization and provides actionable methods to improve its performance in complex scenarios. (A minimal illustrative sketch of a basic BO loop follows this list.)
2020 | Leonard Papenmeier. Semantic Representations in Variational Autoencoders as a Model of the Visual System. Master's Thesis. This thesis was written at the Institute of Neural Computation at Ruhr-University Bochum, Germany, and supervised by Laurenz Wiskott and Zahra Fayyaz. The goal was to investigate the role of semantic representations in the visual system. I examined the hypothesis that variational autoencoders (VAEs) learn semantic representations of images by analyzing the latent representations of VAEs trained on different datasets. We could not find strong evidence for this hypothesis. The thesis was graded 100% (excellent).
2017 | Leonard Papenmeier (then: Hövelmann). Sentiment Analysis Based on Word Embeddings: Possible Improvements and Transfer to the German Language. Bachelor's Thesis. This thesis was written at Dortmund University of Applied Sciences and Arts, Germany, and supervised by Christoph M. Friedrich. I used (then very recent) word embeddings to improve sentiment analysis on German text. I also participated in the GermEval Shared Task on Aspect-based Sentiment in Social Media Customer Feedback (GSCL) 2017 (see Publications) and achieved the best result for the German language on one subtask and the second-best result on another. The thesis was graded 1.0 (very good).
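For readers less familiar with the method discussed in the PhD thesis above, the sketch below shows a basic Bayesian optimization loop in its simplest form: fit a Gaussian process surrogate to all evaluations seen so far, then choose the next point by maximizing an acquisition function (expected improvement here). This is a minimal, generic illustration only (toy objective, random candidate search, a scikit-learn GP); it is not the BAxUS or Bounce algorithm, and all names and settings are illustrative assumptions.

```python
# Minimal Bayesian optimization loop: GP surrogate + expected improvement.
# Illustrative sketch only -- not BAxUS/Bounce; names and settings are assumptions.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def objective(x):
    # Toy black-box function standing in for an expensive experiment.
    return np.sin(3.0 * x[0]) + 0.1 * x[0] ** 2


def expected_improvement(mu, sigma, best_y):
    # EI for minimization: expected improvement over the best value so far.
    sigma = np.maximum(sigma, 1e-9)
    z = (best_y - mu) / sigma
    return (best_y - mu) * norm.cdf(z) + sigma * norm.pdf(z)


rng = np.random.default_rng(0)
bounds = (-2.0, 2.0)

# Initial design: a few random evaluations.
X = rng.uniform(*bounds, size=(5, 1))
y = np.array([objective(x) for x in X])

for _ in range(20):
    # Fit the GP surrogate to all observations so far.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)

    # Maximize the acquisition function over random candidate points.
    cand = rng.uniform(*bounds, size=(1000, 1))
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[np.argmax(expected_improvement(mu, sigma, y.min()))]

    # Evaluate the black box and augment the data set.
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))

print("best value found:", y.min())
```

In the high-dimensional settings the thesis targets, the open questions are precisely which surrogate, search-space, and acquisition choices keep such a loop effective when the input has hundreds or thousands of dimensions.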
Programming Languages & Frameworks
Python, PyTorch, Keras, Java, JavaScript, HTML, CSS, Spring Framework, Angular, TypeScript
Scholarships
2017-2020 | Scholarship Program of the Friedrich-Ebert Foundation