MNM-Team

Publication list


Supervised Bachelor's theses of Korbinian Staudacher


  1. Khanh Nguyen. Ein Brute-force Solver für lineare Gleichungssysteme mit minimalem Hamminggewicht in C. September 2025.
    BibTeX Entry
    @misc{nguy25,
      author      = {Khanh Nguyen},
      title       = {Ein Brute-force Solver für lineare Gleichungssysteme mit minimalem Hamminggewicht in {C}},
      year        = {2025},
      key         = {nguy25},
      month       = {9},
      school      = {Ludwig-Maximilians-Universität München},
      supervisors = {Korbinian Staudacher},
      type        = {Bachelorthesis},
    }
  2. Max Polster. Automatisierte Erzeugung von fehlerkorrigierten Schaltkreisen im Quantencomputing. March 2025.
    BibTeX Entry
    @misc{pols25,
      author      = {Max Polster},
      title       = {Automatisierte Erzeugung von fehlerkorrigierten Schaltkreisen im Quantencomputing},
      year        = {2025},
      key         = {pols25},
      month       = {3},
      school      = {Ludwig-Maximilians-Universität München},
      supervisors = {Korbinian Staudacher},
      type        = {Bachelorthesis},
    }
  3. Ege Cimsir. Comparing Tensor Network Simulations for Quantum Fourier Transformation. September 2024.
    BibTeX Entry
    @misc{cims24,
      author      = {Ege Cimsir},
      title       = {Comparing Tensor Network Simulations for Quantum Fourier Transformation},
      year        = {2024},
      key         = {cims24},
      month       = {9},
      school      = {Ludwig-Maximilians-Universität München},
      supervisors = {Korbinian Staudacher and Florian Krötz},
      type        = {Bachelorthesis},
    }
  4. Adrian Mülthaler. Simulating Circuits in Quantum Natural Language Processing using Tensor Networks. April 2024.
    Abstract
    Quantum natural language processing deals with the implementation of natural language models on quantum hardware. It remains uncertain whether these models benefit from a quantum advantage or can be efficiently simulated on classical hardware. Tensor networks emerge as a promising tool in the efficient approximation of quantum states. This thesis explores the feasibility and potential advantages of employing tensor network simulation for quantum natural language processing. Using a matrix product state architecture, we investigate the complexity of simulating circuits obtained from the Categorical Compositional Distributional framework. Our results indicate an exponential complexity when these simulations are performed non-approximated. To analyze the impact of approximation on the performance of quantum natural language processing models, we introduce a binary classification task and train on two datasets, focusing on the difference between training with and without approximating the simulation. Our findings indicate that training with approximated simulation is possible, yet it exhibits greater instability and slower convergence.
    BibTeX Entry
    @misc{muel24,
      author      = {Adrian Mülthaler},
      title       = {Simulating Circuits in Quantum Natural Language Processing using Tensor Networks},
      year        = {2024},
      pdf         = {https://bib.nm.ifi.lmu.de/pdf/muel24.pdf},
      abstract    = {Quantum natural language processing deals with the implementation of natural language models on quantum hardware. It remains uncertain whether these models benefit from a quantum advantage or can be efficiently simulated on classical hardware. Tensor networks emerge as a promising tool in the efficient approximation of quantum states. This thesis explores the feasibility and potential advantages of employing tensor network simulation for quantum natural language processing. Using a matrix product state architecture, we investigate the complexity of simulating circuits obtained from the Categorical Compositional Distributional framework. Our results indicate an exponential complexity when these simulations are performed non-approximated. To analyze the impact of approximation on the performance of quantum natural language processing models, we introduce a binary classification task and train on two datasets, focusing on the difference between training with and without approximating the simulation. Our findings indicate that training with approximated simulation is possible, yet it exhibits greater instability and slower convergence.},
      key         = {muel24},
      month       = {4},
      school      = {Ludwig-Maximilians-Universität München},
      supervisors = {Korbinian Staudacher and Florian Krötz and Jakob Murauer and Axel Wisiorek},
      type        = {Bachelorthesis},
    }
  5. Sergey Kassil. A lookahead strategy for ZX-based quantum circuit optimization. December 2022.
    BibTeX Entry
    @misc{kass22,
      author      = {Sergey Kassil},
      title       = {A lookahead strategy for {ZX-based} quantum circuit optimization},
      year        = {2022},
      key         = {kass22},
      month       = {12},
      school      = {Ludwig-Maximilians-Universität München},
      supervisors = {Korbinian Staudacher and Sophia Grundner-Culemann},
      type        = {Bachelorthesis},
    }
  6. Maximilian Weiß. Encoding strategies to solve Sudoku with Quantum Computers. August 2022.
    BibTeX Entry
    @misc{weis22,
      author      = {Maximilian Weiß},
      title       = {Encoding strategies to solve {Sudoku} with Quantum Computers},
      year        = {2022},
      key         = {weis22},
      month       = {8},
      school      = {Ludwig-Maximilians-Universität München},
      supervisors = {Tobias Guggemos and Korbinian Staudacher and Sophia Grundner-Culemann},
      type        = {Bachelorthesis},
    }
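
The entries above can be cited directly from LaTeX once they are saved to a bibliography file; a minimal sketch (the filename mnm-theses.bib is an assumption, not part of this list):

```latex
% Minimal sketch: citing entries from this list, assuming the BibTeX
% records above were saved verbatim as mnm-theses.bib (hypothetical name).
\documentclass{article}
\begin{document}
ZX-based circuit optimization has been studied in a supervised
thesis~\cite{kass22}, as have encoding strategies for
Sudoku~\cite{weis22}.
\bibliographystyle{plain}
\bibliography{mnm-theses}
\end{document}
```

Note that standard BibTeX styles silently ignore fields they do not know, so the nonstandard supervisors, key, and pdf fields in these records are harmless but will not appear in the rendered bibliography.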

Disclaimer:

This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.

Last modified: Thu Oct 30 14:13:35 2025 CET