Current
- Dutch NWO NWA-ORC — Divide & Quantum (PI of a consortium)
- EU Horizon Europe — Efficient QUantum ALgorithms for IndusTrY (EQUALITY) (WP lead in a consortium)
- EU H2020 — NExt Applications of Quantum Computing (NEASQC) (Use-case lead in a consortium)
Past
- Dutch NWO Veni — Parallelism in Depth (PI)
- Austrian FWF — Lise Meitner
Current
- Divide & Quantum
I founded the Divide & Quantum consortium to develop and apply divide & quantum technology, which enables small quantum computers to solve large, industrial-scale problems using methods from parallel computing (divide & conquer). The consortium, consisting of various academic and industrial partners, will work to apply and scale quantum computing using this method. My own team will work on the theory behind divide & quantum and on the simulation of quantum computing based on LIMDDs.
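To illustrate the divide & conquer shape of this idea, here is a minimal, self-contained sketch. The maximum-independent-set toy problem, the component-based split, and the brute-force `solve_small` stand-in for a small quantum device are all illustrative assumptions for exposition, not the project's actual method.

```python
from itertools import combinations

# Toy problem: maximum independent set. The component split and the
# brute-force `solve_small` (standing in for a small quantum device)
# are illustrative assumptions, not the project's actual method.

def components(vertices, edges):
    """Split the graph into connected components (the 'divide' step)."""
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, comps = set(), []
    for v in vertices:
        if v in seen:
            continue
        stack, comp = [v], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(adj[u] - comp)
        seen |= comp
        comps.append(comp)
    return comps

def solve_small(vertices, edges):
    """Brute force on a small subgraph; in the envisioned setting, this
    bounded-size call would be offloaded to a small quantum computer."""
    vs = sorted(vertices)
    for r in range(len(vs), 0, -1):
        for cand in combinations(vs, r):
            s = set(cand)
            if all(not (u in s and v in s) for u, v in edges):
                return s
    return set()

def divide_and_solve(vertices, edges):
    """Solve each small component separately and combine the answers."""
    solution = set()
    for comp in components(vertices, edges):
        comp_edges = [(u, v) for u, v in edges if u in comp and v in comp]
        solution |= solve_small(comp, comp_edges)
    return solution

# Two disjoint triangles: each is solved independently and combined.
V = ['a', 'b', 'c', 'd', 'e', 'f']
E = [('a', 'b'), ('b', 'c'), ('a', 'c'), ('d', 'e'), ('e', 'f'), ('d', 'f')]
print(divide_and_solve(V, E))  # one vertex per triangle, e.g. {'a', 'd'}
```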
- Efficient QUantum ALgorithms for IndusTrY (EQUALITY)
In this project, I lead the work package on quantum circuit optimization, which crucially enables other consortium partners to implement their quantum computing use cases.
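As a flavor of what circuit optimization involves, here is a minimal peephole pass (an illustrative sketch, not the work package's actual optimizer) that cancels adjacent self-inverse gates; the list-of-tuples circuit representation is an assumption for exposition.

```python
# Toy peephole optimization (illustrative): two identical self-inverse
# gates applied back to back on the same qubits multiply to the identity
# and can be removed. A real optimizer would also commute gates on
# disjoint qubits past each other to expose more cancellations.
SELF_INVERSE = {'H', 'X', 'Z', 'CX'}

def cancel_pairs(circuit):
    """circuit: list of (gate_name, qubits) tuples, in program order."""
    out = []
    for gate in circuit:
        if out and out[-1] == gate and gate[0] in SELF_INVERSE:
            out.pop()        # G followed by G == identity
        else:
            out.append(gate)
    return out

circ = [('H', (0,)), ('H', (0,)), ('CX', (0, 1)), ('X', (1,)), ('X', (1,))]
print(cancel_pairs(circ))  # [('CX', (0, 1))]
```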
- NExt Applications of Quantum Computing (NEASQC)
PhD candidate Sebastiaan Brand leads the use case on quantum probabilistic risk analysis. Together with EDF, the French energy conglomerate, we have applied quantum algorithms to speed up probabilistic inference. To ease this transition, we also developed hybrid methods based on the decision diagram techniques that EDF currently uses. Sebastiaan has introduced new saturation-based algorithms to compute reachability in decision diagrams, a central task in many approaches to probabilistic risk analysis.
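To make the reachability task concrete, here is a minimal sketch of a saturation-style fixpoint; plain Python sets stand in for decision diagrams, and the two-partition transition relation is an illustrative assumption (the real algorithms saturate per decision-diagram level).

```python
# Minimal sketch (sets stand in for decision diagrams): compute the states
# reachable from `init` under a partitioned transition relation. Saturation
# applies each partition exhaustively before moving on, instead of taking
# one global breadth-first image per iteration.

def image(states, relation):
    """Successors of `states` under one transition partition."""
    return {t for (s, t) in relation if s in states}

def saturate(init, partitions):
    reached = set(init)
    changed = True
    while changed:                      # outer fixpoint over all partitions
        changed = False
        for rel in partitions:          # e.g., one partition per DD level
            while True:                 # local fixpoint: fire rel exhaustively
                new = image(reached, rel) - reached
                if not new:
                    break
                reached |= new
                changed = True
    return reached

# Illustrative system: 0 -> 1 -> 2 (partition A), 2 -> 3 (partition B).
A = {(0, 1), (1, 2)}
B = {(2, 3)}
print(saturate({0}, [A, B]))  # {0, 1, 2, 3}
```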
Past
- Parallelism in Depth
In this project, we developed novel parallel and symbolic algorithms for automated verification and probabilistic risk analysis. Master's student Vincent Bloemen designed and implemented a parallel on-the-fly SCC algorithm, which inspired Jayanti & Tarjan to develop supporting data structures. Master's student Richard Huijbers realized a new method to compute bisimulation symbolically and in parallel, showing that a relational approach is superior to a partition-based approach in the symbolic setting. PhD student Giso Dal found multiple ways to parallelize probabilistic inference, both by developing his own parallel WPBDD package and, orthogonally, by splitting up probabilistic knowledge bases. Finally, PhD student Lieuwe Vinkhuijzen used sentential decision diagrams for model checking in the LTSmin tool.
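For reference, the sequential baseline that parallel SCC algorithms build on is Tarjan's classic algorithm; the sketch below is the standard textbook version, not the project's parallel on-the-fly variant.

```python
def tarjan_scc(graph):
    """Classic sequential Tarjan SCC decomposition (textbook version)."""
    index = {}        # discovery order of each vertex
    lowlink = {}      # smallest index reachable from the vertex's subtree
    on_stack = set()
    stack, sccs = [], []
    counter = [0]

    def strongconnect(v):
        index[v] = lowlink[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, ()):
            if w not in index:
                strongconnect(w)
                lowlink[v] = min(lowlink[v], lowlink[w])
            elif w in on_stack:
                lowlink[v] = min(lowlink[v], index[w])
        if lowlink[v] == index[v]:      # v is the root of an SCC
            scc = set()
            while True:
                w = stack.pop()
                on_stack.discard(w)
                scc.add(w)
                if w == v:
                    break
            sccs.append(scc)

    for v in graph:
        if v not in index:
            strongconnect(v)
    return sccs

# Two SCCs: {'a', 'b', 'c'} forms a cycle, {'d'} is trivial.
print(tarjan_scc({'a': ['b'], 'b': ['c'], 'c': ['a', 'd'], 'd': []}))
```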
- Heisenbugs
With Georg Weissenbacher (TU Vienna), I developed automated verification methods to detect and eliminate Heisenbugs: bugs that disappear when an attempt is made to locate and analyze them. For this purpose, we developed the Vienna Verification Tool (VVT), an SMT-based model checker. I contributed new ways to reduce parallel programs using transaction reduction, all of which have been implemented in VVT.
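A toy example of a Heisenbug (my illustration, unrelated to VVT's code base): a lost-update race whose symptoms can vanish once instrumentation perturbs the thread schedule.

```python
import sys
import threading

# Toy Heisenbug (illustrative): two threads perform a non-atomic
# read-modify-write on a shared counter. The lost updates tend to vanish
# when instrumentation (a print, a debugger) changes the interleaving.

sys.setswitchinterval(1e-6)  # encourage frequent thread switches
counter = 0

def worker(iterations):
    global counter
    for _ in range(iterations):
        tmp = counter        # read ...
        counter = tmp + 1    # ... then write: another thread may have
                             # written in between, and its update is lost

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Expected 200000, but on CPython this typically prints less. Inserting
# a print (or a lock) between read and write changes the timing and can
# make the bug disappear.
print(counter)
```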
- Multi-Core Model Checking
In this project, I designed parallel, online algorithms based on depth-first search (DFS), together with an array of supporting concurrent data structures. Using these algorithms and data structures, I showed that model checking can be parallelized effectively on multi-core machines, contrary to the expectation of many researchers in the field. The research culminated in various new parallel verification algorithms, including algorithms for real-time systems developed with collaborators from Aalborg University (DK). The contributed techniques disseminated rapidly, finding their way into various other model checking tools, including NASA's SPIN model checker. My correctness proofs of these parallel (online) algorithms have been used as case studies for theorem provers by other researchers and have inspired course material on parallel algorithms in Amsterdam. All data structures and algorithms have been implemented in the model checking tool LTSmin.
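A minimal sketch of the shared-visited-set idea behind such algorithms (illustrative; LTSmin's actual implementation uses a lockless hash table and more refined DFS coordination): several workers run randomized DFS while sharing one global closed set, so the state space is explored only once collectively.

```python
import random
import threading

# Minimal sketch of shared state storage (not LTSmin's implementation):
# each worker runs its own randomized DFS; a single shared visited set
# ensures every state is processed at most once across all workers.

visited = set()
lock = threading.Lock()      # stands in for the lockless hash table

def par_dfs(successors, init, seed):
    rng = random.Random(seed)
    stack = [init]
    while stack:
        state = stack.pop()
        with lock:           # atomic check-and-mark of the shared set
            if state in visited:
                continue
            visited.add(state)
        succs = list(successors(state))
        rng.shuffle(succs)   # randomized order reduces worker overlap
        stack.extend(succs)

# Toy state space: states 0..9999; state n has successors n+1 and n+2.
def successors(n):
    return [m for m in (n + 1, n + 2) if m < 10_000]

workers = [threading.Thread(target=par_dfs, args=(successors, 0, i))
           for i in range(4)]
for w in workers:
    w.start()
for w in workers:
    w.join()
print(len(visited))  # 10000: all states reached, each explored once
```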