Data Intensive Computing – High Performance Computing
As one of the university computing centers in Germany that cooperate in the context of National High Performance Computing (NHR), the ZIH of TU Dresden offers “Capella”, a high-performance computer from Megware with a peak performance of more than 38 petaflop/s. The system comprises more than 140 nodes, each with 4 NVIDIA H100 accelerators and 2 AMD processors, providing high-speed scalability for research.
The HPC offering is complemented by approx. 75,000 processor cores in the “Barnard” cluster from Atos/Eviden and a parallel file system of over 40 PB.
The High Performance Computing and Storage Complex (HRSK-II) and its extension, High Performance Computing – Data Analytics (HPC-DA), offer scientists about 60,000 CPU cores and a peak performance of more than 1.5 quadrillion floating-point operations per second. The architecture, specifically tailored to data-intensive computing, Big Data analytics, and artificial-intelligence methods, and equipped with extensive capabilities for energy measurement and performance monitoring, provides ideal conditions for the ambitious research goals of the users and the ZIH.
Access to the systems requires a project application.

High Performance Computing at NHR@TUD
As one of the centers for National High Performance Computing (NHR), ZIH offers dedicated HPC resources as well as individual support and consulting. The systems are available to scientists from all over Germany.
With its Nvidia H100 GPU accelerators, the “Capella” GPU cluster, installed by Megware at the end of 2024, offers an excellent working platform for applications in fields such as machine learning, language models, and AI. Even larger models are processed at top speed by the 144 nodes (4 H100 GPUs and 2 AMD processors each). The system is also designed for maximum energy efficiency: at the time of installation, it was ranked 5th worldwide in the Green500.
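As a rough plausibility check, the figures above can be combined into a per-accelerator peak. This is a back-of-the-envelope sketch, not an official specification: it assumes the quoted “more than 38 petaflop/s” is taken as exactly 38 PFlop/s, and the actual per-GPU peak depends on the numerical precision and clock rates considered.

```python
# Back-of-the-envelope check using only the figures quoted above.
nodes = 144
gpus_per_node = 4
total_gpus = nodes * gpus_per_node        # 576 H100 accelerators in total
peak_flops = 38e15                        # assumed: exactly 38 PFlop/s, in Flop/s
per_gpu_tflops = peak_flops / total_gpus / 1e12
print(f"{total_gpus} GPUs, ~{per_gpu_tflops:.0f} TFlop/s peak per GPU")
```

This works out to roughly 66 TFlop/s per GPU, which is consistent in order of magnitude with published H100 double-precision tensor peak rates.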
The HPC system “Barnard”, operated by ZIH with approx. 75,000 CPU cores, achieves a peak performance of more than 4.5 PFlop/s (4.5 quadrillion floating-point operations per second). Additionally, the system is equipped with shared storage of about 40 petabytes. It thus provides the basis for successfully investigating compute-intensive and data-intensive problems across scientific disciplines such as computational fluid dynamics, weather and climate modeling, materials science, electrodynamics, life sciences, and bioinformatics.
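To put these numbers in perspective, the quoted figures imply a per-core peak. Again a sketch under stated assumptions (exactly 75,000 cores and exactly 4.5 PFlop/s), not a measured value:

```python
# Rough per-core peak derived from the numbers quoted above.
cores = 75_000
peak_flops = 4.5e15                 # assumed: exactly 4.5 PFlop/s, in Flop/s
per_core_gflops = peak_flops / cores / 1e9
print(f"~{per_core_gflops:.0f} GFlop/s peak per core")
```

That is about 60 GFlop/s per core, a typical order of magnitude for modern server CPUs with wide vector units.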
Furthermore, systems with 192 GPU accelerators and 2 PB of fast flash storage are available, especially for data analytics, machine learning, and artificial-intelligence applications.
Software environments for various research topics are available on the system.
The national Big Data and AI competence center ScaDS.AI Dresden/Leipzig advises and supports users on artificial intelligence, data analytics, and high performance computing. Primarily for ScaDS.AI users, ZIH operates an additional system with 312 Nvidia A100 GPUs, optimized for machine-learning applications.
The data center of the Lehmann Center (LZR) of TU Dresden combines security and high availability with high power density and long-term, flexible usability. It is characterized by energy efficiency and, thus, cost efficiency. Hot-water cooling for the HPC systems significantly reduces operating costs by eliminating the need for chillers. Additional savings derive from reusing the heat dissipated by the computers in the surrounding buildings. In 2014, the cooling concept was awarded the “German Computing Center Prize” in the category “Energy and Resource Efficient Computing Center”.