The IAP has an excellent computing infrastructure at its disposal. Its core is a High Performance Computing system embedded in a network with up to 100 GBit/s of bandwidth.
The IAP is redundantly connected to the internet via the Deutsches Forschungsnetz (DFN); each connection has a bandwidth of 1.5 GBit/s. Dedicated equipment secures access to the IAP network, and several technologies are employed to prevent unauthorized access, viruses, and spam.
Demanding scientific tasks require correspondingly powerful hardware, which is located in the High Performance Computing (HPC) segment at the IAP. The central resource is an HPE Superdome Flex system consisting of two machines with 32 and 8 nodes, providing 928 cores and 7.5 TByte of memory in total. The PBS Professional batch system from Altair distributes the computing load efficiently across these machines.
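To illustrate how a batch system like PBS Professional distributes work, a job submitted to it might look like the following minimal sketch. The job name, resource values, and the command in the body are purely illustrative assumptions, not the IAP's actual configuration.

```shell
#!/bin/bash
#PBS -N demo_job            # hypothetical job name
#PBS -l select=1:ncpus=4    # request one node with 4 cores
#PBS -l walltime=00:10:00   # 10-minute runtime limit
# PBS sets PBS_O_WORKDIR on the compute node; fall back to $PWD for local runs.
cd "${PBS_O_WORKDIR:-$PWD}"
msg="running on $(hostname)"
echo "$msg"
```

Such a script would typically be handed to the scheduler with `qsub job.sh`; PBS Pro then queues it and starts it on a node with free cores matching the requested resources.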
Not least because of these HPC machines, the need for storage capacity at the IAP is immense: scientific experiments and measurements require and produce enormous amounts of data. To cope with this, the IAP operates a magnetic tape robot system, a so-called tape library, from Quantum. Its current capacity of 4 PByte is managed by an HPE computer system.
Each of the three departments of the IAP also has a smaller server system as well as a multitude of desktop and experiment PCs available for further important tasks.
A large part of the server systems and computers is virtualized, which provides significantly better failover resilience and efficiency. We use technologies such as Microsoft Hyper-V, VMware, and KVM, running on platforms such as Cisco's UCS.