Explore Temple’s High Performance Computing team and resources.
COMPUTE
Interactive High-Performance Computing Servers
MACHINE LEARNING
Dedicated GPU servers for machine learning
The HPC team operates two GPU servers dedicated to intensive GPU computing. These servers run optimized software stacks for neural networks and deep learning inside Singularity containers. Users can work interactively with containers for popular software packages such as TensorFlow, Caffe2, PyTorch, and many more.
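For a concrete picture of that workflow, here is a minimal sketch in Python of launching a GPU check inside a containerized PyTorch stack; the image path is a hypothetical placeholder, and the actual container names and locations on these servers will differ:

    import subprocess

    # Hypothetical image path; consult the HPC team for the actual container library.
    IMAGE = "/containers/pytorch.sif"

    # 'singularity exec --nv' runs a command inside the container image;
    # the --nv flag exposes the host's NVIDIA GPUs and driver libraries.
    subprocess.run(
        ["singularity", "exec", "--nv", IMAGE,
         "python3", "-c", "import torch; print(torch.cuda.is_available())"],
        check=True,
    )

The same "singularity exec" invocation works for any of the other container images (for example, TensorFlow or Caffe2); only the image path and the command run inside the container change.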
OWL'S NEST
High-Performance Computing cluster
Owl’s Nest is Temple’s latest Linux cluster for high-performance computing. It was procured and assembled in 2017.
It features 180 dual-socket compute nodes with 28 cores and 128GB of RAM each. Workloads with larger memory requirements benefit from six 512GB, six 1.5TB, and two 3TB RAM machines; each 512GB box also hosts two NVIDIA P100 GPUs. An EDR InfiniBand (100Gb/s) fabric serves as the interconnect.
In November 2018, the cluster was extended with 48 additional dual-socket compute nodes with 16 cores and 96GB of RAM each.
In total, Owl’s Nest currently hosts 6,464 CPU cores, providing about 57 million service units (CPU core hours) per year.
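A quick back-of-the-envelope check on that figure: 6,464 cores × 24 hours × 365 days ≈ 56.6 million core-hours per year, which rounds to the quoted 57 million service units.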
All of this is backed by a 1.5PB parallel storage system that hosts all user data and is shared across the entire cluster. An additional 0.5PB of storage is available for large public (read-only) data sets.
The High-Performance Computing (HPC) Team consists of two full-time staff members and research faculty at the College of Science and Technology. We operate more than 400 servers providing shared HPC resources on campus.
Michael L. Klein
Dean and Laura H. Carnell Professor of Science
Biography
My research interests focus on the use of theoretical and computational methods to study the properties of molecular solids and liquids and the self-assembly of both natural and synthetic macromolecules. In collaboration with brilliant students and postdocs, I have developed computational methodologies and carried out computer simulations to understand the phase behavior and properties of systems ranging from small molecules, such as water, to larger molecules such as surfactants and lipid bilayers, as well as natural and synthetic macromolecules. Other applications have been aimed at understanding the sensing and gating mechanisms of Nature’s nano-scale machines, such as membrane-bound ion channels, as well as designing molecular therapeutics to combat both viral and bacterial infections.
Axel Kohlmeyer
Associate Dean
Biography
Axel Kohlmeyer is Associate Director of the Institute for Computational Molecular Science, and the Leader of the High-Performance Computing (HPC) Team at Temple University in Philadelphia, USA.
He is also an External Associate of the International Centre for Theoretical Physics in Trieste, Italy, where he collaborates with ICTP HPC staff on HPC-related activities and teaches in the International HPC Master program hosted by ICTP and SISSA in Trieste.
His main research interests are in software development and engineering for scientific software in general, and for the simulation and analysis of molecular dynamics on highly parallel modern computer architectures in particular. He is also interested in improving HPC education for scientists and engineers. He is a core developer and co-maintainer of the LAMMPS molecular dynamics software package and has contributed to several other popular scientific software packages, including VMD, Quantum ESPRESSO, HOOMD-blue, CPMD, and CP2K.
He previously held positions at the International Centre for Theoretical Physics in Trieste, Italy; the Center for Molecular Modeling at the University of Pennsylvania in Philadelphia, USA; the Center for Theoretical Chemistry at the Ruhr-University Bochum, Germany; and the Department of Theoretical Chemistry at the University of Ulm, Germany.
He obtained his PhD (Dr. rer. nat.) and Dipl.-Chem. degrees in Theoretical Chemistry at the University of Ulm under the supervision of Professor Wolfgang Witschel and Dr. Eckhard Spohr.
X-ray diffraction and computation yield the structure of alkanethiols on gold(111), A. Cossaro, R. Mazzarello, R. Rousseau, L. Casalis, A. Verdini, A. Kohlmeyer, L. Floreano, S. Scandolo, A. Morgante, M. L. Klein, G. Scoles, Science 321, 943-946 (2008).
LAMMPS – a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales, Aidan P. Thompson, H. Metin Aktulga, Richard Berger, Dan S. Bolintineanu, W. Michael Brown, Paul S. Crozier, Pieter J. in ’t Veld, Axel Kohlmeyer, Stan G. Moore, Trung Dac Nguyen, Ray Shan, Mark J. Stevens, Julien Tranchida, Christian Trott, Steven J. Plimpton, Computer Physics Communications 271 (2022).
Fast Analysis of Molecular Dynamics Trajectories with Graphics Processing Units: Radial Distribution Function Histogramming, B. G. Levine, J. E. Stone, A. Kohlmeyer, J. Comput. Phys. 230(9), 3556-3569 (2011).
TopoGromacs: Automated Topology Conversion from CHARMM to GROMACS within VMD, Josh V. Vermaas, David J. Hardy, John E. Stone, Emad Tajkhorshid, and Axel Kohlmeyer, Journal of Chemical Information and Modeling (2016).
Vincenzo Carnevale
Associate Professor
PhD: 2007, SISSA – Scuola Internazionale Superiore di Studi Avanzati, Trieste, Italy
Affiliation: Department of Biology, Temple University
Phone: 215-204-4214
Email: [email protected]
Office: SERC 710
Bio and Research description:
Vincenzo Carnevale received B.Sc. and M.Sc. degrees in Physics from the University of Pisa and a PhD from SISSA – Scuola Internazionale Superiore di Studi Avanzati in Trieste, Italy. He currently holds the position of Associate Professor in the Department of Biology.
The Carnevale research group uses statistical physics and machine learning approaches to investigate sequence-structure-function relations in proteins. A central theme of the group’s research is how interactions give rise to collective phenomena and complex emergent behaviors. At the level of genes, the group is interested in epistasis – the complex entanglement phenomenon that causes amino acids to evolve in a concerted fashion – and how this shapes molecular evolution. At the cellular level, the group investigates how intermolecular interactions drive biomolecules toward self-organization and pattern formation. A long-term goal of the group is understanding the molecular underpinnings of electrical signaling in excitable cells. Toward these goals, the group applies and actively develops an extensive arsenal of theoretical and computational approaches including statistical (mean-)field theories, Monte Carlo and molecular dynamics simulations, statistical inference of generative models, and deep learning.