@@ -4,9 +4,9 @@ title: "Facilities of the Holland Computing Center"
This document details the equipment resident in the Holland Computing Center (HCC) as of June 2022.
HCC has two primary locations directly interconnected by a 100 Gbps primary link with a 10 Gbps backup. The 1800 sq. ft. HCC machine room at the Peter Kiewit Institute (PKI) in Omaha can provide up to 500 kVA of UPS- and genset-protected power and 160 tons of cooling. A 2200 sq. ft. second machine room in the Schorr Center at the University of Nebraska-Lincoln (UNL) can currently provide up to 100 tons of cooling and up to 400 kVA of power. Dell S4248FB-ON edge switches and Z9264F-ON core switches provide high WAN bandwidth and Software Defined Networking (SDN) capability at both locations. The Schorr and PKI machine rooms both have 100 Gbps paths to the University of Nebraska, Internet2, and ESnet, as well as a 100 Gbps geographically diverse backup path. HCC uses multiple data transfer nodes as well as a FIONA (Flash IO Network Appliance) to facilitate end-to-end performance for data-intensive workflows.
HCC's main resources at UNL include Red, a high-throughput cluster for high-energy physics, and hardware supporting the PATh, PRP, and OSG NSF projects. Red is the largest machine on the Lincoln campus, with 15,984 job slots interconnected by a mixture of 1, 10, 25, 40, and 100 Gbps Ethernet. Red serves over 11 PB of storage using the CEPH filesystem. Red primarily serves as a major site for storage and analysis in the international high-energy physics project known as CMS (Compact Muon Solenoid) and is integrated with the Open Science Grid (OSG).
Other resources at UNL include hardware supporting the PATh, PRP, and OSG projects as well as the off-site replica of the Attic archival storage system.
...
...
@@ -18,7 +18,7 @@ Crane debuted at 474 on the Top500 list with an HPL benchmark of 121.8 TeraFLOPS
Anvil is an OpenStack cloud environment consisting of 1,520 cores and 400 TB of CEPH storage, all connected by 10 Gbps networking. The Anvil cloud exists to address needs of NU researchers that cannot be met by traditional scheduler-based HPC environments, such as GUI applications, Windows-based software, test environments, and persistent services.
Attic and Silo form a near-line archive with 3 PB of usable storage. Attic is located at PKI in Omaha, while Silo acts as an online backup located in Lincoln. Both Attic and Silo are connected with 10 Gbps network connections.
In addition to the cluster-specific Lustre storage, a shared storage space known as Common, with a capacity of 1.9 PB, is available on all HCC resources.
...
...
@@ -60,11 +60,13 @@ These resources are detailed further below.