This document details the equipment resident in the Holland Computing Center (HCC).
HCC has two primary locations directly interconnected by a 100 Gbps primary link with a 10 Gbps backup. The 1800 sq. ft. HCC machine room at the Peter Kiewit Institute (PKI) in Omaha can provide up to 500 kVA of UPS- and genset-protected power and 160 tons of cooling. A 2200 sq. ft. second machine room in the Schorr Center at the University of Nebraska-Lincoln (UNL) can currently provide up to 100 tons of cooling with up to 400 kVA of power. Dell S4248FB-ON edge switches and Z9264F-ON core switches provide high WAN bandwidth and Software Defined Networking (SDN) capability at both locations. The Schorr and PKI machine rooms both have 100 Gbps paths to the University of Nebraska, Internet2, and ESnet, as well as a 100 Gbps geographically diverse backup path. HCC uses multiple data transfer nodes as well as a FIONA (Flash IO Network Appliance) to facilitate end-to-end performance for data-intensive workflows.
HCC's main resources at UNL include Red, a high-throughput cluster for high energy physics, and hardware supporting the Partnership to Advance Throughput Computing (PATh), National Research Platform (NRP), and OSG NSF projects. Red is the largest machine on the Lincoln campus, with 15,984 job slots interconnected by a mixture of 1, 10, 25, 40, and 100 Gbps Ethernet. Red serves over 11 PB of storage using the Ceph filesystem. Red primarily serves as a major site for storage and analysis in the international high energy physics project known as CMS (Compact Muon Solenoid) and is integrated with the Open Science Grid (OSG).
Other resources at UNL include hardware supporting the PATh, NRP, and OSG projects, as well as the off-site replica of the Attic archival storage system.
HCC's resources at PKI (Peter Kiewit Institute) in Omaha include the Swan, Crane, and Anvil clusters along with the Attic and Common storage services.