Holland Computing Center
HCC docs
Commit f19d977a, authored Jun 14, 2019 by Carrie A Brown
Added memory to GPU listing
parent a39c8eb2
Changes: 1

content/guides/submitting_jobs/submitting_cuda_or_openacc_jobs.md
Crane has four types of GPUs available in the **gpu** partition. The type of GPU is configured as a SLURM feature, so you can specify a type of GPU in your job resource requirements if necessary.
| Description          | SLURM Feature | Available Hardware                                                       |
| -------------------- | ------------- | ------------------------------------------------------------------------ |
| Tesla K20, non-IB    | gpu_k20       | 3 nodes - 2 GPUs with 4 GB mem per node                                  |
| Tesla K20, with IB   | gpu_k20       | 3 nodes - 3 GPUs with 4 GB mem per node                                  |
| Tesla K40, with IB   | gpu_k40       | 5 nodes - 4 K40M GPUs with 11 GB mem per node <br> 1 node - 2 K40C GPUs  |
| Tesla P100, with OPA | gpu_p100      | 2 nodes - 2 GPUs with 12 GB mem per node                                 |
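Requesting one of the GPU types above is done through standard SLURM directives; the following is a minimal sketch of a batch script, assuming the **gpu** partition and the `gpu_k40` feature from the table (the job name, time limit, module name, and program path are illustrative, not from the original docs):

```shell
#!/bin/bash
#SBATCH --partition=gpu          # run in the gpu partition on Crane
#SBATCH --gres=gpu:2             # request 2 GPUs on the node
#SBATCH --constraint=gpu_k40     # restrict to Tesla K40 nodes via the SLURM feature
#SBATCH --time=01:00:00          # illustrative 1-hour time limit
#SBATCH --job-name=cuda-example  # illustrative job name

module load cuda                 # load the CUDA toolkit (exact module name may vary)
./my_cuda_program                # placeholder for your compiled CUDA/OpenACC binary
```

Omitting the `--constraint` line leaves the job free to start on any GPU type in the partition.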
To run your job on the next available GPU regardless of type, add the ...