Verified commit 96e7d857 authored by Adam Caprez

Remove /home read-only references in current pages.

parent 5df74a3a
@@ -125,13 +125,8 @@ the command `pwd` into the terminal.
 (/home/group/user/)**:
 Move your files to your $WORK directory (/work/group/user) and resubmit
-your job.
-
-The worker nodes on our clusters have read-only access to the files in
-$HOME directories. This means that when a job is submitted from $HOME,
-the scheduler cannot write the output and error files in the directory
-and the job is killed. It appears the job does nothing because no output
-is produced.
+your job. The $HOME folder is not meant for job output. You may be attempting
+to write too much data from the job.
 
 **If you are running from inside your $WORK directory:**
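For illustration, the fix described in this hunk might look like the following in practice; the `myjob` directory and `job.submit` script names are placeholders, not part of the commit:

{{< highlight bash >}}
# Copy the job's files out of $HOME into $WORK, then resubmit from there
cp -r $HOME/myjob $WORK/myjob
cd $WORK/myjob
sbatch job.submit
{{< /highlight >}}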
@@ -44,20 +44,6 @@ Be sure to replace *demo02* with your HCC username.
 Click *OK* to close this dialog, and then *Close* on *Configure Remote
 Connections* to return back to the main Allinea window.
-
-Next, log in to Crane.  The Allinea software uses a `.allinea` directory
-in your home directory to store configuration information.  Since `/home`
-is read-only from the nodes in the cluster, the directory will be
-created in `/work` and symlink'd.  To do so, run the following commands:
-
-{{% panel theme="info" header="Create and symlink .allinea directory" %}}
-{{< highlight bash >}}
-rm -rf $HOME/.allinea
-mkdir -p $WORK/.allinea
-ln -s $WORK/.allinea $HOME/.allinea
-{{< /highlight >}}
-{{% /panel %}}
 
 ### Test the Reverse Connect feature
 To test the connection, choose *Crane* from the *Remote Launch* menu.
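Since this commit drops the `.allinea` symlink workaround, users who created the symlink earlier may want to undo it. A minimal sketch, assuming the old symlink is still in place (this cleanup step is not part of the commit):

{{< highlight bash >}}
# Remove $HOME/.allinea only if it is the old symlink into $WORK
if [ -L "$HOME/.allinea" ]; then
    rm "$HOME/.allinea"
fi
{{< /highlight >}}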
@@ -7,22 +7,6 @@ Theano is available on HCC resources via the modules system. Both CPU and GPU
 versions are available on Crane.  Additionally, installs for both Python
 2.7 and 3.6 are provided.
-
-### Initial Setup
-
-Theano attempts to write to a `~/.theano` directory in some
-circumstances, which can cause errors as the `/home` filesystem is
-read-only on HCC machines.  As a workaround, create the directory on
-`/work` and make a symlink from `/home`:
-
-{{% panel theme="info" header="Create & symlink .theano directory" %}}
-{{< highlight bash >}}
-mkdir -p $WORK/.theano
-ln -s $WORK/.theano $HOME/.theano
-{{< /highlight >}}
-{{% /panel %}}
-
-This only needs to be done once on each HCC machine.
 
 ### Running the CPU version
 To use the CPU version, simply load the module and run your Python code.
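The surviving text tells users to load the module and run their Python code; a sketch of that workflow follows. The exact module name and version are assumptions, so list the installed ones first:

{{< highlight bash >}}
# List the Theano modules actually installed, then load one and run
module avail theano
module load theano/py36/1.0   # illustrative version string
python my_theano_script.py
{{< /highlight >}}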
@@ -24,8 +24,7 @@ that take up relatively small amounts of space. For example: source
 code, program binaries, configuration files, etc. This space is
 quota-limited to **20GB per user**. The home directories are backed up
 for the purposes of best-effort disaster recovery. This space is not
-intended as an area for I/O to active jobs. **/home** is mounted
-**read-only** on cluster worker nodes to enforce this policy.
+intended as an area for I/O to active jobs.
 
 ---
 ### Common Directory
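To check usage against the **20GB per user** quota mentioned above, the standard `du` utility is sufficient (any site-specific quota command would be an assumption, so only the generic tool is shown):

{{< highlight bash >}}
# Summarize the total size of the home directory
du -sh $HOME
{{< /highlight >}}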
@@ -148,7 +148,7 @@ the cluster.
 Benefits:
 
-- No need to make manual backups. `\home` files are automatically backed
+- No need to make manual backups. `/home` files are automatically backed
 up daily.
 - Files in `/home` are not subject to the 6 month purge policy that
 exists on `/work`.

@@ -158,9 +158,7 @@ Limitations:
 
 - Home storage is limited to 20GB per user. Larger file sets will
 need to be backed up using an alternate method.
-- Home is read-only on the cluster worker nodes so results cannot be
-directly written or altered from within a submitted job.
+- Home is low performance and not suitable for active job output.
 
 If you would like more information or assistance in setting up any of
 these methods, contact us
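For file sets over the 20GB home limit that "need to be backed up using an alternate method", one common approach is `rsync` to another storage location; the destination host and paths below are placeholders:

{{< highlight bash >}}
# Mirror a project directory to an external backup location
rsync -av --progress $WORK/myproject/ backup-host:/path/to/backup/myproject/
{{< /highlight >}}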
@@ -12,9 +12,8 @@ sharing only what is necessary and granting access only to trusted
 users.*
 
 {{% notice info %}}
-Shared endpoints created in your `home` directory on HCC servers (with
-the exception of Attic) are *read-only*. You may create readable and
-writable shared endpoints in your `work` directory (or `/shared`).
+Because of size and performance limitations, HCC does not recommend
+creating shares in your `home` directory.
 {{% /notice %}}
 
 1. Sign in to your Globus account, click on the 'File Manager' tab
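As a quick sanity check before sharing a `work` path, the Globus CLI (assumed to be installed; the endpoint UUID and path are placeholders) can list the directory you intend to share:

{{< highlight bash >}}
# Verify the path is visible through Globus before creating a share
globus ls "ENDPOINT_UUID:/work/group/user/shared_dir"
{{< /highlight >}}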
@@ -5,8 +5,8 @@ weight=20
 +++
 
 {{% notice info %}}
-The `/home` directories are read-only on the worker nodes. You will need
-to compile or run your processing in `/work`.
+The `/home` directories are not intended for active job I/O. Output
+from your processing should be directed to either `/work` or `/common`.
 {{% /notice %}}
 
 Submitting an interactive job is done with the command `srun`.
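A minimal interactive-job sketch to go with the `srun` sentence above; the memory and time values are illustrative:

{{< highlight bash >}}
# Start an interactive shell on a worker node, working out of /work
cd $WORK
srun --pty --mem=4G --time=1:00:00 /bin/bash
{{< /highlight >}}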