From 96e7d857cff2852f6b886345acc0e64b55463c59 Mon Sep 17 00:00:00 2001
From: Adam Caprez <acaprez2@unl.edu>
Date: Tue, 12 Oct 2021 20:41:28 -0500
Subject: [PATCH] Remove /home read-only references in current pages.

---
 content/FAQ/_index.md                            |  9 ++-------
 .../using_allinea_forge_via_reverse_connect.md   | 14 --------------
 .../applications/app_specific/running_theano.md  | 16 ----------------
 content/handling_data/data_storage/_index.md     |  3 +--
 .../data_storage/preventing_file_loss.md         |  6 ++----
 .../data_transfer/globus_connect/file_sharing.md |  5 ++---
 .../creating_an_interactive_job.md               |  4 ++--
 7 files changed, 9 insertions(+), 48 deletions(-)

diff --git a/content/FAQ/_index.md b/content/FAQ/_index.md
index 20c4db67..6f9f00a4 100644
--- a/content/FAQ/_index.md
+++ b/content/FAQ/_index.md
@@ -125,13 +125,8 @@ the command \`pwd\` into the terminal.
 (/home/group/user/)**:
 
 Move your files to your $WORK directory (/work/group/user) and resubmit
-your job.
-
-The worker nodes on our clusters have read-only access to the files in
-$HOME directories. This means that when a job is submitted from $HOME,
-the scheduler cannot write the output and error files in the directory
-and the job is killed. It appears the job does nothing because no output
-is produced.
+your job. The $HOME directory is not intended for job output, and your job
+may be attempting to write more data than $HOME allows.
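+
+For example, a job directory could be moved from $HOME to $WORK and the job
+resubmitted (a minimal sketch; `myjob` and `submit.sh` are placeholder names
+for your own job directory and submit script):
+
+{{% panel theme="info" header="Move a job directory to $WORK and resubmit" %}}
+{{< highlight bash >}}
+mv $HOME/myjob $WORK/myjob
+cd $WORK/myjob
+sbatch submit.sh
+{{< /highlight >}}
+{{% /panel %}}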
 
 **If you are running from inside your $WORK directory:**
 
diff --git a/content/applications/app_specific/allinea_profiling_and_debugging/using_allinea_forge_via_reverse_connect.md b/content/applications/app_specific/allinea_profiling_and_debugging/using_allinea_forge_via_reverse_connect.md
index 87bd3c09..cac80a90 100644
--- a/content/applications/app_specific/allinea_profiling_and_debugging/using_allinea_forge_via_reverse_connect.md
+++ b/content/applications/app_specific/allinea_profiling_and_debugging/using_allinea_forge_via_reverse_connect.md
@@ -44,20 +44,6 @@ Be sure to replace *demo02* with your HCC username.
 Click *OK* to close this dialog, and then *Close* on *Configure Remote
 Connections* to return back to the main Allinea window.
 
-Next, log in to Crane.  The Allinea software uses a `.allinea` directory
-in your home directory to store configuration information.  Since `/home`
-is read-only from the nodes in the cluster, the directory will be
-created in `/work` and symlink'd.  To do so, run the following commands:
-
-
-{{% panel theme="info" header="Create and symlink .allinea directory" %}}
-{{< highlight bash >}}
-rm -rf $HOME/.allinea
-mkdir -p $WORK/.allinea
-ln -s $WORK/.allinea $HOME/.allinea
-{{< /highlight >}}
-{{% /panel %}}
-
 ### Test the Reverse Connect feature
 
 To test the connection, choose *Crane* from the *Remote Launch* menu.
diff --git a/content/applications/app_specific/running_theano.md b/content/applications/app_specific/running_theano.md
index b493a42e..1a28bac4 100644
--- a/content/applications/app_specific/running_theano.md
+++ b/content/applications/app_specific/running_theano.md
@@ -7,22 +7,6 @@ Theano is available on HCC resources via the modules system. Both CPU and GPU
 versions are available on Crane.  Additionally, installs for both Python
 2.7 and 3.6 are provided.
 
-### Initial Setup
-
-Theano attempts to write to a `~/.theano` directory in some
-circumstances, which can cause errors as the `/home` filesystem is
-read-only on HCC machines.  As a workaround, create the directory on
-`/work` and make a symlink from `/home`:
-
-{{% panel theme="info" header="Create & symlink .theano directory" %}}
-{{< highlight bash >}}
-mkdir -p $WORK/.theano
-ln -s $WORK/.theano $HOME/.theano
-{{< /highlight >}}
-{{% /panel %}}
-
-This only needs to be done once on each HCC machine.
-
 ### Running the CPU version
 
 To use the CPU version, simply load the module and run your Python code.
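+
+For example, the CPU version could be used along these lines (a sketch; run
+`module avail theano` to see the exact module name and version available, and
+replace `my_script.py` with your own code):
+
+{{% panel theme="info" header="Run the Theano CPU version" %}}
+{{< highlight bash >}}
+module load theano
+python my_script.py
+{{< /highlight >}}
+{{% /panel %}}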
diff --git a/content/handling_data/data_storage/_index.md b/content/handling_data/data_storage/_index.md
index 4016b287..d14df5f1 100644
--- a/content/handling_data/data_storage/_index.md
+++ b/content/handling_data/data_storage/_index.md
@@ -24,8 +24,7 @@ that take up relatively small amounts of space.  For example:  source
 code, program binaries, configuration files, etc.  This space is
 quota-limited to **20GB per user**.  The home directories are backed up
 for the purposes of best-effort disaster recovery.  This space is not
-intended as an area for I/O to active jobs.  **/home** is mounted
-**read-only** on cluster worker nodes to enforce this policy.
+intended as an area for I/O to active jobs.
 
 ---
 ### Common Directory
diff --git a/content/handling_data/data_storage/preventing_file_loss.md b/content/handling_data/data_storage/preventing_file_loss.md
index 1ce628aa..648b690a 100644
--- a/content/handling_data/data_storage/preventing_file_loss.md
+++ b/content/handling_data/data_storage/preventing_file_loss.md
@@ -148,7 +148,7 @@ the cluster.
   
 Benefits:
 
--   No need to make manual backups. `\home` files are automatically backed
+-   No need to make manual backups. `/home` files are automatically backed
     up daily.
 -   Files in `/home` are not subject to the 6 month purge policy that
     exists on `/work`.
@@ -158,9 +158,7 @@ Limitations:
 
 -   Home storage is limited to 20GB per user. Larger file sets will
     need to be backed up using an alternate method.
--   Home is read-only on the cluster worker nodes so results cannot be
-    directly written or altered from within a submitted job.
-
+-   Home storage is low-performance and not suitable for active job output.
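+
+For example, a set of important results could be copied from `/work` to
+`/home` for safekeeping (a sketch; the paths are placeholders for your own
+directories):
+
+{{< highlight bash >}}
+cp -r $WORK/my_project/results $HOME/results_backup
+{{< /highlight >}}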
   
 If you would like more information or assistance in setting up any of
 these methods, contact us
diff --git a/content/handling_data/data_transfer/globus_connect/file_sharing.md b/content/handling_data/data_transfer/globus_connect/file_sharing.md
index 754a66fc..c070782f 100644
--- a/content/handling_data/data_transfer/globus_connect/file_sharing.md
+++ b/content/handling_data/data_transfer/globus_connect/file_sharing.md
@@ -12,9 +12,8 @@ sharing only what is necessary and granting access only to trusted
 users.*
 
 {{% notice info %}}
-Shared endpoints created in your `home` directory on HCC servers (with
-the exception of Attic) are *read-only*. You may create readable and
-writable shared endpoints in your `work` directory (or `/shared`).
+Because of size and performance limitations, HCC does not recommend
+creating shared endpoints in your `home` directory.
 {{% /notice %}}
 
 1.  Sign in to your Globus account, click on the 'File Manager' tab
diff --git a/content/submitting_jobs/creating_an_interactive_job.md b/content/submitting_jobs/creating_an_interactive_job.md
index 2d902535..fc17e1c0 100644
--- a/content/submitting_jobs/creating_an_interactive_job.md
+++ b/content/submitting_jobs/creating_an_interactive_job.md
@@ -5,8 +5,8 @@ weight=20
 +++
 
 {{% notice info %}}
-The `/home` directories are read-only on the worker nodes. You will need
-to compile or run your processing in `/work`.
+The `/home` directories are not intended for active job I/O. Output from
+your processing should be directed to either `/work` or `/common`.
 {{% /notice %}}
 
 Submitting an interactive job is done with the command `srun`.
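+
+For example, an interactive session could be started from a directory under
+`/work` so that any output is written outside `/home` (a sketch; the resource
+options are placeholders to adjust for your own job):
+
+{{< highlight bash >}}
+cd $WORK/mydir
+srun --pty --mem=4G --time=1:00:00 bash
+{{< /highlight >}}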
-- 
GitLab