diff --git a/content/FAQ/_index.md b/content/FAQ/_index.md
index 20c4db67feb7b94a9b83c20a75801251e5c39217..6f9f00a4c8a8ad6e830c85e2a30dd758052a05e4 100644
--- a/content/FAQ/_index.md
+++ b/content/FAQ/_index.md
@@ -125,13 +125,8 @@ the command `pwd` into the terminal.
 (/home/group/user/)**:
 
 Move your files to your $WORK directory (/work/group/user) and resubmit
-your job.
-
-The worker nodes on our clusters have read-only access to the files in
-$HOME directories. This means that when a job is submitted from $HOME,
-the scheduler cannot write the output and error files in the directory
-and the job is killed. It appears the job does nothing because no output
-is produced.
+your job. The $HOME directory is not intended for job output, and your
+job may be attempting to write more data than the $HOME quota allows.
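+
+A minimal sketch of the fix (the directory name `myjob` and the submit
+script `submit.sh` are placeholders for your own):
+
+{{% panel theme="info" header="Move a job to $WORK and resubmit (sketch)" %}}
+{{< highlight bash >}}
+# Names below are placeholders; substitute your own directory and script.
+mv $HOME/myjob $WORK/myjob
+cd $WORK/myjob
+sbatch submit.sh
+{{< /highlight >}}
+{{% /panel %}}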
 
 **If you are running from inside your $WORK directory:**
 
diff --git a/content/applications/app_specific/allinea_profiling_and_debugging/using_allinea_forge_via_reverse_connect.md b/content/applications/app_specific/allinea_profiling_and_debugging/using_allinea_forge_via_reverse_connect.md
index 87bd3c09cc0c6cbc532a6a305942d1de6f389491..cac80a90b9a26ed054d41ec0c7a002531bec641a 100644
--- a/content/applications/app_specific/allinea_profiling_and_debugging/using_allinea_forge_via_reverse_connect.md
+++ b/content/applications/app_specific/allinea_profiling_and_debugging/using_allinea_forge_via_reverse_connect.md
@@ -44,20 +44,6 @@ Be sure to replace *demo02* with your HCC username.
 Click *OK* to close this dialog, and then *Close* on *Configure Remote
 Connections* to return back to the main Allinea window.
 
-Next, log in to Crane.  The Allinea software uses a `.allinea` directory
-in your home directory to store configuration information.  Since `/home`
-is read-only from the nodes in the cluster, the directory will be
-created in `/work` and symlink'd.  To do so, run the following commands:
-
-
-{{% panel theme="info" header="Create and symlink .allinea directory" %}}
-{{< highlight bash >}}
-rm -rf $HOME/.allinea
-mkdir -p $WORK/.allinea
-ln -s $WORK/.allinea $HOME/.allinea
-{{< /highlight >}}
-{{% /panel %}}
-
 ### Test the Reverse Connect feature
 
 To test the connection, choose *Crane* from the *Remote Launch* menu.
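+
+On the cluster side, a reverse-connect session is typically started by
+prefixing your normal launch command with `ddt --connect` (a sketch only;
+the module name and `./myapp` are placeholders for your environment):
+
+{{% panel theme="info" header="Start a reverse connect session (sketch)" %}}
+{{< highlight bash >}}
+# Module name and executable are placeholders; adjust to your setup.
+module load allinea
+ddt --connect srun --ntasks=4 ./myapp
+{{< /highlight >}}
+{{% /panel %}}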
diff --git a/content/applications/app_specific/running_theano.md b/content/applications/app_specific/running_theano.md
index b493a42ef57179c26db18cf3735247ce833fc575..1a28bac43fbe6c193c2c048200c141133462dd2e 100644
--- a/content/applications/app_specific/running_theano.md
+++ b/content/applications/app_specific/running_theano.md
@@ -7,22 +7,6 @@ Theano is available on HCC resources via the modules system. Both CPU and GPU
 versions are available on Crane.  Additionally, installs for both Python
 2.7 and 3.6 are provided.
 
-### Initial Setup
-
-Theano attempts to write to a `~/.theano` directory in some
-circumstances, which can cause errors as the `/home` filesystem is
-read-only on HCC machines.  As a workaround, create the directory on
-`/work` and make a symlink from `/home`:
-
-{{% panel theme="info" header="Create & symlink .theano directory" %}}
-{{< highlight bash >}}
-mkdir -p $WORK/.theano
-ln -s $WORK/.theano $HOME/.theano
-{{< /highlight >}}
-{{% /panel %}}
-
-This only needs to be done once on each HCC machine.
-
 ### Running the CPU version
 
 To use the CPU version, simply load the module and run your Python code.
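+
+A minimal sketch of that workflow (the module version shown is an
+assumption; check `module avail theano` for the versions installed):
+
+{{% panel theme="info" header="Run the CPU version (sketch)" %}}
+{{< highlight bash >}}
+# Module version and script name are placeholders.
+module load theano/py36
+python my_theano_script.py
+{{< /highlight >}}
+{{% /panel %}}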
diff --git a/content/handling_data/data_storage/_index.md b/content/handling_data/data_storage/_index.md
index 4016b287790c588cf5d3a908a353853f3d13a7fd..d14df5f109c077c2271b06bce602238b7a25d8ee 100644
--- a/content/handling_data/data_storage/_index.md
+++ b/content/handling_data/data_storage/_index.md
@@ -24,8 +24,7 @@ that take up relatively small amounts of space.  For example:  source
 code, program binaries, configuration files, etc.  This space is
 quota-limited to **20GB per user**.  The home directories are backed up
 for the purposes of best-effort disaster recovery.  This space is not
-intended as an area for I/O to active jobs.  **/home** is mounted
-**read-only** on cluster worker nodes to enforce this policy.
+intended as an area for I/O to active jobs.
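+
+One quick way to check how much of this quota your files occupy (a
+generic sketch using the standard `du` tool, not an HCC-specific command):
+
+{{% panel theme="info" header="Check home directory usage (sketch)" %}}
+{{< highlight bash >}}
+# Summarize the total size of your home directory.
+du -sh $HOME
+{{< /highlight >}}
+{{% /panel %}}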
 
 ---
 ### Common Directory
diff --git a/content/handling_data/data_storage/preventing_file_loss.md b/content/handling_data/data_storage/preventing_file_loss.md
index 1ce628aad84316e96cce4cfe7818e91e050902d7..648b690ae65978d194d9676dcb3f2e1f27c32f4f 100644
--- a/content/handling_data/data_storage/preventing_file_loss.md
+++ b/content/handling_data/data_storage/preventing_file_loss.md
@@ -148,7 +148,7 @@ the cluster.
   
 Benefits:
 
--   No need to make manual backups. `\home` files are automatically backed
+-   No need to make manual backups. `/home` files are automatically backed
     up daily.
 -   Files in `/home` are not subject to the 6 month purge policy that
     exists on `/work`.
@@ -158,9 +158,7 @@ Limitations:
 
 -   Home storage is limited to 20GB per user. Larger file sets will
     need to be backed up using an alternate method (see the rsync
     sketch after this list).
--   Home is read-only on the cluster worker nodes so results cannot be
-    directly written or altered from within a submitted job.
-
+-   Home storage is lower-performance and not suitable for writing
+    active job output.
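+
+As one alternate backup method for larger file sets, results can be
+copied off the cluster with `rsync` (a sketch; the hostname and the
+angle-bracket placeholders are illustrative, not HCC-verified values):
+
+{{% panel theme="info" header="Back up a results directory with rsync (sketch)" %}}
+{{< highlight bash >}}
+# Run from your local machine; replace the placeholders with real values.
+rsync -av <username>@crane.unl.edu:/work/<group>/<username>/results/ ~/backups/results/
+{{< /highlight >}}
+{{% /panel %}}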
   
 If you would like more information or assistance in setting up any of
 these methods, contact us
diff --git a/content/handling_data/data_transfer/globus_connect/file_sharing.md b/content/handling_data/data_transfer/globus_connect/file_sharing.md
index 754a66fc9214984905d97322a182d2412745f048..c070782f9eb6005866c05509113a4286935296d2 100644
--- a/content/handling_data/data_transfer/globus_connect/file_sharing.md
+++ b/content/handling_data/data_transfer/globus_connect/file_sharing.md
@@ -12,9 +12,8 @@ sharing only what is necessary and granting access only to trusted
 users.*
 
 {{% notice info %}}
-Shared endpoints created in your `home` directory on HCC servers (with
-the exception of Attic) are *read-only*. You may create readable and
-writable shared endpoints in your `work` directory (or `/shared`).
+Because of size and performance limitations, HCC does not recommend
+creating shared endpoints in your `home` directory.
 {{% /notice %}}
 
 1.  Sign in to your Globus account, click on the 'File Manager' tab
diff --git a/content/submitting_jobs/creating_an_interactive_job.md b/content/submitting_jobs/creating_an_interactive_job.md
index 2d9025354c48daf4fb7cabc7aef5b08cd6cab874..fc17e1c097bf702c33da8dc9ab7023d3e353c2ee 100644
--- a/content/submitting_jobs/creating_an_interactive_job.md
+++ b/content/submitting_jobs/creating_an_interactive_job.md
@@ -5,8 +5,8 @@ weight=20
 +++
 
 {{% notice info %}}
-The `/home` directories are read-only on the worker nodes. You will need
-to compile or run your processing in `/work`.
+The `/home` directories are not intended for active job I/O.
+Output from your processing should be directed to either `/work` or `/common`.
 {{% /notice %}}
 
 Submitting an interactive job is done with the command `srun`.
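+
+A minimal sketch of such a request (the resource values are illustrative
+placeholders, not recommendations):
+
+{{% panel theme="info" header="Example interactive job request (sketch)" %}}
+{{< highlight bash >}}
+# Work from $WORK so the job can write its output; values are placeholders.
+cd $WORK
+srun --ntasks=1 --mem-per-cpu=1024 --time=01:00:00 --pty $SHELL
+{{< /highlight >}}
+{{% /panel %}}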