- For the above three things to work, the remote site name will also need to be passed as an argument to `chip-gen`, so that remote input files and executables are added to the DAX with their location correctly specified.
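
The reason the site name matters: every remote file or executable entry in a Pegasus DAX carries a PFN whose `site` attribute tells the planner where the physical copy lives. A toy illustration of the idea, using plain-Python stand-ins for Pegasus's `PFN(url, site)` call (function and field names here are hypothetical, not the actual chipathlon code):

```python
def make_pfn(url, site):
    """Stand-in for Pegasus's PFN(url, site): pair a physical file URL
    with the name of the site where that copy lives."""
    return {"url": url, "site": site}

def register_remote_executable(name, path, remote_site):
    """Build a DAX-style executable entry for a file on a remote site.

    Without the remote site name, the entry would default to the local
    site and the planner would look for the file in the wrong place.
    """
    return {"name": name, "pfns": [make_pfn("file://" + path, remote_site)]}
```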
- Default resource requirements are way off. (Avi has mostly fixed this)
- The BOSCO installer for a remote resource doesn't include the helper script that translates Pegasus resource specifications into SLURM job attributes (`slurm_local_submit_attributes.sh`). Not really a chipathlon problem, but it needs to be fixed. (Adam will fix the BOSCO package)
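
For reference, a minimal sketch of what the helper might look like. The glite layer sets shell variables from the Pegasus resource profiles before invoking the script and appends its stdout to the generated SLURM submit file; the variable names used here (`NODES`, `PROCS`, `WALLTIME`, `PER_PROCESS_MEMORY`) follow the Pegasus documentation's convention and are assumptions. The logic is wrapped in a function for readability; the real helper just echoes directly.

```shell
#!/bin/bash
# Sketch of slurm_local_submit_attributes.sh: translate Pegasus resource
# profiles (passed in as shell variables) into #SBATCH directives, which
# the glite layer appends to the SLURM submit file it generates.
emit_slurm_attributes() {
    [ -n "$NODES" ] && echo "#SBATCH --nodes=${NODES}"
    [ -n "$PROCS" ] && echo "#SBATCH --ntasks-per-node=${PROCS}"
    [ -n "$WALLTIME" ] && echo "#SBATCH --time=${WALLTIME}"
    [ -n "$PER_PROCESS_MEMORY" ] && echo "#SBATCH --mem-per-cpu=${PER_PROCESS_MEMORY}"
    return 0
}
emit_slurm_attributes
```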
- The code that adds everything in the `jobs/scripts` directory as executables also picks up the `.pyc` byte-compiled files. This doesn't break anything, but it should be cleaned up.
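
A minimal sketch of the fix, assuming the collection code walks the directory with `os.listdir` (the function name and structure here are illustrative, not the actual chipathlon code):

```python
import os

# Byte-compiled artifacts that should never be registered as executables.
SKIP_EXTENSIONS = {".pyc", ".pyo"}

def list_executable_scripts(script_dir):
    """Return script file names in script_dir, skipping byte-compiled files."""
    scripts = []
    for name in sorted(os.listdir(script_dir)):
        _, ext = os.path.splitext(name)
        if ext in SKIP_EXTENSIONS or name == "__pycache__":
            continue
        scripts.append(name)
    return scripts
```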
- The Picard version used (1.139) is quite old. It would be nice to be able to use the current 2.9.0 release. (Natasha will test this)
- The two SPP scripts (`run_spp.R` and `run_spp_nodups.R`) need to be packaged in Conda in a sane way. (Adam will do this)
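
One option is a small `noarch` Conda recipe that just installs the two scripts onto `$PATH`. A sketch of the `meta.yaml` (the package name, version, source path, and the `r-spp` run dependency are placeholders/assumptions, not a working recipe):

```yaml
package:
  name: spp-scripts     # placeholder name
  version: "1.0"        # placeholder version

source:
  path: ../jobs/scripts # wherever run_spp.R / run_spp_nodups.R live

build:
  number: 0
  noarch: generic
  script:
    - mkdir -p $PREFIX/bin
    - cp run_spp.R run_spp_nodups.R $PREFIX/bin/
    - chmod +x $PREFIX/bin/run_spp.R $PREFIX/bin/run_spp_nodups.R

requirements:
  run:
    - r-spp             # the SPP R package the scripts drive
```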
- Figure out how to distribute the MongoDB data in a sane way. Docker is currently the leading candidate. The experiments and samples collections together are a little over 1GB, which isn't terribly large. We could create a container with Mongo installed and the DB pre-populated, and then include scripts to do the update from ENCODE.
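
A sketch of the container side, using the official `mongo` image's `/docker-entrypoint-initdb.d` hook, which runs scripts once when the container first starts with an empty data directory (the image tag, dump path, and script name are assumptions):

```dockerfile
FROM mongo:3.4

# Bake the ~1GB mongodump of the experiments/samples collections into the image.
COPY dump/ /dump/

# Scripts in /docker-entrypoint-initdb.d run once, on first start with an
# empty data directory; restore.sh would just run: mongorestore /dump
COPY restore.sh /docker-entrypoint-initdb.d/restore.sh
```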