# chipathlon issues
https://git.unl.edu/hcc/chipathlon/-/issues

## #36 chip-meta-download should support resuming
https://git.unl.edu/hcc/chipathlon/-/issues/36 (Adam Caprez, updated 2017-07-06)

Downloading all ~14k individual experiment JSON files can take multiple hours. Right now, if the download of a single file fails, the entire download has to be started over from the beginning. To support resuming via a command-line option, add a check to see whether each experiment JSON file exists and has non-zero size. If so, skip fetching it and go to the next in the list.

## #35 Workflow Job Parsing Update
https://git.unl.edu/hcc/chipathlon/-/issues/35 (aknecht2, updated 2018-09-13)

Somehow, we never hit the case where we needed to optionally pass entire arguments to scripts. Since all arguments are always included, there is currently no way to handle #34. Some small updates to the workflow job parsing should be able to handle this, but they will require changes in several spots.

Change the input structure to be a dictionary instead of a list, so inputs are passed by name. If an argument has a value of None, skip adding it entirely. This will require updating the workflow_module yaml files to similarly be dictionaries instead of lists, as well as the internals of the workflow_module class to correctly pass information to the workflow_job classes.
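The resume check proposed in #36 could be a minimal existence/size test before each fetch. This is a sketch only; the `fetch_experiment` callable and the output-directory layout are illustrative assumptions, not the actual chip-meta-download code:

```python
import os

def needs_download(path):
    """Skip files that already exist with non-zero size (resume support)."""
    return not (os.path.isfile(path) and os.path.getsize(path) > 0)

def download_all(accessions, out_dir, fetch_experiment, resume=False):
    """Fetch each experiment's JSON, optionally skipping completed files."""
    for acc in accessions:
        path = os.path.join(out_dir, "%s.json" % acc)
        if resume and not needs_download(path):
            continue  # already fetched; move on to the next experiment
        fetch_experiment(acc, path)
```

A zero-size file is treated as a failed download and re-fetched, which matches the check described in the issue.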
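For #35, building a job's argument list from a dictionary of named inputs, skipping any whose value is None, could look like the following sketch. The flag-naming convention and the `build_args` helper are illustrative assumptions, not the real workflow_job internals:

```python
def build_args(inputs):
    """Build a command-line argument list from named inputs.

    Inputs whose value is None are omitted entirely, so optional
    arguments can simply be left out of the final command.
    """
    args = []
    for name, value in inputs.items():
        if value is None:
            continue  # skip optional arguments that were not provided
        args.extend(["--%s" % name, str(value)])
    return args
```

Passing inputs by name (rather than by list position) is what makes the None-skip possible, since positional lists cannot drop entries without shifting everything after them.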
## #34 MongoDB auth should be optional
https://git.unl.edu/hcc/chipathlon/-/issues/34 (Adam Caprez, updated 2018-09-13)

The various `chip-*` helper scripts and the `db_save_result` / `download_from_gridfs` modules/scripts all require a username/password as required arguments. Mongo itself requires no authentication by default, so creds should be optional params.

## #33 .pyc files added as executables to DAX
https://git.unl.edu/hcc/chipathlon/-/issues/33 (Adam Caprez, updated 2018-09-13)

The *.pyc files under `jobs/scripts` are getting added to the DAX as executables.

## #32 Setuptools fixes
https://git.unl.edu/hcc/chipathlon/-/issues/32 (Adam Caprez, updated 2018-09-13)

## #31 Remove module load from wrappers
https://git.unl.edu/hcc/chipathlon/-/issues/31 (Adam Caprez, updated 2018-09-13)

Remove the HCC-specific module load statements from the wrapper scripts.

## #30 Update Default Params
https://git.unl.edu/hcc/chipathlon/-/issues/30 (aknecht2, updated 2018-09-13)

Default params are not great -- most of them are just set to 2000 / 2000 memory & walltime. Make them sane.
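For #34, one way to make credentials optional is to include auth in the connection URI only when both pieces were supplied. This is a sketch, not the project's actual scripts; the parameter names are assumptions:

```python
from urllib.parse import quote_plus

def mongo_uri(host, port, username=None, password=None):
    """Build a MongoDB connection URI.

    Auth is included only when both a username and a password were
    supplied, so unauthenticated local Mongo instances work with no
    extra flags.
    """
    if username and password:
        # credentials must be URL-escaped per the MongoDB URI format
        return "mongodb://%s:%s@%s:%d" % (
            quote_plus(username), quote_plus(password), host, port)
    return "mongodb://%s:%d" % (host, port)
```

The resulting URI can be handed to `pymongo.MongoClient`, which accepts a connection string as its first argument.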
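For #33, whatever routine collects scripts from `jobs/scripts` could filter out bytecode files before registering executables in the DAX. The `collect_executables` name is a hypothetical stand-in for that routine:

```python
import os

def collect_executables(script_dir):
    """Return script paths to register in the DAX, skipping *.pyc bytecode."""
    return sorted(
        os.path.join(script_dir, name)
        for name in os.listdir(script_dir)
        if not name.endswith(".pyc")
    )
```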
## #29 Hidden Domains
https://git.unl.edu/hcc/chipathlon/-/issues/29 (aknecht2, updated 2018-09-13) [Peak Calling / Idr]

* [ ] Test this peak calling to verify how it works (including available peak options)
* [ ] Implement into the peak_call module to generate args correctly
* [ ] Add required post processing to get into a sorted/expected bed format.
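The post-processing step in the checklist above ("sorted/expected bed format") amounts to sorting intervals by chromosome and numeric start/end. A minimal sketch, assuming tab-delimited BED-like lines with chrom, start, and end in the first three columns:

```python
def sort_bed_lines(lines):
    """Sort BED-like records by chromosome name, then numeric start, then end.

    Numeric comparison on columns 2 and 3 avoids the classic pitfall of
    lexicographic sorting, where "100" would sort before "20".
    """
    records = [line.rstrip("\n").split("\t") for line in lines if line.strip()]
    records.sort(key=lambda r: (r[0], int(r[1]), int(r[2])))
    return ["\t".join(r) for r in records]
```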
## #28 Documentation In-Depth
https://git.unl.edu/hcc/chipathlon/-/issues/28 (aknecht2, updated 2018-09-13) [Documentation]

* [ ] Examples
* [ ] Yaml markup explanations

## #27 Sphinx Building
https://git.unl.edu/hcc/chipathlon/-/issues/27 (aknecht2, updated 2018-09-13) [Documentation]

Get sphinx building working!
## #26 Zerone
https://git.unl.edu/hcc/chipathlon/-/issues/26 (aknecht2, updated 2018-09-13) [Peak Calling / Idr] (assigned: Natasha Pavlovikj)

* [ ] Test this peak calling to verify how it works (including available peak options)
* [ ] Implement into the peak_call module to generate args correctly
* [ ] Add required post processing to get into a sorted/expected bed format.
## #25 PePR
https://git.unl.edu/hcc/chipathlon/-/issues/25 (aknecht2, updated 2018-09-13) [Peak Calling / Idr]

* [ ] Test this peak calling to verify how it works (including available peak options)
* [ ] Implement into the peak_call module to generate args correctly
* [ ] Add required post processing to get into a sorted/expected bed format.
## #24 MUSIC peak caller
https://git.unl.edu/hcc/chipathlon/-/issues/24 (aknecht2, updated 2018-09-13) [Peak Calling / Idr]

* [x] Test this peak calling to verify how it works (including narrow / medium / broad peak options)
* [x] Implement into the peak_call module to generate args correctly
* [x] Add required post processing to get into a sorted/expected bed format.

## #23 Control output file transfer
https://git.unl.edu/hcc/chipathlon/-/issues/23 (aknecht2, updated 2018-09-13) [Peak Calling / Idr]

Currently transferring all output files. We only want to transfer the sorted peak calling results / idr.

## #22 Gem output files bug
https://git.unl.edu/hcc/chipathlon/-/issues/22 (aknecht2, updated 2018-09-13) [Peak Calling / Idr]

Gem creates its own unique subdirectories, so output files are not getting transferred / sorted correctly.

## #21 Parse ccat / peakranger output into bed files
https://git.unl.edu/hcc/chipathlon/-/issues/21 (aknecht2, updated 2018-09-13) [Peak Calling / Idr]

Currently ccat / peakranger output is not parsed into bed files until after the workflow is run. Add the required post-processing scripts to the workflow.

## #20 SPP Wrapper
https://git.unl.edu/hcc/chipathlon/-/issues/20 (aknecht2, updated 2018-09-13) [Peak Calling / Idr]

Currently the spp wrapper is trying to gunzip and rezip everything with the .narrowPeak.gz extension. This is causing data corruption for several output files. Come up with a workaround.
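For the Gem subdirectory problem in #22, one possible workaround is to pull output files up out of the per-run subdirectories before the transfer/sort step runs. This is a sketch under assumed layout; it is not based on Gem's documented behavior:

```python
import os
import shutil

def flatten_output_dir(root):
    """Move files out of nested subdirectories into `root`.

    Downstream transfer/sort steps then find outputs by filename at the
    top level, regardless of what subdirectory the tool created.
    """
    for dirpath, _dirnames, filenames in os.walk(root):
        if dirpath == root:
            continue
        for name in filenames:
            shutil.move(os.path.join(dirpath, name),
                        os.path.join(root, name))
```

Note this assumes filenames are unique across subdirectories; colliding names would need a prefixing scheme instead.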
## #19 Update database saving
https://git.unl.edu/hcc/chipathlon/-/issues/19 (aknecht2, updated 2018-09-13) [Peak Calling / Idr]

Currently there is special handling for bed / peak files; really we just want to drop them into gfs the same way we do with bam files. Additionally, we probably want to save multiple results per job -- enable this type of saving.

## #17 IDR
https://git.unl.edu/hcc/chipathlon/-/issues/17 (aknecht2, updated 2018-09-13) [Peak Calling / Idr]

Implement IDR for pairs of sorted result files. IDR will have to be a separate workflow_module to fit logically.

## #16 Config parsing & validation
https://git.unl.edu/hcc/chipathlon/-/issues/16 (aknecht2, updated 2018-09-13) [Input Parsing]

Currently there is no validation being done on config files, despite there being an expected format.
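For the "multiple results per job" part of #19, the save step could build one GridFS-style document per output file rather than one per job; each document's contents would then be stored with `gridfs.GridFS(db).put(...)` the same way bam files are. The metadata shape here is an assumption, not the project's actual schema:

```python
def build_result_docs(job_name, result_files):
    """Build one document per output file, so a single job can save
    multiple results (bed, peak, bam, ...) instead of just one."""
    return [
        {
            "filename": path,
            "metadata": {
                "job": job_name,
                "file_type": path.rsplit(".", 1)[-1],  # crude extension tag
            },
        }
        for path in result_files
    ]
```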
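For #16, even a small required-key/type check would catch malformed configs before a workflow is planned. A minimal sketch; the keys in `REQUIRED` are illustrative assumptions, not the project's actual config schema:

```python
def validate_config(config, required):
    """Check that every required key exists and has the expected type.

    Returns a list of human-readable errors; an empty list means the
    config passed validation.
    """
    errors = []
    for key, expected_type in required.items():
        if key not in config:
            errors.append("missing required key: %s" % key)
        elif not isinstance(config[key], expected_type):
            errors.append("key %s should be %s, got %s" % (
                key, expected_type.__name__, type(config[key]).__name__))
    return errors

# Hypothetical schema for illustration only.
REQUIRED = {"genome": str, "output_dir": str, "memory": int}
```

Returning a full error list (rather than raising on the first problem) lets the user fix everything in one pass.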