raven
Fix WorkingDir of Driver.py
Pull Request Description
What issue does this change request address? (Use "#" before the issue to link it, e.g., #42.)
#1033
What are the significant changes in functionality due to this change request?
No significant changes. RAVEN will now adjust its working directory to the directory containing the input file.
For example, before this change the following would break RAVEN:
```
cd raven
./raven_framework tests/framework/user_guide/ravenTutorial/singleRun.xml
```
Now it doesn't.
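Roughly, the change amounts to resolving the input file to an absolute path and switching the working directory to that file's directory before anything reads relative paths. The following is a minimal sketch of the idea, not the exact Driver.py code; the function name `adjustWorkingDir` is made up for illustration:

```python
import os
import sys

def adjustWorkingDir(inputFileArg):
  """Resolve the input file to an absolute path and chdir to its directory,
  so relative paths in the XML behave the same regardless of where the
  raven_framework command was launched from."""
  absInput = os.path.abspath(inputFileArg)
  os.chdir(os.path.dirname(absInput))
  return absInput

if __name__ == '__main__':
  inputFile = adjustWorkingDir(sys.argv[1])
  print('Running %s from %s' % (inputFile, os.getcwd()))
```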
For Change Control Board: Change Request Review
The following review must be completed by an authorized member of the Change Control Board.
- [ ] 1. Review all computer code.
- [ ] 2. If any changes occur to the input syntax, there must be an accompanying change to the user manual and xsd schema. If the input syntax change deprecates existing input files, a conversion script needs to be added (see Conversion Scripts).
- [ ] 3. Make sure the Python code and commenting standards are respected (camelBack, etc.). See the wiki for details.
- [ ] 4. Automated Tests should pass, including run_tests, pylint, manual building and xsd tests. If there are changes to Simulation.py or JobHandler.py the qsub tests must pass.
- [ ] 5. If significant functionality is added, there must be tests added to check this. Tests should cover all possible options. Multiple short tests are preferred over one large test. If new development on the internal JobHandler parallel system is performed, a cluster test must be added setting, in the `<RunInfo>` XML block, the node `<internalParallel>` to True.
- [ ] 6. If the change modifies or adds a requirement or a requirement-based test case, the Change Control Board's Chair or designee also needs to approve the change. The requirements and the requirements test shall be in sync.
- [ ] 7. The merge request must reference an issue. If the issue is closed, the issue close checklist shall be done.
- [ ] 8. If an analytic test is changed/added, is the analytic documentation updated/added?
- [ ] 9. If any test used as a basis for documentation examples (currently found in `raven/tests/framework/user_guide` and `raven/docs/workshop`) has been changed, the associated documentation must be reviewed to assure the text matches the example.
@dylanjm Do you have a change to resolve this issue?
@wangcj05 I'm having trouble finding where we read files in the code that still has access to the RunInfo node.
@dylanjm the path in Code.py is assigned in the initialize method (the 'subSubDirectory'); is that the place you are looking for?
@wangcj05 This could be the place I need to add it. My only question: are there files that won't go into Models for which we need to check different directories? It seems this code only checks the files that are passed as inputs to a Models block.
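For context, the kind of per-model path handling being discussed might look roughly like the sketch below. This is purely illustrative; `resolveModelInput`, `workingDir`, and `subDirectory` are assumed names, not RAVEN's actual Code.py API:

```python
import os

def resolveModelInput(workingDir, subDirectory, fileName):
  """Join a model input file name onto the step working directory (plus an
  optional per-code subdirectory) and return an absolute path. Illustrative
  sketch only; RAVEN's real initialize-time logic lives in Code.py."""
  return os.path.abspath(os.path.join(workingDir, subDirectory, fileName))

# e.g. resolveModelInput('myWorkingDir', 'sub1', 'inputDeck.i')
```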
@dylanjm do you by chance have time to fix this issue?
@wangcj05 Ahh, yes. This PR had skipped my mind. I'll tackle it right now.
@wangcj05 A few things to note:
- This change fixes the error you were running into with `raven/tests/framework/test_iostep_load.xml`. You can run this test from `raven/tests` or `raven/tests/framework` and it will pass.
- The GA test was not passing because the External Model path was not relative to the working directory specified in the input file. It's also weird to name the working directory the same as another directory you want to put the external model in. We should encourage users to place the file in a premade working directory next to the input file, or have them specify the relative path from the input file.
- There is no central place in RAVEN (that I can find) that would allow us to check for the existence of files in multiple locations. We also allow users to specify files that don't even exist yet.
This fix allows users to run RAVEN from whichever directory they'd like while keeping the expected output the same, as long as users specify their files relative to the working directory.
I may be missing something, so I'm open to any feedback you have on the matter.
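On the "no central place" point above, a generic multi-location lookup could look roughly like the sketch below. This is not part of this PR or of RAVEN; the helper name and search order are assumptions, and files that do not exist yet simply fall through to the first candidate:

```python
import os

def findExistingFile(fileName, searchDirs):
  """Return the first existing absolute path for fileName among searchDirs,
  or the path under the first directory if none exist yet (so outputs that
  have not been created can still be referenced). Illustrative only."""
  candidates = [os.path.abspath(os.path.join(d, fileName)) for d in searchDirs]
  for path in candidates:
    if os.path.exists(path):
      return path
  return candidates[0]

# e.g. findExistingFile('singleRun.xml',
#                       [os.getcwd(), 'tests/framework/user_guide/ravenTutorial'])
```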
Checklist is good, and tests are green. PR can be merged. @dylanjm Thanks for your contribution.