The aim of this PR is to allow multiple model/configuration analyses with the use of a config/yaml file; follows from #214, #215 and b4fd564. However, there are several pitfalls when using MPI:

- When using `mpiexec`, the `injection.log` for the first configuration is populated by all the logs, and no log files are created for the subsequent analyses.
- With `mpiexec`, each individual configuration runs in parallel, but the multiple configurations are executed sequentially. To solve this, this PR introduces a `--parallel` flag which, when used, distributes the given number of processes (specified using `mpiexec -np`) amongst all the configurations and tries to execute them in parallel (see the sketch after this list). However, this does not seem to work. There are two different cases now:
  - If `comm.Barrier()` and `MPI.Finalize()` are used, the program gets stuck with file-not-found errors (for the 2nd configuration).
  - If `--parallel` is not used (but `mpiexec` is), the behaviour is somewhat erratic; the analysis of the first configuration completes, but the run then gets stuck while executing the later ones.

For now I would recommend not using a yaml file for multiple-configuration analysis, since it might break somewhere in between.
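To make the intended `--parallel` behaviour concrete, here is a minimal `mpi4py` sketch of one way to split the `mpiexec -np` ranks across configurations with `Comm.Split`. Everything here is hypothetical: `run_analysis` and the config list are placeholders, not nmma's API or this PR's actual code.

```python
# Hedged sketch only: run_analysis() and the configs list are placeholders,
# not the actual nmma entry point or this PR's implementation.
from mpi4py import MPI

def run_analysis(config, comm):
    # Placeholder for the per-configuration analysis.
    print(f"rank {comm.Get_rank()}/{comm.Get_size()} working on {config}")

configs = ["config_a.yaml", "config_b.yaml"]  # hypothetical configurations

world = MPI.COMM_WORLD
# Round-robin assignment of the -np ranks to configurations; all ranks with
# the same color end up together in one sub-communicator.
color = world.Get_rank() % len(configs)
subcomm = world.Split(color=color, key=world.Get_rank())

run_analysis(configs[color], subcomm)

subcomm.Free()
world.Barrier()  # let every configuration finish before tearing down
```

With `-np 10` and two configurations, each configuration would get five ranks in its own sub-communicator, so the two analyses can run concurrently instead of back-to-back.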
Use the given commands to reproduce the issues:

1. `mpiexec -np 10 light_curve_analysis --config config.yaml`
2. `mpiexec -np 10 light_curve_analysis --config config.yaml --parallel`
3. Remove (or comment out) `comm.Barrier()` and `MPI.Finalize()`, then again: `mpiexec -np 10 light_curve_analysis --config config.yaml --parallel` (the relevant MPI semantics are sketched below)
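One piece of MPI semantics that may explain the hang when `comm.Barrier()` and `MPI.Finalize()` are both used: `MPI_Finalize` can be called at most once per process, and no MPI calls are allowed afterwards, so finalizing inside a per-configuration loop would leave every later configuration without a working MPI environment. `mpi4py` also registers finalization to run automatically at interpreter exit, so an explicit call is usually unnecessary. A hedged sketch of a pattern that avoids this (placeholder names, not the actual code path):

```python
from mpi4py import MPI

def run_analysis(config, comm):
    # Placeholder for the per-configuration analysis.
    print(f"rank {comm.Get_rank()}: {config}")

comm = MPI.COMM_WORLD
for config in ["config_a.yaml", "config_b.yaml"]:  # hypothetical configs
    run_analysis(config, comm)
    comm.Barrier()  # synchronizing ranks between configurations is fine
# Deliberately no MPI.Finalize() inside the loop; mpi4py finalizes
# automatically when the interpreter exits.
```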
I would like everyone who knows MPI (possibly much more than me) to have a look at this.
Also, DO NOT MERGE :')