
Example of how to write data files as series of submodels. #18

Open
wants to merge 2 commits into base: master
Conversation

@nrnhines (Member) commented Feb 3, 2021

@nrnhines nrnhines requested a review from pramodk February 3, 2021 22:07
test_submodel.py Outdated
# write out the files.dat file
def write_files_dat(coredat, gidgroups):
f = open(coredat+"/files.dat", "w")
f.write("1.4\n") # CoreNEURON data version
Member:

@nrnhines: can we get the version as a property or via some method? We have the same problem of hard-coding it in neurodamus, and I wonder if it could be done in a better way.

Member Author (nrnhines):

I was thinking the same thing. The first thing that comes to mind is yet another ParallelContext method such as pc.nrncore_data_version(), or perhaps something more generic such as pc.nrncore_property('property_name'),
since I would also like to know whether coreneuron is available, whether the GPU is available, etc. Also, the documentation for pc.nrnbbcore_write(...) is missing the requirement that the first line of files.dat is the data version.
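For illustration, a generic property query in the spirit of that proposal might behave like this. Note that nrncore_property, nrncore_data_version, and the property names below are proposals from this thread, not existing ParallelContext methods; this is a dict-backed stand-in only.

```python
# Stand-in sketching the proposed pc.nrncore_property('name') idea.
# None of these names exist in NEURON; values are placeholders.
class CoreNeuronInfo:
    _props = {
        "data_version": "1.4",  # would come from the CoreNEURON build
        "available": True,      # whether coreneuron support was compiled in
        "gpu": False,           # whether GPU support is available
    }

    def nrncore_property(self, name):
        try:
            return self._props[name]
        except KeyError:
            raise ValueError("unknown CoreNEURON property: %s" % name)
```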

Member:

Or, as an end user, I don't want to manage combining these models like this. What I want to say is: take these subdir_1 subdir_2 subdir_3 and simulate them. Internally we can make the necessary symlinks and group the files.dat entries.

Another possibility: coreneuron could also be updated to accept multiple data directories! (I am now thinking more about this possibility!)

Member Author (nrnhines):

We should discuss by Zoom. I'm not seeing the value of multiple data directories for this problem, since I can't envision the filenames failing to be unique regardless of the number of submodels. You are right that files.dat can easily be handled by nrnbbcore_write, but we need to work out some API details. Maybe the user does not have to be aware of the gidgroups array of Vectors either.

Member Author (nrnhines):

I propose extending nrnbbcore_write with a new signature

pc.nrnbbcore_write([path, [i_submodel, n_submodel]])

which indicates how many submodels are involved and at which point in the sequence this call falls. When i_submodel == (n_submodel - 1), the files.dat file is written at the end. Another possibility is to count down from a starting n_submodel to 0 and emit the files.dat file when 0 is reached.

In any event, we would like this to work on an MPI cluster and/or with threads. It would also be nice to support the submodel strategy for the case of separate launches of NEURON for groups of submodels. That would require the accumulating gidgroup information to persist across launches, which would mean writing some kind of intermediate file, or else a files.dat to which nrnbbcore_write appends further gidgroup information while also updating the second line of the file (ngroup). If the latter is implemented, the signature could be further simplified to

pc.nrnbbcore_write([path], [bool append])

which means that files.dat is updated at the end of this call and so is always valid when the call exits. Lastly, I could imagine eliminating the optional bool append: the default would always be append, and it would be up to the user to clear out the "outdat" folder or start a new one when wanting to start from the beginning.

Member Author (nrnhines):

And is this a good time to allow the synonym pc.nrncore_write([path], [bool append])?
Note that in test_submodel.py this would change the statement to

pc.nrncore_write('./coredat', isubmodel != 0)

and would eliminate write_files_dat(...).
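The per-submodel loop implied by that statement might look like the sketch below. Here nrncore_write is a mock of the proposed ParallelContext synonym (which does not exist yet), used only so the control flow is runnable: the first submodel writes fresh, and every subsequent one appends.

```python
# Mock of the proposed pc.nrncore_write(path, append) call sequence.
calls = []

def nrncore_write(path, append):
    calls.append((path, append))  # the real method would dump model data

n_submodel = 3
for isubmodel in range(n_submodel):
    # ... build the isubmodel-th piece of the network here ...
    nrncore_write('./coredat', isubmodel != 0)  # append after the first
```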

@pramodk (Member) commented Feb 5, 2021

> You are right that files.dat can easily be handled by nrnbbcore_write but need to work out some api details.

Ah yes! We don't need multiple directories!

Development

Successfully merging this pull request may close these issues.

Network size exceeds the DRAM capacity and program gets killed when exporting the network with nrnbbcore_write