Foundational changes for converting Hazel into a test bench of radiative transfer methods and theories. #28
Open
edgecarlin wants to merge 16 commits into aasensio:multiatom from edgecarlin:multiatom
Conversation
…dd_chromosphere. Added the possibility of reading j10 for every atomic transition as a vector (implying changes in several subroutines). The ff parameter and j10 are now optional keywords in set_parameters. Added a sodium HFS atom file and a subroutine for reading a generic atom. Improved the efficiency of the IF..ELSE structures that set up the reference frame in add_chromosphere and add_photosphere. Added shorter aliases for some keywords (coordB, ref frame) and subroutines (add_chromosphere, set_parameters). Added the possibility of returning the mod.chromospheres dictionary object from the add_chromo/spheres subroutine (for a more minimalistic syntax). The Verbose parameter is now set from the Init subroutine and connected with verbose_mode. Added a HeI test .py file in the test directory showing the new nomenclature, with comments on the changes.
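The scalar-or-vector behavior described for j10 can be sketched as follows. This is a minimal illustration, not Hazel's actual code; the helper name `normalize_j10` is hypothetical:

```python
import numpy as np

def normalize_j10(j10, ntrans):
    """Broadcast a scalar j10 to every transition, or validate a
    per-transition list of length ntrans (hypothetical helper)."""
    j10 = np.atleast_1d(np.asarray(j10, dtype=float))
    if j10.size == 1:
        # one value -> same anisotropy for every transition
        return np.full(ntrans, j10[0])
    if j10.size != ntrans:
        raise ValueError(f"j10 must have 1 or {ntrans} elements, got {j10.size}")
    return j10
```

With this convention, `set_parameters` can accept either `j10=0.2` or `j10=[0.1, 0.2, 0.3, 0.4]` for an atom with four transitions.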
…e atom structure was commented. The atomic file is now introduced dynamically from the Python script (checked only programmatically). For this, I pass a Python string as a vector of chars to f90 through the pyx file. The file name is passed through the init subroutine when the Hazel module is initiated in model.py, and the path is assumed to be ../hazel/data.
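The string-to-char-vector conversion mentioned above is a common pattern when handing a filename to a Fortran `CHARACTER(LEN=n)` argument through a Cython wrapper. A minimal sketch, assuming ASCII filenames and a fixed buffer length (the helper name is hypothetical):

```python
import numpy as np

def str_to_char_vector(s, length=100):
    """Pad a Python string to a fixed length and expose it as an array of
    single bytes, suitable for a Fortran CHARACTER(LEN=length) argument."""
    padded = s.ljust(length)[:length]          # blank-pad, Fortran style
    return np.array(list(padded.encode("ascii")), dtype=np.uint8)
```

On the Fortran side the trailing blanks are removed with `TRIM` before opening the file.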
…e iplot_stokes for compact basic plotting, both outside and inside for loops.
…s in the chromospheric transfer. The relevant piece of code is marked in model.py.
…l or cartesian) for the magnetic field vector parameters introduced in add_chromosphere. Added many small developer comments in preparation for the following changes. Verified that the radiative transfer works correctly in the loop of synthesize_spectral_region.
…e define a dictionary of dictionaries in model.py with the atoms, lines, and indexes of the spectral lines to choose among. The definition of line_to_index is then done by reading from this dictionary in add_spectral; to pass it to the atmosphere object and to the Hazel synthesize in chromosphere.py, the radiative transfer logic of synthesize_spectral_region does: self.atmospheres[atm].line_to_index = self.line_to_index.
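The dictionary-of-dictionaries lookup described above can be sketched like this. The concrete atoms, line labels, and indices below are illustrative placeholders, not the actual contents of model.py:

```python
# Hypothetical shape of the multiplet dictionary: atom -> line label -> index.
multiplets = {
    "helium": {"10830": 1, "5876": 2, "7065": 3},
    "sodium": {"5890": 1, "5896": 2},
}

def build_line_to_index(atom, lines):
    """Select the line indices for the lines requested in add_spectral
    (sketch of the line_to_index construction)."""
    return {line: multiplets[atom][line] for line in lines}
```

The resulting mapping is then attached to each atmosphere object, so chromosphere.py never needs its own hardcoded line table.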
… object from hazel and SIR atmospheres.
…dd_chromosphere, finally disentangling atmospheric aspects from spectral ones and reducing the verbosity and complexity of the code. To this aim, add_active_line is now called for every atmosphere inside add_spectral (right after adding all atmospheres of the topology). Before, both add_active_line and the preceding block were repeated in every atmosphere-adding routine, which prevented changing the order of add_spectral and add_chromosphere. Now n_chromospheres can be known and calculated from add_spectral, before setup, which is the requirement for starting to program the storage of all optical coefficients.
… it works with keywords instead of input dictionaries. Removed name from the input dictionaries of add_spectrum, leaving it as a fixed parameter. Added the shorter keyword 'boundary' as an alternative to 'boundary condition' in add_spectrum. Created containers for optical coefficients in the spectrum object. Modified the Fortran routines to extract the optical coefficients of every chromosphere to Python. Checked the transfer in synthesize_spectral_region. Added a routine for adding N chromospheres in one line. Added plotting routines for optical coefficients.
…Now there is a dictionary with multiplets at the beginning of model.py, such that all atom-related information is together and clearer. Changed to the sodium atom and created a test program. Improved the efficiency and readability of the magnetic field transformations in synthesize of chromosphere.py when calling Hazel. Added the subroutines cartesian_to_spherical, spherical_to_cartesian, and los_to_vertical. Added get_B_Hazel and simplified the treatment of the magnetic field reference and coordinates in add_parameters and synthesize inside chromosphere.py. Checked that j10 is correctly introduced (OK). Checked that linear anisotropy effectively introduces J10 in the SEE of sodium (OK). Shortened the parsInput names to parsIn in synthesize at chromosphere.py. Minor simplifications in synthesize_spectral_region.
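The coordinate transformations named above (cartesian_to_spherical and its inverse) follow the standard conventions; a minimal sketch, with angles in degrees as is usual for Hazel's field parameters (the exact signatures in chromosphere.py may differ):

```python
import numpy as np

def cartesian_to_spherical(Bx, By, Bz):
    """Return (B, thetaB, phiB) in gauss/degrees for a field vector."""
    B = np.sqrt(Bx**2 + By**2 + Bz**2)
    thetaB = np.degrees(np.arccos(Bz / np.maximum(B, 1e-30)))  # inclination
    phiB = np.degrees(np.arctan2(By, Bx))                      # azimuth
    return B, thetaB, phiB

def spherical_to_cartesian(B, thetaB, phiB):
    """Inverse transformation: (B, thetaB, phiB) -> (Bx, By, Bz)."""
    t, p = np.radians(thetaB), np.radians(phiB)
    return B * np.sin(t) * np.cos(p), B * np.sin(t) * np.sin(p), B * np.cos(t)
```

Because both functions are pure numpy, they work equally on scalars and on whole-atmosphere arrays, which is the vectorization exploited later.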
… name of the spectra to the topologies, such that the outer loop and the if clause in synthesize_spectral_region can be avoided. Now the loop does not need to search over all atmospheres, but only goes through those belonging to the spectral region of interest. Fixed a bug when storing optical coefficients in topologies of the kind c0->c1+c2, solved using the index n+k in synthesize_spectral_region. Simplified the routine synthesize_spectral_region into a new routine called synthesize_spectrum, removing unnecessary operations and code lines. Detected a bug in the continuum of Stokes I (being 2.0 instead of 1.0) in topologies of the kind c0->c1+c2; this happens when the user forgets to specify the ff's for the atmospheres in set_pars or set_parameters. There is also a possible bug in the old routine normalize_ff, which does not seem to work for synthesis. To fix these issues we add the check_filling_factors routine, which assures that the sum of ff's in every layer with subshells amounts to 1. If not, it assumes isocontribution: ff=1/nsub, with nsub the number of sub-atmospheres in the layer (e.g., if nsub=2, ff=0.5).
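The filling-factor check described above can be sketched as follows, under the assumption (stated in the text) that an inconsistent layer falls back to isocontribution:

```python
def check_filling_factors(ff_list):
    """Ensure the filling factors of the sub-atmospheres in one layer sum
    to 1; if they do not, assume isocontribution ff = 1/nsub (sketch)."""
    nsub = len(ff_list)
    if abs(sum(ff_list) - 1.0) > 1e-6:
        return [1.0 / nsub] * nsub  # user forgot or mis-set the ff's
    return list(ff_list)
```

For the c0->c1+c2 example, two unset sub-atmospheres each receive ff = 0.5, which restores the Stokes I continuum to 1.0.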
… every transition: 1) New procedure for introducing boundary conditions with spectral dependence. Introduced the i0fraction keyword as the ratio of the true background continuum intensity to the Allen continuum. i0fraction is included in case we want to treat it as an inversion parameter in the future, but in general it should be avoided (it is set to 1.0 by default). When defining boundary as the ratio between the true physical background Stokes profiles and I0Allen, i0fraction is already the first spectral value of the boundary intensity condition, being 1 to represent I0Allen or lower for different continuum backgrounds. 2) The boundary keyword now accepts either 4 scalars, broadcast as wavelength constants for every Stokes parameter, or directly 4 spectral profiles. These changes allow control of the boundary spectral dependence and of the relative continuum intensity level used for normalization, allowing treatment of solar locations where I0Allen does not fit the profiles. In addition to the changes in model.py, spectrum.py, and chromosphere.py that simplify and generalize the setup of the boundary condition, I also added the check_key routine to simplify the setup of default keyword values in the (deprecated) add_spectral subroutine. 3) I changed the number of dimensions of j10, nbar, and omega to ntrans, thus avoiding a hardcoded value (it was hardcoded to 4). The flow is: read the Fortran atom%ntrans from the atom file (in io_f90.py) -> init() routine in hazel_py.f90 -> init() in hazel_code.pyx -> call to hazel_code._init in model.py, where self.ntrans is set and later passed to set_chromosphere (from the main Python program, or from file with use_configuration), which passes it as an input parameter to the initialization of the Hazel chromosphere object, where the dimensions of j10, omega, and nbar are set to ntrans. I renamed omega as j20f, because it is a modulatory factor multiplying the j20 anisotropy.
Now j10 and j20f can be introduced via keyword from set_pars, either providing a single number that is broadcast to all transitions or a list with specific values for each transition.
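The scalar-or-profile behavior of the boundary keyword (point 2 above) can be sketched like this. The function name and array shapes are assumptions for illustration, not Hazel's actual internals:

```python
import numpy as np

def normalize_boundary(boundary, nwav):
    """Accept 4 scalars (one per Stokes parameter, broadcast as constants
    in wavelength) or 4 spectral profiles of length nwav; return a
    (4, nwav) array of boundary conditions (sketch)."""
    boundary = np.asarray(boundary, dtype=float)
    if boundary.ndim == 1 and boundary.size == 4:
        return np.repeat(boundary[:, None], nwav, axis=1)  # broadcast scalars
    if boundary.shape == (4, nwav):
        return boundary
    raise ValueError("boundary must be 4 scalars or a (4, nwav) array")
```

A user can then pass `boundary=[1, 0, 0, 0]` for the standard I0Allen continuum, or full spectral profiles where the Allen continuum does not fit the observed background.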
…n as done with j10. Changed the keywords working_mode and atomf to mode and atomfile in the call to hazel.Model. Improved and shortened the initialization of the model object. Added a precision keyword to print_parameters. The parameters atompol, magopt, stimem, and nocoh (read in hazel.Model with apmosenc) are passed to synthazel through normal arguments when calling Hazel_atmosphere from model/add_chromosphere. For this I implemented two redundant ways: a compact one through the keyword apmosenc (= Atom.Pol + Magn.Optical + Stim.Emiss. + NoCoherences), '1110' by default, and also introducing those parameters with a dictionary for more verbosity and clarity. Added the depolarizing collision parameters dcol=[double,double,double] to control total elastic depolarizing collisions (for K=1 and K=2 together), depolarizing collisions (D^1_Q) only for multipoles K=1, and only for multipoles K=2 (D^2_Q). Implemented the action of all the above parameters in the SEE and the optical coefficients. Deleted some old general Fortran variables and routines that are no longer used. The Hazel synthesis method can now be selected, passing it to Hazel via add_spectrum (for all atmospheres) and also, redundantly, via model.synthesize() for each atmosphere individually if required. Added the Emissivity method and the verification logic for reading and selecting different methods.
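The compact four-character switch string described above decodes into four boolean physics flags. A minimal sketch (the keyword is spelled both apmosenc and apmosekc in these notes, so the exact name in the code should be checked):

```python
def parse_apmosenc(apmosenc="1110"):
    """Decode the compact switch string into four booleans: atomic
    polarization, magneto-optical terms, stimulated emission, and
    'no coherences' (sketch of the behavior described above)."""
    if len(apmosenc) != 4 or any(c not in "01" for c in apmosenc):
        raise ValueError("apmosenc must be four characters, each '0' or '1'")
    atompol, magopt, stimem, nocoh = (c == "1" for c in apmosenc)
    return {"atompol": atompol, "magopt": magopt, "stimem": stimem, "nocoh": nocoh}
```

The default '1110' thus enables atomic polarization, magneto-optical effects, and stimulated emission while keeping coherences active.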
…del.mutates() assumes that chromospheres, magnetic field reference frame, and topologies are already added and fixed, but accepts modifications (mutations) of some input parameters of the previous experiment. The parameters that can mutate are: 1) model pars: apmosekc, dcol; 2) chromosphere pars: B1, B2, B3, tau, v, delta, beta, a, ff (check), j10, j20f, nbar. Future mutable parameters are: i0Allen, hz, LOS, boundary, synmethod. Added improved plotting capabilities (optical coefficients, fractional polarization, comparison of mutations, axis labels, using and returning automatic figures and axes). The plotting code is encapsulated in four main routines: Model.synthesize(), Model.mutates(), Model.plot_coeffs(), and Model.fractional_polarization(). I could not add the ability to place the plot windows at different screen positions because I cannot change my backend to anything other than MacOSX.
…e transfer methods with accuracy and some realism, the following subroutines have been created: 1) add_funcatmos(), set_funcatmos(), and PolyFx(), to easily and succinctly add and set parametric optically-thick atmospheres made of N Hazel (cell) atmospheres with physical quantities varying as different analytical functions (polynomials, exponentials, random, ...). 2) plot_funcatmos() and plot_PolyFx(), to plot the variation with height of all atmospheric parameters and compare them with polynomials of different orders. 3) fix_point_polyfit_fx(), to create a polynomial fitting certain points while passing exactly through some control points (typically the boundaries). 4) fun_minT(), to easily create accurate parametric variations of deltav mimicking a temperature minimum. 5) get_Tpars(), to obtain transformations between temperature, Doppler broadening, and deltav; a new dictionary of atomic weights is added in model.py. 6) check_B_vals(), to check that the magnetic field values are within the correct limits for every coordinate system; a new dictionary of limiting values for each physical quantity is also added. 7) The height axis is now allowed to be non-linear, to check the effects of sampling some optical depths more than others.
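The temperature-to-deltav transformation in point 5 follows the standard thermal Doppler broadening relation, deltav = sqrt(2 k T / m). A minimal sketch with an illustrative atomic-weight dictionary (the actual dictionary and routine signature in model.py may differ):

```python
import numpy as np

# Illustrative subset of the atomic-weight dictionary mentioned above [amu].
ATOMIC_WEIGHTS = {"H": 1.008, "He": 4.0026, "Na": 22.990}

KB = 1.380649e-23       # Boltzmann constant [J/K]
AMU = 1.66053906660e-27  # atomic mass unit [kg]

def get_deltav(T, atom="Na"):
    """Thermal Doppler broadening velocity in km/s for temperature T [K]:
    deltav = sqrt(2 k T / m)  (sketch of the get_Tpars() transformation)."""
    m = ATOMIC_WEIGHTS[atom] * AMU
    return np.sqrt(2.0 * KB * T / m) * 1e-3  # m/s -> km/s
```

For sodium at chromospheric temperatures of a few thousand kelvin this gives deltav of roughly 2 km/s, a useful sanity check when building the parametric atmospheres.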
Feat-Perf-Fix-Style: 1) Mutates() now changes any parameter, not only at given layers but through the whole atmosphere at once, using set_funcatmos(). 2) The plotting procedures can plot and reshow figures after closing them accidentally, and avoid ghost figures after mutating experiments. New functions reshow(), remove_fig(), setup_set_figure(). Labelling convention: '1' Stokes profiles, '2' optical coefficients, '3' comparison of mutated Stokes profiles, '4' depth variations of atmosphere parameters. 3) Easily compare the DNA of two experiments layer by layer with compare_experiments(). 4) New check_limits(): checks that the input parameters lie within the physical ranges specified as part of the model. 5) set_funcatm() adds check_limits() and the possibility of being called from mutates(). 6) get_B_Hazel() and cartesian_to_spherical() can now work with entire arrays and not just single layers. 7) New just_B_Hazel() transforms to the Hazel magnetic/LOS reference frames more efficiently, returning only the components used by set_funcatms. 8) New reset_pars() sets the Hazel parameters for all cells in the atmosphere as specified from given interval values. 9) A new class ntr_array allows storing and accessing the array-like variables j10, j20f, and nbar when there is more than one transition, making add_pars() and reset_pars() shorter and more efficient. 10) The model DNA now stores the whole arrays of j10, j20f, and nbar.
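The per-transition container of point 9 can be sketched as a thin wrapper over a numpy array that accepts either a scalar (broadcast to all transitions) or a full vector. A hypothetical sketch; the real ntr_array class may differ:

```python
import numpy as np

class NtrArray:
    """Sketch of the per-transition container: holds one value per
    transition for variables like j10, j20f, and nbar."""

    def __init__(self, ntrans, value=0.0):
        self.data = np.full(ntrans, value, dtype=float)

    def set(self, value):
        """Assign a scalar (broadcast) or a length-ntrans vector."""
        value = np.atleast_1d(np.asarray(value, dtype=float))
        self.data[:] = value  # numpy broadcasts or raises on length mismatch

    def __getitem__(self, i):
        return self.data[i]
```

Routines like add_pars() and reset_pars() can then treat scalar and vector inputs uniformly, which is what shortens them.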
First milestone: allow working with the Carlin 2019 radiative transfer model.
First ten commits of the first milestone. These commits cover:
Disentangling add_spectral and add_chromosphere and inverting their calling order.
Moving spectral keywords from add_(atmospheres) to add_spectral.
Redefinition of add_chromosphere and add_spectrum/add_spectral to work with keywords instead of input dictionaries.
Redefinition of several keyword and subroutine names to be shorter (maintaining the possibility of using the previous notation).
Improvements when reading and processing keywords.
Definition of routines for changing magnetic field reference frame and simplification of corresponding magnetic field treatment in add_parameters and in synthesize of chromosphere.py.
Addition of some plotting routines as part of the model object.
I have tried to make the changes compatible with the use of SIR, of a filling factor, and with the inversions. However, these three capabilities have not been properly tested after these commits.