Get changes from mpcdf_raven development branch (#15)
* first edits towards the Kipf and Welling GCN
* addition to ptable plot: now also plots the number of compounds that the LDAU correction has been applied to
* add GPU test scripts
* work on GCN
* fix node update function; it is only a simple transformation of the nodes in the context of the GCN
* divide the different modules into their own files; the models module is now more structured (TODO: tests)
* fix circular imports in the new modules
* rename utilities module to mlp
* add loading module; interfaces train/inference with different models
* update configs
* run new grid points for the egap model
* start the rest of the ef grid and clean up the crossval plot
* improve crossval boxplots
* new runs in job_scripts
* update crossval scripts and MCD plotting
* add new script for ensemble predictions
* improve plots for ensemble_error
* add extra printing to classify.py
* improve error_analysis regression plots
* print the dataset standard deviation in plotting scripts
* work on error calibration plots
* adjustments to crossval plotting
* change some visuals in plotting scripts
* small visual improvements to plots
* cleanup
* calculate Pearson correlation for UQ
* update requirements
* update plots and add seaborn as a requirement: larger default font and tick sizes; histogram of errors in error analysis; print Pearson correlation in mc_error
* job script: go back to standard pbj training
* small changes to axis labels and legends
* implement SchNet model as a copy of MPEU without edge updates
* make methods in model modules protected; also makes the MEGNet and GCN models independent of MPEU
* start work on a test for the GCN
* finish test for GCN node updates
* update data pulling for all AFLOW ef values
* fix DataFrame append in the crossval plot
* change logging of data conversion and print validation errors in error analysis
* increase tick size in crossval plots
* clean up error_analysis; an additional flag allows choosing single plots
* improvements on data conversion in preparation for running on the big AFLOW dataset
* add batch size to the training evaluater
* fix the inference function to use the batch size from the config, and update the evaluation job script
* update most tests
* delete evaluation.py and the inference_file function; add a test for get_predictions
* update configs to include a model name string
* add SchNet to model loading, and add configs for SchNet ef and egap models
* fix naming of layers in SchNet; restores compatibility with models trained in the old schnet branch

Co-authored-by: dts <[email protected]>
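Several entries above concern the Kipf and Welling GCN and its node update being "only a simple transformation of the nodes". As a point of reference (not the repository's actual implementation, which is built on jraph), the propagation rule of a single GCN layer can be sketched in plain NumPy; all names below are hypothetical:

```python
# Illustrative NumPy sketch of the Kipf & Welling GCN propagation rule,
# H' = ReLU(D^-1/2 (A + I) D^-1/2 H W). This is an assumption-laden
# stand-in, not code from jraph_MPEU.
import numpy as np

def gcn_layer(adjacency, features, weights):
    """One GCN layer with self-loops, symmetric normalization, and ReLU."""
    a_hat = adjacency + np.eye(adjacency.shape[0])       # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))        # D^-1/2 diagonal
    a_norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(a_norm @ features @ weights, 0.0)  # ReLU

# Toy graph: 3 nodes in a path, 2 input features, identity weight matrix.
adj = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
h = np.ones((3, 2))
w = np.eye(2)
out = gcn_layer(adj, h, w)
print(out.shape)
```

Note that, per the commit above, the node update itself is just the dense transform `H W`; the neighborhood aggregation is the normalized adjacency product.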
Showing 53 changed files with 2,282 additions and 680 deletions.
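One of the changes listed above is "calculate Pearson correlation for UQ". The usual check of this kind correlates a model's predicted uncertainties with its absolute errors; the sketch below illustrates the idea with made-up example data (nothing here is repository output, and the function name is hypothetical):

```python
# Sketch of an uncertainty-quantification sanity check: Pearson correlation
# between predicted uncertainties (e.g. an MC-dropout spread) and absolute
# errors. Example arrays are invented for illustration.
import numpy as np

def pearson(x, y):
    """Pearson correlation coefficient of two 1-D arrays."""
    x = x - x.mean()
    y = y - y.mean()
    return (x @ y) / np.sqrt((x @ x) * (y @ y))

abs_errors = np.array([0.10, 0.30, 0.25, 0.60, 0.05])
pred_stddev = np.array([0.12, 0.28, 0.20, 0.55, 0.08])
r = pearson(abs_errors, pred_stddev)
print(round(r, 3))
```

A correlation near 1 suggests the predicted uncertainties rank the errors well; near 0, the UQ signal carries little information about actual error.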
jax_GPU.py
@@ -0,0 +1,7 @@
#!/mpcdf/soft/SLE_15/packages/x86_64/anaconda/3/2020.02/bin/python3.7

import jax

devices = jax.local_devices()

print(devices)
@@ -0,0 +1,26 @@
#!/bin/bash -l
# Standard output and error:
#SBATCH -o ./output_slurm/singlejob.%j.out
#SBATCH -e ./output_slurm/singlejob.%j.err
# Initial working directory:
#SBATCH -D ./
# Job name
#SBATCH -J egap_pbj
#
#SBATCH --nodes=1            # Request 1 or more full nodes
#SBATCH --constraint="gpu"   # Request a GPU node
#SBATCH --gres=gpu:a100:4    # Use all a100 GPUs on a node
#SBATCH --cpus-per-task=10
#SBATCH --ntasks-per-core=1
#SBATCH --mem=32000          # Request 32 GB of main memory per node in MB units.
#SBATCH --mail-type=none
#SBATCH [email protected]
#SBATCH --time=12:00:00

# load the environment with modules and python packages
cd ~/envs ; source ~/envs/activate_jax.sh
cd ~/jraph_MPEU

export OMP_NUM_THREADS=${SLURM_CPUS_PER_TASK}

srun python jax_GPU.py >> text.out
@@ -1,7 +1,7 @@
 #!/bin/bash -l
 # Standard output and error:
-#SBATCH -o ./output_slurm/datajob.out.%j
-#SBATCH -e ./output_slurm/datajob.err.%j
+#SBATCH -o ./output_slurm/datajob.%j.out
+#SBATCH -e ./output_slurm/datajob.%j.err
 # Initial working directory:
 #SBATCH -D ./
 # Job name
@@ -10,10 +10,10 @@
 #SBATCH --nodes=1            # Request 1 or more full nodes
 #SBATCH --cpus-per-task=10
 #SBATCH --ntasks-per-core=1
-#SBATCH --mem=32000          # Request 32 GB of main memory per node in MB units.
+#SBATCH --mem=16000          # Request main memory per node in MB units.
 #SBATCH --mail-type=none
 #SBATCH [email protected]
-#SBATCH --time=1:00:00       # 1 hour
+#SBATCH --time=24:00:00      # time limit in hours

 # load the environment with modules and python packages
 cd ~/envs ; source ~/envs/activate_jax.sh
@@ -22,5 +22,5 @@ cd ~/jraph_MPEU
 export OMP_NUM_THREADS=${SLURM_CPUS_PER_TASK}

 srun python scripts/data/aflow_to_graphs.py \
-    -f aflow/egaps_eform_all.csv -o aflow/graphs_all_24knn.db -cutoff_type knearest \
-    -cutoff 24.0
+    --file_in=aflow/eform_all.csv --file_out=aflow/eform_all_graphs_1.db --cutoff_type=knearest \
+    --cutoff=12.0
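The updated command above builds the graph database with a k-nearest cutoff (`--cutoff_type=knearest`). As a rough illustration of what such a cutoff does, the following self-contained sketch connects each point to its k nearest neighbours from a distance matrix; it is an assumption, not the repository's `aflow_to_graphs.py` (which, among other things, would need to handle periodic crystal structures):

```python
# Hedged sketch of k-nearest-neighbour edge construction for a point cloud.
# Hypothetical helper, for illustration only.
import numpy as np

def knearest_edges(positions, k):
    """Return (senders, receivers) connecting each node to its k nearest neighbours."""
    diff = positions[:, None, :] - positions[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)          # pairwise distance matrix
    np.fill_diagonal(dist, np.inf)                # exclude self-edges
    receivers = np.argsort(dist, axis=1)[:, :k]   # k closest per node
    senders = np.repeat(np.arange(len(positions)), k)
    return senders, receivers.ravel()

pos = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
senders, receivers = knearest_edges(pos, k=2)
print(senders, receivers)
```

Unlike a fixed radial cutoff, this guarantees every node the same number of outgoing edges, which keeps graph sizes predictable across very different crystal densities.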
@@ -0,0 +1,29 @@
#!/bin/bash -l
# specify the indexes (max. 30000) of the job array elements (max. 300 - the default job submit limit per user)
#SBATCH --array=183,17,85,236,231,53,458,470,140,409
# Standard output and error:
#SBATCH -o ./output_slurm/eval_%A_%a.out
#SBATCH -e ./output_slurm/eval_%A_%a.err
# Initial working directory:
#SBATCH -D ./
# Job name
#SBATCH -J eval_batch
#
#SBATCH --nodes=1            # Request 1 or more full nodes
#SBATCH --constraint="gpu"   # Request a GPU node
#SBATCH --gres=gpu:a100:1    # Use one a100 GPU
#SBATCH --cpus-per-task=10
#SBATCH --ntasks-per-core=1
#SBATCH --mem=32000          # Request 32 GB of main memory per node in MB units.
#SBATCH --mail-type=none
#SBATCH [email protected]
#SBATCH --time=12:00:00      # 12h should be enough for any configuration

# load the environment with modules and python packages
cd ~/envs ; source ~/envs/activate_jax.sh
cd ~/jraph_MPEU

export OMP_NUM_THREADS=${SLURM_CPUS_PER_TASK}

srun python scripts/plotting/error_analysis.py \
    --file=./results/aflow/ef_rand_search/id${SLURM_ARRAY_TASK_ID} \
@@ -0,0 +1,27 @@
#!/bin/bash -l
# Standard output and error:
#SBATCH -o ./output_slurm/evaljob.%j.out
#SBATCH -e ./output_slurm/evaljob.%j.err
# Initial working directory:
#SBATCH -D ./
# Job name
#SBATCH -J eval
#
#SBATCH --nodes=1            # Request 1 or more full nodes
#SBATCH --constraint="gpu"   # Request a GPU node
#SBATCH --gres=gpu:a100:1    # Use one a100 GPU
#SBATCH --cpus-per-task=10
#SBATCH --ntasks-per-core=1
#SBATCH --mem=32000          # Request 32 GB of main memory per node in MB units.
#SBATCH --mail-type=none
#SBATCH [email protected]
#SBATCH --time=12:00:00

# load the environment with modules and python packages
cd ~/envs ; source ~/envs/activate_jax.sh
cd ~/jraph_MPEU

export OMP_NUM_THREADS=${SLURM_CPUS_PER_TASK}

srun python scripts/plotting/error_analysis.py \
    --file=results/aflow/ef_full_data/ --label=ef --plot=nothing
Empty file.