The following MT software packages are available on the NCI platforms under NCI project up99.
Enabling the geophysics software in your account
In order to use the Magnetotellurics (MT) software on NCI, first log in to an NCI platform (Gadi, or via the ARE VDI app). Note that the ARE VDI superseded the OOD VDI as a platform in April 2023.
For Gadi, this will look like:
$ ssh abc123@gadi.nci.org.au
where abc123 is your NCI username.
Now use a text editor (e.g. nano, emacs, vim) to edit your .bashrc:
$ nano ~/.bashrc
Add the following line to the bottom of your .bashrc:
export MODULEPATH=/g/data/up99/modulefiles:$MODULEPATH
Save and exit.
Run the following command in your terminal:
$ source ~/.bashrc
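If you prefer not to edit your .bashrc, you can instead add the path for the current session only (the per-package examples below also do this):
$ module use /g/data/up99/modulefiles
Either way, you can check that the up99 modules are now visible with, for example:
$ module avail birrp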
You should now be able to load the following MT modules:
$ module load birrp/5.3.2_hlmt          { or if using the ARE VDI app: $ module load birrp/5.3.2_hlmt_vdi }
$ module load egbert_EMTF
$ module load occam1DCSEM
$ module load dipole1D
$ module load occam2D
$ module load modem/04.2018             { or if using the ARE VDI app: $ module load modem/04.2018.VDI }
$ module load mare2DEM/2021.v5.1        { or if using the ARE VDI app: $ module load mare2DEM/2021.v5.1.VDI }
$ module load femtic/4.1
$ module load mtpy/2021.11
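At any point, you can check which modules are currently loaded with:
$ module list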
BIRRP
The Bounded Influence Remote Reference Processing (BIRRP) program computes magnetotelluric and geomagnetic depth sounding response functions using a bounded influence, remote reference method, along with an implementation of the jackknife to get error estimates on the result. BIRRP is based on the following studies:
- Chave, A.D., and D.J. Thomson, A bounded influence regression estimator based on the statistics of the hat matrix, J. Roy. Stat. Soc., Series C, (Appl. Statist.), 52, 307-322, 2003, https://doi.org/10.1111/1467-9876.00406
- Chave, A.D., and D.J. Thomson, Bounded influence estimation of magnetotelluric response functions, Geophys. J. Int., 157, 988-1006, 2004, https://doi.org/10.1111/j.1365-246X.2004.02203.x
- Chave, A.D., D.J. Thomson, and M.E. Ander, On the robust estimation of power spectra, coherences, and transfer functions, J. Geophys. Res., 92, 633-648, 1987, https://doi.org/10.1029/JB092iB01p00633
- Chave, A.D., and D.J. Thomson, Some comments on magnetotelluric response function estimation, J. Geophys. Res., 94, 14215-14225, 1989, https://doi.org/10.1029/JB094iB10p14215
- D.J. Thomson and A.D. Chave, Jackknife error estimates for spectra, coherences, and transfer functions, in S. Haykin (ed.), Advances in Spectral Analysis and Array Processing, Englewood Cliffs: Prentice-Hall, pp. 58-113, 1991.
The BIRRP documentation can be found here.
For more information on BIRRP and the conditions of use, please refer to the BIRRP webpage.
To use BIRRP on Gadi, first load in the appropriate BIRRP module:
$ module use /g/data/up99/modulefiles
$ module load birrp/5.3.2_hlmt
and then run:
$ birrp-5.3.2v2
To test that BIRRP is working:
$ cp /g/data/up99/sandbox/birrp_test/merged/CP1.script <directory that you have write access to>
$ cd <directory you copied CP1.script to>
$ birrp-5.3.2v2 < CP1.script
After running these commands, you should see fft.*, *.rf, *.rp, *.2c2, *.diag, *.cov and *.j output files.
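As a quick sanity check, you can list the response-function and diagnostic outputs in the run directory, for example:
$ ls *.j *.2c2 *.diag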
To use BIRRP on the ARE VDI app, you will need to load in the following module:
$ module use /g/data/up99/modulefiles
$ module load birrp/5.3.2_hlmt.OOD_v1.vdi
EMTF
EMTF is the Oregon State University (OSU) program for robust single-station, remote-reference and multiple-station MT time-series data processing. It is based on:
Egbert, G.D., 1997. Robust multiple-station magnetotelluric data processing. Geophysical Journal International, 130(2), pp.475-496, https://doi.org/10.1111/j.1365-246X.1997.tb05663.x
The EMTF source code, documentation and data for compiling and testing EMTF can be found here.
To run EMTF, first load in the EMTF module:
$ module use /g/data/up99/modulefiles
$ module load egbert_EMTF
Next you can try one of the following:
$ bin2asc
$ clean
$ dnff
$ multmtrn <array.cfg>
$ rfasc
$ rfemi
$ tranmtlr
OCCAM1DCSEM / DIPOLE1D
OCCAM1DCSEM is a Fortran package for generating smooth one-dimensional models from controlled-source electromagnetic and magnetotelluric data. This code was developed under the sponsorship of the Seafloor Electromagnetic Methods Consortium (SEMC).
The OCCAM1DCSEM and DIPOLE1D source code can be found here.
To use OCCAM1DCSEM and/or DIPOLE1D on Gadi, first load in the appropriate modules:
$ module use /g/data/up99/modulefiles
$ module load occam1DCSEM
$ module load dipole1D
and then run:
$ OCCAM1DCSEM
or
$ DIPOLE1D
To test that OCCAM1DCSEM is working:
$ cd <directory that you have write access to>
$ cp /g/data/up99/sandbox/occam_test/occam1DCSEM/Canonical_RealImag_BxEyEz/* .
$ OCCAM1DCSEM startup
OCCAM2DMT
Occam's inversion for 2D magnetotelluric (MT) modeling is based on the following studies:
Occam's Inversion:
Constable, S. C., R. L. Parker, and C. G. Constable, Occam’s inversion - A practical algorithm for generating smooth models from electromagnetic sounding data, Geophysics, 52 (03), 289–300, 1987, https://doi.org/10.1190/1.1442303
deGroot-Hedlin, C., and S. Constable, Occam’s inversion to generate smooth two-dimensional models from magnetotelluric data, Geophysics, 55 (12), 1613–1624, 1990, https://doi.org/10.1190/1.1442813
2DMT Forward code:
Wannamaker, P. E., J. A. Stodt, and L. Rijo, A stable finite-element solution for two-dimensional magnetotelluric modeling, Geophysical Journal of the Royal Astronomical Society, 88, 277–296, 1987, https://doi.org/10.1111/j.1365-246X.1987.tb01380.x
2DMT Jacobian sensitivity code:
de Lugao, P. P., and P. Wannamaker, Calculating the two-dimensional magnetotelluric Jacobian in finite elements using reciprocity, Geophys. J. Int., 127, 806-810, 1996, https://doi.org/10.1111/j.1365-246X.1996.tb04060.x
The OCCAM2D source code can be found here.
To use OCCAM2D on Gadi, first load in the appropriate modules:
$ module use /g/data/up99/modulefiles
$ module load occam2D
and then run:
$ Occam2D
To test Occam2D is working:
$ cd <directory that you have write access to>
$ cp /g/data/up99/sandbox/occam_test/occam2d_test/* .
$ Occam2D startup
MARE2DEM
MARE2DEM is a parallel adaptive finite element code for 2D forward and inverse modeling in electromagnetic geophysics. Initially developed with funding support from the Scripps Seafloor Electromagnetic Methods Consortium, it is now supported by the Electromagnetic Methods Research Consortium at Columbia University. MARE2DEM supports 2D anisotropic modeling for controlled-source electromagnetic (CSEM), magnetotelluric (MT) and surface-borehole EM applications in onshore, offshore and downhole environments.
For more information, please visit the MARE2DEM user manual.
To use mare2DEM on the ARE VDI app, we need to load the following module:
$ module use /g/data/up99/modulefiles
$ module load mare2DEM/2021.v5.1.OOD_v1.VDI
To run mare2DEM in inversion mode:
$ mpirun -np <number of processors> --oversubscribe MARE2DEM <input resistivity file>
To run mare2DEM in forward mode:
$ mpirun -np <number of processors> --oversubscribe MARE2DEM -F <input resistivity file>
To run mare2DEM in forward fields mode:
$ mpirun -np <number of processors> --oversubscribe MARE2DEM -FF <input resistivity file>
To test mare2DEM is working:
$ cd <directory that you have write access to>
$ cp /g/data/up99/sandbox/mare2DEM_test/inversion_MT/* .
$ mpirun -np 8 --oversubscribe MARE2DEM Demo.0.resistivity
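If you want to match -np to the cores actually available in your VDI session (the --oversubscribe flag allows exceeding this), you can check the core count with the standard nproc command:
$ nproc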
To test that mare2DEM runs on Gadi, you will first need to create a PBS job script:
#!/bin/bash
#PBS -N mare2DEM_test
#PBS -q normal
#PBS -P <project-code-with-compute>
#PBS -l walltime=0:10:00
#PBS -l ncpus=12
#PBS -l mem=10GB
#PBS -l jobfs=1GB
#PBS -l storage=gdata/up99+gdata/<project-code>

module use /g/data/up99/modulefiles
module load mare2DEM/2021.v5.1

cd <area-where-you-have-write-access>
cp /g/data/up99/sandbox/mare2DEM_test/inversion_MT/* .

mpirun -np 12 MARE2DEM Demo.0.resistivity > mare2DEM_test.results
Save this script as <script_name>.sh. Now we can submit this as a PBS job:
$ qsub <script_name>.sh
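You can then monitor the job with standard PBS commands, for example:
$ qstat -u $USER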
ModEM
Modular EM (ModEM) is a flexible electromagnetic modelling and inversion program written in Fortran 95. It is currently available to use with 2D and 3D MT problems. For more information about ModEM, please read:
Kelbert, A., Meqbel, N., Egbert, G.D. and Tandon, K., 2014. ModEM: A modular system for inversion of electromagnetic geophysical data. Computers & Geosciences, 66, pp.40-53, https://doi.org/10.1016/j.cageo.2014.01.010
Egbert, G.D. and Kelbert, A., 2012. Computational recipes for electromagnetic inverse problems. Geophysical Journal International, 189(1), pp.251-267, https://doi.org/10.1111/j.1365-246X.2011.05347.x
To learn more about ModEM, please visit the ModEM-geophysics website.
To use ModEM on Gadi, first load in the appropriate module:
$ module use /g/data/up99/modulefiles
$ module load modem/04.2018
and then run:
$ mod2DMT
or
$ mod3DMT
Note that if you need to use the NCI modified Mod3DMT_MPI parallel code, you will have to join the group ModEM-geophys via the NCI Mancini page and abide by the conditions of use.
Once you have access to this group, you can load in the following modules on Gadi:
$ module load ModEM-geophysics/2013.06
or
$ module load ModEM-geophysics/2015.01
Running a ModEM 3D inversion
Let's test the serial and parallel Mod3DMT codes on both the ARE VDI app and Gadi for the ObliqueOne inversion dataset given in the examples folder of the ModEM software.
Test 1: Serial mod3DMT on the ARE VDI app
$ module load modem/04.2018.VDI
$ cd <directory-you-have-write-access-to>
$ cp /g/data/up99/sandbox/modem_test/ObliqueOne/INV/* .
$ mod3DMT -I NLCG 50_2x2_for_ObliqueOne.model Z_10_5_16p_100st_ObliqueOne_Noise50.data Inv_para.dat FWD_para.dat x02_y02_z02.cov
Test 2: Parallel Mod3DMT_MPI on Gadi
To run Mod3DMT_MPI, we need to create a PBS job script. For this example, let's use the normal queue, 48 CPUs, 96GB of memory and a walltime of 10 minutes:
#!/bin/bash
#PBS -N mod3DMT_test
#PBS -q normal
#PBS -P <project-code-with-compute>
#PBS -l walltime=0:10:00
#PBS -l ncpus=48
#PBS -l mem=96GB
#PBS -l jobfs=1GB
#PBS -l storage=gdata/up99+gdata/<project-code>

module load ModEM-geophysics/2013.06

cd <directory-you-have-write-access-to>
cp /g/data/up99/sandbox/modem_test/ObliqueOne/INV/* .

mpirun -np 48 Mod3DMT_MPI -I NLCG 50_2x2_for_ObliqueOne.model Z_10_5_16p_100st_ObliqueOne_Noise50.data Inv_para.dat FWD_para.dat x02_y02_z02.cov > mod3dmt_oblique_test.out
To run our script:
$ qsub <script_name>.sh
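Once the job is running, you can follow the inversion progress in the output file named in the script above:
$ tail -f mod3dmt_oblique_test.out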
FEMTIC
The Finite Element MagnetoTelluric Inversion Code (FEMTIC) is a 3-D magnetotelluric inversion code based on the following studies:
- Yoshiya Usui, 3-D inversion of magnetotelluric data using unstructured tetrahedral elements: applicability to data affected by topography, Geophysical Journal International, Volume 202, Issue 2, August 2015, Pages 828–849, https://doi.org/10.1093/gji/ggv186
- Yoshiya Usui, Yasuo Ogawa, Koki Aizawa, Wataru Kanda, Takeshi Hashimoto, Takao Koyama, Yusuke Yamaya, Tsuneomi Kagiyama, Three-dimensional resistivity structure of Asama Volcano revealed by data-space magnetotelluric inversion using unstructured tetrahedral elements, Geophysical Journal International, Volume 208, Issue 3, March 2017, Pages 1359–1372, https://doi.org/10.1093/gji/ggw459
- Yoshiya Usui, Takafumi Kasaya, Yasuo Ogawa, Hisanori Iwamoto, Marine magnetotelluric inversion with an unstructured tetrahedral mesh, Geophysical Journal International, Volume 214, Issue 2, August 2018, Pages 952–974, https://doi.org/10.1093/gji/ggy171
- Yoshiya Usui, Applicability evaluation of non-conforming deformed hexahedral mesh for marine magnetotellurics, Japan Geoscience Union Meeting. 2021.
For more information on FEMTIC, please refer to the FEMTIC home page.
To use FEMTIC, first load in the appropriate module:
$ module use /g/data/up99/modulefiles
$ module load femtic/4.2
Sample input files for FEMTIC are available from the FEMTIC GitHub repository. To run these example inversions, you would need to create a PBS job script:
#!/bin/bash
#PBS -P <compute_project_code>
#PBS -q normal
#PBS -l ncpus=48,walltime=0:05:00,mem=20GB,jobfs=1GB
#PBS -l storage=scratch/<project_code>+gdata/up99+gdata/<project_code>

module use /g/data/up99/modulefiles
module load femtic/4.2

cd <directory_with_FEMTIC_dat_files>

mpirun -np 48 femtic
Note that the number of threads to be used is contained in the “control.dat” file under "NUM_THREADS" and this can be changed according to your use case.
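To check the current value without opening the file in an editor, a simple grep works (this assumes the parameter name appears literally in control.dat, as described above):
$ grep -A1 NUM_THREADS control.dat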
To run this script:
$ qsub <script_name>.sh
For memory intensive jobs, you may need to adapt the number of MPI processes and threads accordingly:
#!/bin/bash
#PBS -P <compute_project_code>
#PBS -q normalsr
#PBS -l ncpus=208,walltime=0:30:00,mem=1000GB,jobfs=100GB
#PBS -l storage=scratch/<project_code>+gdata/up99+gdata/<project_code>

module use /g/data/up99/modulefiles
module load femtic/4.2

cd <directory_with_FEMTIC_dat_files>

HOSTFILE="my_intelmpi_hostfile"
uniq < $PBS_NODEFILE > $HOSTFILE

mpirun -np 26 -ppn 13 -genv I_MPI_EXTRA_FILESYSTEM_FORCE=lustre -genv I_MPI_DEBUG=5 -hostfile $HOSTFILE femtic >& output.log
The example above requests 2 Sapphire Rapids nodes (104 CPUs per node) and 1000GB of memory. Running 26 MPI processes allows for ~38.5GB of memory per MPI process, and these processes are split evenly across the 2 requested nodes (-ppn 13). By setting "NUM_THREADS" in the "control.dat" file to 8, we can utilise all requested CPUs (8 x 13 = 104 CPUs per node).
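If you change NUM_THREADS or the node count, the MPI layout can be recomputed with the same arithmetic. A minimal sketch in bash, assuming the 104 CPUs per Sapphire Rapids node quoted above:

NODES=2            # number of normalsr nodes requested (ncpus=208 / 104)
CPUS_PER_NODE=104  # Sapphire Rapids CPUs per node
NUM_THREADS=8      # value set in control.dat
PPN=$((CPUS_PER_NODE / NUM_THREADS))   # 13, for mpirun -ppn
NP=$((PPN * NODES))                    # 26, for mpirun -np
echo "use: mpirun -np $NP -ppn $PPN"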
MTpy
MTpy is a Python toolbox for magnetotelluric (MT) data processing, analysis, modelling and visualisation. For more information, please visit the MTpy GitHub repository.
The standalone module for MTpy was removed from up99 on 29-Nov-2022.
MTpy is now managed under the NCI-geophysics module.
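A minimal check that MTpy is available from the NCI-geophysics environment (a sketch only: it assumes the module exposes a Python interpreter with mtpy installed; check module avail NCI-geophysics for the current versions, and note you may also need the module use line from earlier on this page depending on where the module is published):
$ module load NCI-geophysics
$ python3 -c "import mtpy; print(mtpy.__file__)"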