Instructions for running the "beaufort" 40x40x50 configuration
face=6; ix=101:300; jx=290:449; kx=1:50;
(That is a 200x160x50 subdomain on face 6; 40x40x50 is presumably the
per-process tile size, giving 5x4 = 20 tiles.)

1. Set up and log in to the CVS server

   bash or sh shell:
     export CVSROOT=':pserver:cvsanon@mitgcm.org:/u/gcmpack'
   tcsh or csh shell:
     setenv CVSROOT ':pserver:cvsanon@mitgcm.org:/u/gcmpack'

     cvs login
       ( enter the CVS password: "cvsanon" )

2. Get the code, input, and README files from the CVS server

     cvs co -d beaufort MITgcm_contrib/MPMice/beaufort
     cd beaufort
     cvs co MITgcm_code

3. Obtain copies of the following directories and put them in beaufort

     ftp://ecco2.jpl.nasa.gov/data1/beaufort/run_template
     ftp://ecco2.jpl.nasa.gov/data1/data/blend_forcing/cube78_forcing

   (Note: the forcing files in cube78_forcing span many years. To save
   time, download only the years of interest -- files matching *92 for
   the 1992 example below -- plus the runoff-360x180x12.bin file. See
   the example download commands at the end of this file.)

4. Get the ice code, PM2

   Obtain the directories PM2/F95 and PM2/config from ...??
   Call the directory containing PM2 ICE_DIR; in my case
     ICE_DIR=/dm5/bep/sulsky/seaice    on pollux
     ICE_DIR=/workg/bep/sulsky/seaice  on gemini

   The PM2/config/hosts files for gemini and pollux assume PETSc is
   installed in
     PETSCDIR = /dm5/bep/sulsky/Packages/petsc-2.3.3-p8    on pollux
     PETSCDIR = /workg/bep/sulsky/Packages/petsc-2.3.3-p8  on gemini
   If PETSc is installed elsewhere, change the directory specification.

   (The configure command used to build PETSc on gemini or pollux is

     ./config/configure.py --with-debug=1 \
         --with-fc=/opt/intel/fc/9.1.051/bin/ifort \
         --with-f90=/opt/intel/fc/9.1.051/bin/ifort \
         --with-mpi-dir=/opt/mpich/ch-p4 \
         --with-blas-lapack-dir=/opt/intel/mkl/9.1.023/lib/64

   for debuggable code, and

     ./config/configure.py --with-debug=0 \
         --with-fc=/opt/intel/fc/9.1.051/bin/ifort \
         --with-f90=/opt/intel/fc/9.1.051/bin/ifort \
         --with-mpi-dir=/opt/mpich/ch-p4 \
         --with-blas-lapack-dir=/opt/intel/mkl/9.1.023/lib/64

   for optimized code.)

   Input files go in MITgcm/ice (pm2input, pm2geometry).
   After the compile step, run PM2-Pre once to generate pm2_grid.nc and
   pm2_part.nc with the initial geometry.

=============================================
Running on a Linux workstation:

5. Compile the code:

     cd MITgcm
     mkdir bin exe
     cd bin
     ../tools/genmake2 -mods=../../code
     make depend
     make -j

6. Model execution:

     cd ../exe
     cp ../../run_template/* .
     cp ../../input/* .
     cp ../bin/mitgcmuv .
     ./mitgcmuv >& output.txt &

=============================================
Running on gemini:

5. Compile the code:

     cd MITgcm
     mkdir bin exe
     cd bin
     \cp ../../code/* .
     \mv SIZE.h_2 SIZE.h
     ../tools/genmake2 -of ../tools/build_options/linux_ia64_ifort+mpi_altix_jpl
     make depend
     make -j

6. Model execution (a sketch of a minimal jobfile is given at the end
   of this file):

     cd ../exe
     cp ../../run_template/* .
     cp ../bin/mitgcmuv .
     bsub < jobfile
     bjobs

=============================================
Running MITgcm and MPMice on gemini:

5. Compile the code:

     cd MITgcm
     mkdir bin
     cd bin
     \rm *
     \cp ../../code/* .
     \mv SIZE.h_2 SIZE.h
     \mv CPP_EEOPTIONS.h_CPL CPP_EEOPTIONS.h
     ../tools/genmake2 -of ../tools/build_options/linux_ia64_ifort+mpi_altix_jpl
     make depend
     make -j
     cd ICE_DIR
     make

   ONCE ONLY (redo only if the geometry changes):

     cd MITgcm/ice
     mpirun -np 1 ICE_DIR/PM2-Pre
     cd MITgcm

6. Model execution:

     cd ..
     \rm out err
     mkdir ocean
     cd ocean
     \rm *
     cp ../../run_template/* .
     cd ..
     cp -r ../ice .
     \cp bin/mitgcmuv .
     \cp ice/PM2 .
     bsub < ocean/jobfile2
     bjobs
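=============================================
Appendix A: example download commands for step 3

A minimal sketch of the downloads in step 3, assuming wget is available
(any FTP client will do; adjust target directories to taste):

     # Fetch the run_template directory (strip the data1/beaufort prefix).
     wget -r -nH --cut-dirs=2 ftp://ecco2.jpl.nasa.gov/data1/beaufort/run_template/

     # Fetch only the 1992 forcing files; the quotes keep the shell from
     # expanding the glob, so wget matches it on the FTP server instead.
     wget -nd -P cube78_forcing \
         'ftp://ecco2.jpl.nasa.gov/data1/data/blend_forcing/cube78_forcing/*92*'

     # The runoff climatology is needed regardless of the year chosen.
     wget -nd -P cube78_forcing \
         ftp://ecco2.jpl.nasa.gov/data1/data/blend_forcing/cube78_forcing/runoff-360x180x12.bin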
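=============================================
Appendix B: sketch of a jobfile for step 6 on gemini

The actual jobfile (and jobfile2 for the coupled run) comes from
run_template and is not reproduced here. For orientation only, a minimal
LSF jobfile for the ocean-only run might look like the sketch below; the
job name and the process count (20, from the assumed 5x4 tiling of 40x40
tiles) are guesses -- use the values in the run_template version.

     #!/bin/sh
     #BSUB -J beaufort     # job name (assumed)
     #BSUB -n 20           # MPI processes: 5x4 tiles of 40x40 (assumed)
     #BSUB -o out          # stdout file, matching the \rm out err above
     #BSUB -e err          # stderr file
     # Launch the ocean model; jobfile2 for the coupled run additionally
     # starts the PM2 ice code.
     mpirun -np 20 ./mitgcmuv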