Instructions for running the "beaufort" 40x40x50 configuration
face=6; ix=101:300; jx=290:449; kx=1:50;
|
1. Obtain copies of the following directories:
   (Note: the forcing files span many years. To save time and space,
   download only the years of interest from the ncep_rgau and
   cube78_forcing directories; see the wget sketch after this list.)
        ftp://ecco2.jpl.nasa.gov/data1/beaufort/code
        ftp://ecco2.jpl.nasa.gov/data1/beaufort/run_template
        ftp://ecco2.jpl.nasa.gov/data1/data/ncep/ncep_rgau
        ftp://ecco2.jpl.nasa.gov/data1/data/blend_forcing/cube78_forcing
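
   For example, wget can mirror one directory at a time (this assumes
   anonymous FTP access works from your site; the year pattern below
   is a hypothetical placeholder, so check the server listing for the
   actual file names):

        wget -r -np ftp://ecco2.jpl.nasa.gov/data1/beaufort/code/
        wget "ftp://ecco2.jpl.nasa.gov/data1/data/ncep/ncep_rgau/*1992*"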
|
2. Set up access to the cvs server:
   bash or sh shell:
        export CVSROOT=':pserver:cvsanon@mitgcm.org:/u/gcmpack'
   tcsh or csh shell:
        setenv CVSROOT ':pserver:cvsanon@mitgcm.org:/u/gcmpack'
|
3. Get code from the cvs server:
        cvs login
        ( enter the CVS password: "cvsanon" )
        cvs co MITgcm_code
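   (Optional: to refresh an existing checkout later, the standard
   cvs update command can be run from inside the working copy:)
        cd MITgcm
        cvs update -d -P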
|
4. Get the ice code, PM2:
   Obtain the directories PM2/F95 and PM2/config from ...??
   Call the directory containing PM2 "ICE_DIR"; in my case
        ICE_DIR=/dm5/bep/sulsky/seaice   on pollux
        ICE_DIR=/workg/bep/sulsky/seaice on gemini
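
   (The ICE_DIR notation below is easiest to use as a shell variable;
   a bash example, using the pollux path above:
        export ICE_DIR=/dm5/bep/sulsky/seaice
   after which $ICE_DIR can stand in wherever ICE_DIR appears.)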
|
   The PM2/config/hosts files for gemini and pollux assume petsc is
   installed in
        PETSCDIR = /dm5/bep/sulsky/Packages/petsc-2.3.3-p8   on pollux
        PETSCDIR = /workg/bep/sulsky/Packages/petsc-2.3.3-p8 on gemini
   If petsc is installed elsewhere, change the directory specification.
   (The configure command to build petsc on gemini or pollux is

        ./config/configure.py --with-debug=1 \
            --with-fc=/opt/intel/fc/9.1.051/bin/ifort \
            --with-f90=/opt/intel/fc/9.1.051/bin/ifort \
            --with-mpi-dir=/opt/mpich/ch-p4 \
            --with-blas-lapack-dir=/opt/intel/mkl/9.1.023/lib/64

   for debuggable code, and

        ./config/configure.py --with-debug=0 \
            --with-fc=/opt/intel/fc/9.1.051/bin/ifort \
            --with-f90=/opt/intel/fc/9.1.051/bin/ifort \
            --with-mpi-dir=/opt/mpich/ch-p4 \
            --with-blas-lapack-dir=/opt/intel/mkl/9.1.023/lib/64

   for optimized code.)
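
   (After configure finishes, petsc 2.3.x is typically built by
   running, from the petsc top-level directory,
        make all
   with PETSC_DIR set as the configure output instructs; follow the
   exact commands printed at the end of configure.)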
|
   Input files go in MITgcm/ice (pm2input, pm2geometry). After the
   compile step, run PM2-Pre once to generate pm2_grid.nc and
   pm2_part.nc with the initial geometry (see the "ONCE ONLY" step in
   the coupled-run section below).
|
=============================================
Running on a Linux workstation:

5. Compile code:
        cd MITgcm
        mkdir bin exe
        cd bin
        ../tools/genmake2 -mods=../../code
        make depend
        make -j
|
6. Model execution:
        cd ../exe
        cp ../../run_template/* .
        cp ../bin/mitgcmuv .
        ./mitgcmuv >& output.txt &
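
   To watch the run (the exact end-of-run message can vary with the
   MITgcm version):
        tail -f output.txt
        grep -i "ended normally" output.txt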
|
=============================================
Running on gemini:

5. Compile code:
        cd MITgcm
        mkdir bin exe
        cd bin
        \cp ../../code/* .
        \mv SIZE.h_2 SIZE.h
        ../tools/genmake2 -of ../tools/build_options/linux_ia64_ifort+mpi_altix_jpl
        make depend
        make -j
|
6. Model execution:
        cd ../exe
        cp ../../run_template/* .
        cp ../bin/mitgcmuv .
        bsub < jobfile
        bjobs
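
   (The jobfile comes with run_template. For orientation only, an LSF
   batch script has roughly this shape; the job name, core count, and
   launch line here are placeholders, not the actual jobfile contents:)
        #BSUB -J beaufort
        #BSUB -n 16
        #BSUB -o out
        #BSUB -e err
        mpirun -np 16 ./mitgcmuv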
|
=============================================
Running MITgcm and MPMice on gemini:

5. Compile code:
        cd MITgcm
        mkdir bin
        cd bin
        \rm *
        \cp ../../code/* .
        \mv SIZE.h_2 SIZE.h
        \mv CPP_EEOPTIONS.h_CPL CPP_EEOPTIONS.h
        ../tools/genmake2 -of ../tools/build_options/linux_ia64_ifort+mpi_altix_jpl
        make depend
        make -j
|
   Then build the ice code:
        cd ICE_DIR
        make
|
ONCE ONLY (redo only if the geometry changes):
        cd MITgcm/ice
        mpirun -np 1 ICE_DIR/PM2-Pre
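   (This should leave pm2_grid.nc and pm2_part.nc in MITgcm/ice; a
   quick check:)
        ls -l pm2_grid.nc pm2_part.nc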
|
        cd MITgcm
|
6. Model execution:
        cd ..
        \rm out err
        mkdir ocean
        cd ocean
        \rm *
        cp ../../run_template/* .
        cd ..
        cp -r ../ice .
        \cp bin/mitgcmuv .
        \cp ice/PM2 .
        bsub < ocean/jobfile2
        bjobs
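
   (bjobs reports the job status; while the job runs, the standard
   LSF command
        bpeek <jobid>
   shows its buffered output so far.)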