The Convective Storms Group Guide to the

Weather Research and Forecasting (WRF) Model

Created by the NCSU Convective Storms Group:

Dr. Matthew Parker

Jerilyn Billings

Adam French

Last Updated: 6 August 2006

This Guide was modified from The Forecasting Lab’s Guide to the

Weather Research and Forecasting (WRF) Model

Found at:

Created by the NCSU Forecasting Lab:

Dr. Mike Brennan

Megan Gentry

Nicole Haglund

Kevin Hill

Dr. Gary Lackmann

Kelly Mahoney

With additional contributions from:

Matt Borkowski

Zach Brown

Dr. Matt Parker

Last Updated: 27 July 2006

1. Copying files for WRF and WRFSI on the Meso Servers

You will need to copy the WRF and WRFSI files from the locations below.

Ideal Case:

If you are running an ideal case, use the following commands:

cd data

cp /usr/local/Install_Files/WRFV2.1.2.TAR.gz .

gunzip WRFV2.1.2.TAR.gz

tar -xvf WRFV2.1.2.TAR

This will create a WRFV2 directory

Real Case: (use this version for now)

If you are running a real case, use the following commands:

cd data

cp /data/chnablu/WRFV2.1.1.TAR .

tar -xvf WRFV2.1.1.TAR

This will create the WRFV2 directory (I suggest renaming it, for example, to real_WRFV2.1.1).

2. Compiling WRF and installing WRFSI

1. Compiling WRF on Meso Servers

a. To compile WRF and WRFSI, the build needs to point to the netCDF (and MPICH) libraries.

So, in your .bashrc file you need to add the following lines:

export NETCDF=/usr/local/netcdf

export MPICH=/usr/local/mpich
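After editing .bashrc, reload it so the variables take effect in your current shell. A quick check (assuming the paths above match your system):

source ~/.bashrc

echo $NETCDF # should print /usr/local/netcdf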

b. Configure WRF

In the WRFV2 directory type:

./configure

You will then choose from the options shown for how you want to compile WRF.

You’ll want to choose option 3 (MPICH, RSL_LITE) for nested runs.

This creates a file called configure.wrf which contains all the default compilation options for the platform you are working on.

c. Compile WRF

If you type

./compile

you will see a list of choices for compiling the generic WRF model as well as options for several test cases.

To compile the model for a test case (ideal), type:

(Remember: use version 2.1.2.)

./compile testname > messages

e.g., testname = em_quarter_ss
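For example, to build the quarter-circle shear supercell case and capture any compiler errors in the same log (the 2>&1 redirection is an optional addition, assuming a bash shell):

./compile em_quarter_ss > messages 2>&1

tail messages # inspect the end of the build log for errors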

If the model compiles successfully, the files

wrf.exe

ideal.exe

will appear in the /main directory.

To compile WRF for a case with real data, type:

(Remember: use version 2.1.1.)

./compile em_real > messages

If this compiles successfully, the files

ndown.exe

real.exe

wrf.exe

will appear in the /main directory

To clean directories of all object files and executables (which you will need to do if your first compiling attempt didn’t work and you’re doing it again!), type

./clean -a

This removes all built files, including configure.wrf.

2. Compiling WRF and WRFSI on Bigdog

If you have an aliases.csh file on Bigdog, you should not need to set any netCDF variables. If you don’t have one, use the following:

cp /home/jmbilli2/aliases793.csh /home/yourid

You will also need to copy the WRFV2.1.1 and wrfsi v2.1.2 archives to your /share directory:

cp $group793/wrf/WRF2.1.1.TAR.gz .

Unzip and untar the WRFV2 software.
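For example, assuming the archive name from the copy above:

gunzip WRF2.1.1.TAR.gz

tar -xvf WRF2.1.1.TAR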

You’ll need to configure and compile this software on Bigdog, using a process similar to that on the meso servers.

In the WRFV2 directory type:

configure

You will then choose from the options shown for how you want to compile WRF (single processor (1), MPI (3), etc.).

You’ll want to choose option 5

Next, compile:

compile

you will see a list of choices for compiling the generic WRF model as well as options for several test cases.

To compile the model for a test case, type:

compile <testname>

If the model compiles successfully, the files

wrf.exe

ideal.exe

will appear in the /main directory.

To compile WRF for a case with real data, type:

compile em_real

If this compiles successfully, the files

ndown.exe

real.exe

wrf.exe

will appear in the /main directory

Finally, you will copy the WRFSI software.

cd into the WRFV2 directory and copy the wrfsi code:

cp $group793/wrf/wrfsi_v2.1.1.tar.gz .

Unzip and untar the WRFSI software.
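For example, assuming the archive name from the copy above:

gunzip wrfsi_v2.1.1.tar.gz

tar -xvf wrfsi_v2.1.1.tar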

3. Get GEOG data (Not needed at this time on meso servers)

Now you must obtain all of your geographical data. You can copy the file GEOG.tar.gz from someone who has already downloaded it:

On Meso Servers

cp /data/chnablu/WRF_WRFSI/GEOG.tar.gz .

You will need to unzip and untar this file in your wrfsi/extdata directory.

On Bigdog

cp $group793/wrf/GEOG.tar.gz .

You will need to unzip and untar this file in your wrfsi/extdata directory.
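On either system, the extraction looks like this (assuming GEOG.tar.gz was copied into wrfsi/extdata):

cd wrfsi/extdata

gunzip GEOG.tar.gz

tar -xvf GEOG.tar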

4. Wind Fix

You will need to untar this file in your wrfsi/ directory to fix the winds for NARR data.

On Meso Servers

cp /data/chnablu/WRF_WRFSI/narr_si.tar .

On Bigdog

cp /lockers/PAMS/Volume1/gary/wrf/narr_si.tar .
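Then untar it from your wrfsi/ directory (the archive is a plain .tar, so no gunzip step should be needed):

cd wrfsi

tar -xvf narr_si.tar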

5. Install WRFSI

On Meso Servers

cd into the wrfsi/ directory and run the perl script:

./install_wrfsi.pl

If this worked, you should have an executable in this directory called “wrf_tools”.
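A quick check that the install succeeded (just a directory listing, not part of the installer):

ls -l wrf_tools # the executable should exist and be executable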

On Bigdog

a. Type “add pgi” in the terminal window (no quotes!)

b. Run the perl script install_wrfsi.pl (found in the /WRFV2/wrfsi/ directory) to install WRFSI.

(answer yes when it asks if you want to install the graphical user interface!)

c. If this worked, you should have an executable in this directory called “wrf_tools”.

3. Using WRFSI to set up your run

1. If on the Bigdog cluster, get on a compute node:

Type qrsh at the command prompt

2. From the /WRFV2/wrfsi/ directory, type wrf_tools

3. When WRFSI comes up, the first thing you will do is create your domain:

a. In the “Horizontal Grid” tab, draw your approximate domain in the box provided and click “update map.” (Unless your domain extends to very high latitudes, you should use LCC for your projection.)

b. Enter the grid spacing and the horizontal grid dimensions, then click “update map” again. Repeat until you have the grid you want.

c. Choose your vertical levels (make sure your ICs have data up to the level you choose for the pressure at the top of the model)

d. Check to make sure that everything is pointing to the right data locations.

e. Localize your domain. (This actually creates the domain grid that you have specified.)

4. The second step is the “Initial Data” step. Here you will set up the initial and boundary conditions for your run.

a. Type the path to each data source that you are using in the corresponding space.

b. Click on the script tab at the top, and choose your data source, editing the command line to reflect the forecast length in hours (-l), the time interval between boundary condition files in hours (-t), and the cycle time to use for the model simulation (-s) in YYYYMMDDHH format. Run this script for each data source that you are using (e.g., if you are using both Eta and a separate SST field, run the script once for each one.)

c. SST Data: If you are using a separate SST field, set -l to 0, -t to 12, and HH to 00. Also, recent experiments suggest setting -l to the forecast length instead.

5. Finally, you are ready to interpolate your data. This is similar to the last step:

a. In the “Controls” tab, just make sure the data labels are all pointing to the right data. (Note: If using an additional SST field, type ‘SSTDATA’ into the “CONSTANTS_FULL_NAME” blank.)

b. Click on the “Script” tab and, as before, modify the command line (only this time (-f) is the forecast length in hours, and the other two options are as above). [Depending on the size of your domain, running these scripts can take quite a while, so give them a chance!]

c. If this runs successfully you will have files that look like: wrf_real_input_em.d01.YYYY-MM-DD_HH:MM:SS in the /wrfsi/domains/yourdomainname/siprd directory.
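A quick listing from the WRFV2 directory will confirm the interpolated files exist (yourdomainname is a placeholder for your actual domain name):

ls wrfsi/domains/yourdomainname/siprd/wrf_real_input_em.d01.*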

4. Running WRF

[The data needed to run WRF for one of the idealized test cases is already in place; these cases can be run by following the instructions here:]

To run a real case:

1. Move all of the “wrf_real_input_em.d01.YYYY-MM-DD_HH:MM:SS” files that were created in the last step of WRFSI into /test/em_real
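For example, from the WRFV2 directory (yourdomainname is again a placeholder):

mv wrfsi/domains/yourdomainname/siprd/wrf_real_input_em.d01.* test/em_real/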

2. Edit the namelist.input file (located in /test/em_real) to reflect your run specifications (see the namelist.input hints at the end of this manual; a description of the variables can be found online at: ). The list of available model physics packages is in the README file in the WRFV2 directory. The wrfsi namelist may also help you with your domain dimensions, etc.; it is located at: /WRFV2/wrfsi/domains/yourdomainname/static/wrfsi.nl.
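As a minimal sketch, the namelist entries you will touch most often look something like the following (all values are illustrative placeholders, not recommendations; take the dates, dimensions, and grid spacing from your WRFSI domain):

&time_control
 run_days         = 1,
 run_hours        = 0,
 start_year       = 2006,
 start_month      = 08,
 start_day        = 06,
 start_hour       = 00,
 end_year         = 2006,
 end_month        = 08,
 end_day          = 07,
 end_hour         = 00,
 interval_seconds = 10800,
/
&domains
 time_step        = 180,
 max_dom          = 1,
 e_we             = 100,
 e_sn             = 100,
 e_vert           = 31,
 dx               = 30000,
 dy               = 30000,
/

Here interval_seconds is the time between boundary condition files, dx and dy are in meters, and e_we, e_sn, and e_vert should match the dimensions you set in WRFSI.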

a. Single processor mode:

To process the IC files, type

real.exe

The files wrfinput_d01 and wrfbdy_d01 will be generated.

To run the model, type

wrf.exe

As the model runs, files called wrfout_d01_* will be created.

b. MPI mode

1. First, be sure that “nio_tasks_per_group” in the namelist.input file is equal to the number of processors you will be using.

2. real.exe can be run on a compute node from a terminal window, or with multiple processors using a script like the one found in /lockers/PAMS/Volume1/jmbilli2/scripts/innernest.csh. When real.exe runs successfully, the files wrfinput_d01 and wrfbdy_d01 will be generated in the test/em_real directory.

3. Finally, to run the model, run wrf.exe. This can also be easily done from a script such as the one found in: /lockers/PAMS/Volume1/jmbilli2/scripts/wrf_run.sh. As the model runs, files called wrfout_d01_* will be created.
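As a minimal sketch of such a script (the processor count and path here are assumptions; see the scripts referenced above for the cluster-specific details):

#!/bin/sh
# run WRF under MPI on 4 processors from the em_real directory
cd $HOME/WRFV2/test/em_real
mpirun -np 4 ./wrf.exe

real.exe can be launched the same way by substituting ./real.exe for ./wrf.exe.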

5. Performing a nested WRF run

Assumptions: First, I am assuming that you have the WRF model installed as well as WRFSI, and that you are able to open WRFSI without any problems. I also assume that you have been successful in performing a simulation using a single domain. If either of those statements is not true, you will need to look elsewhere for more help before trying to perform a nested run. Here I will highlight the differences between running a single domain and running nested domains, but will skip over most of the procedure that is identical. The procedure for a nested run is generally similar to that for a normal run, so you need to begin by running WRFSI...

Running WRFSI

Run the following steps on Bigdog

1. Create the outer domain:

After opening the program by typing wrf_tools, click on the domain selection button on the left hand side of the interface. Then you will want to either load your existing domain and place a nest within it, or create a new domain for your nested run. If you create a new domain you will first need to create the outer domain by dragging the cursor on the map, selecting the correct map projection and then clicking update map.

Next, you will see the domain which you have just created. This domain can be edited by clicking on one of the squares on the outer edge of the domain and dragging with the mouse. If you want to edit the grid more finely, there is an option to the right of the domain for “fine scale editing mode”. If you select the option for “grid values” you can edit the number of horizontal grid points as well as the grid spacing.

At this point, you need to think about the grid spacing for your outer domain. If you will be performing a nested run with feedback on (i.e., data is passed between the grids), then you need an odd grid-spacing ratio between the outer domain and the inner domain(s). If you are leaving feedback off, this ratio can be odd or even. Either way, make sure that the ratio is an integer.

As an example, the following grid spacings would be acceptable if you wanted feedback:

2 grids: 24 and 8km, or 21 and 7km

3 grids: 27, 9 and 3km

2. Create the inner nest(s)

After creating your outer domain and figuring out what grid spacing you are going to use for your nest(s), the next step is to create the nest(s). To the right of the map near the top, there are two tabs: MOAD domain and NEST domain. Click on the NEST domain tab to begin working on the nested domain.

First, you need to select the domain id. The first nest you create would have the id 2, a second nest would have the id 3, and so on for more nests. After selecting the id number, the parent id is automatically set as the domain that is outside of that inner domain. Next you can select the grid spacing ratio between the nested domain and its parent domain, keeping in mind that if you want feedback on, this number must be odd. After those values are set, simply click and drag on the map to create the nested domain, like you did to create the outer domain. After creating the domain, you can edit it by changing the lower left and upper right grid values. Those numbers correspond to where the lower left and upper right grid points of your nested domain are located within the parent domain. Edit these numbers until you are satisfied with the location of the inner domain.

The procedure for adding more nested domains is identical to what you just did. To create a nested grid within the nest, simply go to domain id and click on d03. You will want to change the parent id to 2. After these changes, you would follow the same procedure as you did to create the first nested domain.

Finally, after you are done creating the domains, click next. You then edit the vertical grid; the procedure here is similar to that for a single-domain run. For a nested run, you only specify the vertical grid dimensions once, and this will be used for all of the domains you are running. This means that if you want very high vertical resolution on your inner-most domain, you will need to run all your domains with the same large number of vertical levels.

After clicking next again, you advance to the localization parms tab. Here you can see the values for the domain configuration you have already set up. Be sure that the num_domains variable is set to the total number of domains you have created; i.e., 2 if you have one nested domain, 3 if you have 2 nested domains, etc. You may want to write down the DOMAIN_ORIGIN values for the lower left and upper right grid points, because you will need them later (they can also be looked up later on).

Next you will advance through the menus and localize the domain, which may take longer than it did for just one coarse domain, so be patient.

The procedure for processing the initial data and interpolating the data is identical to what you do for a single domain. Before you interpolate the data, be sure that the following settings are correct:

● Num_Active_Subnests: refers to the total number of nested domains you are using.

● Active_Subnests: should be the total number of domains in your model run.

When you are done with WRFSI, take a look at the output data, which is located in: /wrfsi/domains/yourdomainname/siprd

You should have wrf_real_input_d01 files for the duration of your model run, as you would for a single domain model run. You should also have a wrf_real_input_d0* file just at the initial time for your model run, where the * refers to the number of the domain. If you have one nested domain, you will just have one wrf_real_input_d02 file at the initial time; if you have 2 nested domains, then you should have wrf_real_input_d02 and wrf_real_input_d03 files, and so on for however many domains you use.

**If you do not have the proper input files, check the num_domains variable in wrfsi and make sure the number there is the number of grids in your model run.

Copy or move these initial condition files to the WRFV2/test/em_real directory; next, they need to be processed using real.exe.

WRF Procedure

1. Preparing and running real.exe for the nested domain(s)

Next, go to the test/em_real directory. You will need to run real.exe as before, but the procedure for running it with nested domains is somewhat tedious. You will need to run real.exe twice if you have one nested domain and three times if you have two nested domains. Each time you run real.exe, you process the initial condition files for one domain at a time. You will run real.exe for the domain with the smallest grid spacing first, and end by running it for the coarsest domain.

1a. Rename some files

The procedure for running real.exe is the same as before; however, each time you run it you need to be sure that the files and variables are set for the domain you are processing. Since real.exe will be looking for the wrf_real_input_d01 file, you need to rename the file you are trying to process as d01.

First, rename the actual wrf_real_input_d01 file as something else (e.g., original_wrf_real_input_d01) so you do not accidentally overwrite it. Next, rename the wrf_real_input_d02 file (if you have 1 nested domain) or the wrf_real_input_d03 file (if you have 2 nested domains) as wrf_real_input_d01. This way, real.exe will process the domain with the smallest grid spacing first.
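As a concrete sketch for one nested domain (using the file-naming pattern from Section 3, with YYYY-MM-DD_HH:MM:SS as a placeholder for your initial time):

# protect the true coarse-domain file
mv wrf_real_input_em.d01.YYYY-MM-DD_HH:MM:SS original_wrf_real_input_em.d01.YYYY-MM-DD_HH:MM:SS

# present the nest file to real.exe as d01
mv wrf_real_input_em.d02.YYYY-MM-DD_HH:MM:SS wrf_real_input_em.d01.YYYY-MM-DD_HH:MM:SS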