
Help!


Continuous Runoff SCS CN Method with Wizards and Monte Carlo
For the FREE GetRealtime System Version 4.4.4
Updated Apr 8, 2024
Recent updates and fixes

(Note: This web page covers everything about GetRealtime.  I would suggest beginners interested in radar go to Site Map at top and download the Start Here Nashville example before digging through all this here.)


 

 

Note: radar image N0Q has been replaced with N0B; see here.

 

Program Setup

     Getting Started with Adjusted Radar Rainfall

Advanced setup tips

 

Getting Familiarized with All Three Programs.

 

GetAccess.exe

Managing the database of the realtime and historical web data

Setting the Database Connection String (and SQL Server Express)

SITE LIST button.

PARAMETERS w/data button.

GO button.

Graph/Table button.

Excel Workbook “DBedit.xls”.

Excel Workbook “DBretrieve.xls”.

Creating Your Own Database… for these data sources.

Source codes in values tables.

Wunderground.

MesoWest Weather

California Data Exchange Center

US Conservation Service SNOTEL

US Bureau of Reclamation

US Army Corps of Engineers

California Irrigation and Management Information System

NWS AFWS gages

KISTERS QueryServices

NOAA Radar WSR-88D Imagery

Canada Radar Imagery

Canada Streamflow and Water Levels

SpotWx Forecast Models

Other ODBC Databases

Your Personal Weather Station or other Text File

HEC-DSS files using their MS Excel HEC-DSS Add-In

Datatype_ID Table

DB Tables.

 

 

GetRealtime.exe

Retrieval and computations of realtime web data

Setting the Database Connection String:

Description of the controls on the GetRealtime form:

Adding Sites to the GetRealtime Station List:

'Tools' Menu Command List File:

Computation Examples on the GetRealtime Station List:

Computation from other Data Bases (ODBC connected)

Rainfall-Runoff from Wunderground Gage

Quick Boundary Setup

Cookbook Setup for Radar Rainfall Adjustment and Runoff

Triangular Unit Graph

Linear Reservoir Runoff Hydrograph

Dimensionless Unit Hydrograph

Clark Unit Hydrograph

SCS Loss Methods

Hydrograph Recession Slope (specific yield or RecoveryFactor)

Directly Connected Impervious Fraction

Calibration of Coefficients

Fixing Base Flow or Bad Rainfall Events (Resetting Soil Moisture and Groundwater)

NEXRAD Radar Adjustment * (the most important thing of all)

NEXRAD Radar Tracking Nowcasts and QPF's with GetNexrad

NEXRAD Radar Point and Area Rainfall

Flow Routing

NEXRAD NVL, NET, N0S, N0V

Radar Image Maximums

NEXRAD Ridge2 Testbed

Snowpack and Melt Simulation

NWS snow model SWE and Melt

Multiple Model Runs with Multiple Traces, Monte Carlo, Stochastic Rainfall Generator

Alert on Retrieved Values

Write GetAccess to Hec-DSS files and automate HEC-Ras unsteady flow routings

Automate HEC-Hms runoff and routings

Automate HEC-ResSim reservoir simulations

Automate EPA-Swmm runoff and routings

Automate GetGraphs, GetNexrad, and .BAT shelling

NWS Forecast for QPF, Temp, Humidity, Wind Speed, Cloud Cover, Flow, Stage

Multi Radar Multi Sensor products from NCEP: A2M and HRRR

Lookup database values in rating tables

Station ID= IF for skipping and selecting computations

Station ID= SET-INTERVAL for changing Batch Interval run time frequency

Rain Gage Wind Speed Adjustment

Using Metric Units

 

 

GetGraphs.exe

Displaying the realtime data and web screens

Setting the Database Connection String:

Right mouse click on a graph or web screen to bring up menu.

Adding Sites to the GetGraphs Setup

Add second series, show both Hourly's and Units

Add Trace from M-Tables

Flood Level Color Bands

Working with Web Screens:

GetGraphs Rainfall Frequencies

GetGraphs Pearson Type III Frequencies

GetGraphs Gage OffOn to Select Radar Adjusting Gages

Automate GetGraphs

 


Program Setup

 

Follow the instructions on the Free Downloads page to download the setup file and run setup.  If you will be using GetGraphs.exe ONLY, then you can skip all the GetRealtime and GetAccess sections and go directly to Working with Web Screens at the end of this document.

 

Note: After installing GetRealtime in C:\Program Files\ you probably want to copy these folders to a working project folder. You only need the C:\Program Files\ install to get the supporting DLL's. After that you can copy the program folders where you want and only update the exe's from the web download.

 

Tip: On Windows 7 or Windows 8 always run these setups as Administrator. From Windows Explorer right click on the setup.exe file and select Run as administrator. 

  

Also, PC's in Canada may have a Date Format that will cause problems. In your Control Panel be sure the Regional Settings for the short date is 'M/d/yyyy'.

 

See the Comments/Questions link at the bottom of this page for more connection help.

 

**********************************

Jump Start Tutorial for GetAccess, GetRealtime, and GetGraphs:

 

1) Download the setup file 'setupGetRealtime.exe' and run to install the programs GetAccess, GetRealtime, and GetGraphs.  If you do not have Microsoft Access or Microsoft Office installed on your computer then you can download the 'Download MSoA Support' zip file from the Free Downloads page, unzip and click on the file ACCESSRT.MSI to install the Access2003 runtime engine on your computer but I think it now comes with Win 7 and above.

 

2) Fire up GetAccess and a list of example stations should be displayed. If the list appears then the GetAccess connection string is correct for this example and you can close GetAccess. If a connection error occurs then click the 'Connection' button and set the connection string to where the example 'GetAccessHDB.mdb' file is located (C:\Program Files\GetRealtime\GetAccess\GetAccessHDB.mdb). The full text line to enter would be:

Driver={Microsoft Access Driver (*.mdb)};DBQ=

C:\Program Files (x86)\GetRealtime\GetAccess\GetAccessHDB.mdb;Uid=Admin;Pwd=Me;

Or better yet (7/15/2012): 

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=

C:\Program Files (x86)\GetRealtime\GetAccess\GetAccessHDB.mdb;User Id=admin;Password=;

Then click the 'SITE LIST' button to display the example stations.  

 

3) Fire up GetRealtime. Set the 'Days' to 7. Click 'Start Realtime Retrieval' button. This will retrieve the past 7 days of daily and hourly values for the 107 stations on the station list and write them to the GetAccess database. If you had to change the GetAccess database connection string in step 2 above, then before retrieving click the 'Select Station from List' button then click the 'Connection' button and set the connection string to what you set in step 2 and be sure to click the 'Save' button after changing the connection string.

 

4) After retrieving the 107 example stations data for the past week in step 3 we can view the data. Fire up GetGraphs and several graphs of rainfall data should be displayed as well as a web page in the upper right. Some of these example stations and web pages may no longer be available. Click on any graph to continue viewing more data graphs. If you had to change the connection string in steps 2 and 3 above then you will have to change GetGraphs connection also. To change the connection string, close GetGraphs, open Windows Explorer, find the GetGraphs installed directory (C:\Program Files\GetRealtime\GetGraphs). Double click on the file 'GetGraphs_setup.txt'. This will open the Notepad editor and you can change the connection string to that of steps 2 and 3 above. Windows Explorer can also be used to edit the setup files in steps 2 and 3 and may be preferred by most users.

 

5) That's it. After following the steps above you now have a good idea of what the rest of this help page is talking about.

**********************************

Getting Started Video:

  

 

 

 

Getting Familiarized with All Three Programs

 

The user should become familiar with each program.  For help, see the Program Use section below.  The sections are:

 

1) GetAccess.exe program use.

2) Creating your own database.

 

3) GetRealtime.exe program use.

4) Creating your own station list.

 

5) GetGraphs.exe program use.

6) Creating your own graph and web site list.

 

One should become familiar with the 3 programs in steps 1, 3, and 5 above, then follow with steps 2, 4, and 6.  After following the examples for creating your own database, station list, and graph list, you will be ready to implement your own sites.

 

The hourly and daily data in the MS Access database should be updated for the past 7 days using GetRealTime.exe after reading the GetRealtime section on how to do that.


Program Use

 

GetAccess.exe

(Managing the database of the real-time and historical web data)

 

Setting the Database Connection String:

 

If GetAccess.exe cannot make a connection to the Access database the following message will appear.  Set the connection string to the GetAccessHDB.mdb database using the Connection button.

 

 

The connection string should look similar to this depending on the path where GetRealtime.exe files were installed:

 

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=

C:\Program Files (x86)\GetRealtime\GetAccess\GetAccessHDB.mdb;User Id=admin;Password=;

 

Note: Although it looks like I'm using the Jet driver with an ADO connection, I'm not. I found it was much faster to just get the mdb file name from the above connection string and use a DAO connection, meaning no driver overhead. I kept the full ADO connection string so as not to confuse others needing other drivers and ADO... but if your Jet connection string has "\\", "//", "|", then ADO is used with your full connection string.  The DAO vs ADO duplicate coding was not trivial but the speed was well worth it.
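As an illustration of the trick described above, here is a hypothetical sketch (not GetRealtime's actual code) of pulling the mdb file name out of the connection string and deciding DAO vs ADO; per the Aug 2023 update, only "//" or "|" force ADO:

```python
# Sketch only: extract the .mdb path from a Jet/ODBC connection string
# so the file can be opened directly (DAO-style, no driver overhead).
def extract_mdb_path(conn_str):
    """Return the path from a 'Data Source=' or 'DBQ=' clause, or None."""
    for part in conn_str.split(";"):
        key, _, value = part.partition("=")
        if key.strip().lower() in ("data source", "dbq"):
            return value.strip()
    return None

def use_dao(conn_str):
    """ADO is used only when the string contains '//' or '|' (assumed rule)."""
    return not ("//" in conn_str or "|" in conn_str)

conn = (r"Provider=Microsoft.Jet.OLEDB.4.0;Data Source="
        r"C:\Program Files (x86)\GetRealtime\GetAccess\GetAccessHDB.mdb;"
        r"User Id=admin;Password=;")
print(extract_mdb_path(conn))  # the .mdb file path
print(use_dao(conn))           # True
```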

 

Update Aug 30, 2023: I've updated my desktop PC to Windows 10 pro and have tested PC to PC wi-fi file sharing that uses the path "\\" and works fine with the faster DAO so like "C:\..." above "\\MyPC\..." will use DAO.

 

Here is some background on ODBC connections and slick settings for an Azure SQL database: 

https://www.youtube.com/watch?v=UNVwOcP4vJs

and

https://www.youtube.com/watch?v=vnlqZcgRLm8

 

  

For Excel, the connection string would look like this:

Driver={Microsoft Excel Driver (*.xls)};DBQ=

C:\Program Files\GetRealtime\GetAccess\GetAccessHDB2.xls

 

Those using Excel as their database should open the GetAccessHDB2.xls file with Excel and read the 'ReadMe' worksheet to learn more about using Excel as a database.

 

Caution: The following is a third option and fraught with difficulties.

I don’t recommend trying this until you have worked with the Access database methods.

 

For an ODBC connection like SQLserver, you will need a driver, a DSN, and a connection string. GetAccess, GetGraphs, and GetRealtime can all use remote ODBC connections. For instance as a local SQLserver Express or as a remote MS Azure SQL database. All 3 can also run on the MS Azure Website cloud with the normal Access database mdb file and may be what you really want for web service info. GetRealtime can Put and Get just the data needed on a remote SQL server also. If you are not familiar with ODBC connections then you can send me an email so I can talk you out of it. ;-)

 

Upsizing your MS Access 2003 mdb file to SQLserver with MS Access will put a security freeze on your old mdb file's objects like the table listing, and add another 80 MB to your computer's RAM startup services for SQLserver Express (and probably max out your RAM as soon as you use it with all its sidekicks. Fixed with SSMS properties, memory limiting RAM usage to 200 MB). I wouldn't know how to back up this vaporware out there amongst the clouds so don't ask. And because of the ODBC ADO connection instead of DAO, running GetRealtime updates will take twice as long. The most important thing is to copy your GetRealtime mdb file to another folder, rename it, and test the upsizing; otherwise you just screwed the pooch on who knows what on your mdb file like I did. And if your MDB file was started before May 1, 2015, you need to add this table of tables to your database, before or after upsizing. They don't use the term 'Upsizing' for nothing... aka FAT SLOW COMPLEX!!!

 

Some gibberish that can help (you need SQL Server Management Studio, a separate install):

http://www.fmsinc.com/MicrosoftAccess/SQLServerUpsizing/index.html

https://www.youtube.com/watch?v=9_3qgAxplW0

  

On the other hand, my GetRealtime and my SQLserver Express database seem to be buzzing right along. Also my Excel 2003 VBA ADO workbooks work just fine by changing just the connection string to SQLserver Express shown below.

 

UPSIZE your Access 2003 MDB file to SQLserver 2008 R2 Express installed:

1) Open mdb file with MS Access and run tools, Utilities, Upsize, New Database/w data. (Don't have MS Access?... try SQL Server Import and Export Wizard on All Programs.)

 

2) From All Programs, run MS SQL Server Management Studio. Connect to your SQLEXPRESS Server name and write this down for step 3. Under Databases you will see your newly created database name. On disk your new SQL database will be in C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLEXPRESS\MSSQL\DATA\*.mdf.

 

3) (Optional for complex settings, see below DSNless) Create a File Driver by opening Control Panel, Admin Tools, Data Sources (ODBC), Select the File DSN tab, ADD, select the SQL Server Native Client 10.0 driver, type in 'GetAccessFileDSN.dsn' or your choosing, Save, Next, and put it somewhere, and get the Server Name from step 2 like: MYCOMPAQ\SQLEXPRESS, and use the defaults and Test Connection. The default disk location is C:\Program Files\Common Files\ODBC\Data Sources\*.dsn.

 

4) In GetAccess_setup.txt, use a connection string like:

Provider=MSDASQL.1;Persist Security Info=False;Extended Properties="FILEDSN=GetAccessFileDSN;WSID=MYCOMPAQ;DATABASE=

GetAccessHDBSQL;Trusted_Connection=Yes"

 

More connection string examples:

See ODBC Drivers:  SQL Server Native Client 10.0 ODBC Driver

https://www.connectionstrings.com/sql-server/

 

Not using a DSN would be done like this: Driver={SQL Server Native Client 10.0};Server=MYCOMPAQ\SQLEXPRESS;Database=GetAccessHDBSQL;Trusted_Connection=yes;

 

5) Run GetAccess.exe and see what happens!!!

 

 

 

The Connection button can also be used to defrag and compact & repair the MS Access 2003 database and make a backup.

 

 

The before and after database compacting file size will be displayed.  The backup database file is named GetAccessHDB_backup.mdb.

 

Defragment or Compact First? If you compact a database after running a defragmenter on your whole disk, you theoretically leave open disk space immediately after the .mdb file on the disk, allowing the operating system to place any additional information in the succeeding physical clusters. This would be very fast. However, if you defragment after running Compact Database, your .mdb file may be placed on the first part of the disk followed by the rest of your files, with no open disk space until the end (the inside tracks) of the disk. This makes disk access somewhat slower.

 

I don't know what's right, but I just use the fragments info to start thinking about Compact and Repair. I never use DeFrag and I have never seen any speed difference regardless of anything here. Compact and Repair provides a backup file so that's good.  GetRealtime in Scheduled Batch has a checkbox for 'HDB Compact' which will compact and repair each day at midnight while running in batch mode and will provide another backup location.

 

 

Once connected, the example sites that are already in the GetAccessHDB.mdb database can be displayed with the SITE LIST button.  The real-time data should be updated for the past 7 days using GetRealTime.exe so that the following examples can be followed.  You will need to get familiar with the programs using the provided examples because you will want to replace the example data with your own data (See Creating Your Own Database section below).

 

Notice that the SQL statement used to get the data from the database is also displayed, in this case:

 

SELECT DISTINCT site_id, station_id, site_name, state, reach_id FROM rsite ORDER BY site_id;

 

Some users may wish to interact with the database by checking the SQL checkbox, entering an SQL statement, and then pressing the GO button.  Normally all the queries needed are provided through the buttons, but SQL can be handy when deleting data from the database: SELECT the data first, then edit the SQL statement to DELETE or UPDATE instead of SELECT and remove the ORDER BY.  WARNING!!!!! DO NOT TRY THIS UNTIL VERY FAMILIAR WITH THE DATABASE AND HAVE MADE DATABASE BACKUPS!!!
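For example (hypothetical datatype_site_id and criteria, using the rhour fields shown later on this page), the safe workflow is SELECT first, then edit:

```sql
-- First SELECT to confirm exactly which rows you are about to touch:
SELECT datatype_site_id, date_time, value, source
FROM rhour
WHERE datatype_site_id = 1211 AND value < 0
ORDER BY date_time;

-- Then edit the SELECT into a DELETE and drop the ORDER BY:
DELETE * FROM rhour
WHERE datatype_site_id = 1211 AND value < 0;
```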

 

 

 

To display retrieved data, select the time step Hour, select a station from the SITE LIST, and press the PARAMETERS w/data button.  The parameters available for the default date range of the past 7 days and the selected Hour time step will be displayed.

 

 

Select the parameter to display by clicking on the parameter row, in this case only one parameter is available for this time step and date range.  Press the GO button and the values in the database table rhour for datatype_site_id, date_time, value, and source are displayed.

 

 

Notice again the SQL statement used for the database query.

 

To view the data as a graph or table press the Graph/Table/HDB button and the following form will be displayed:

 

 

Select GRAPH to view data as a graph.

 


Use the mouse Right Click button for the graph menu.

 

 


Select VIEW and checkbox Table to view data in an hourly table format.

 

 

Select SAVE or PRINT to save or print.


Sometimes bad or missing data will be retrieved and you may wish to edit it.  If several values are missing or bad, then UPDATE database with Excel can be selected to use the abilities of Excel to quickly fix or fill in missing or bad data, if you have Excel installed.

 

The provided Excel Workbook “DBedit.xls” will open and the data values automatically loaded into it.  After editing the data, select the revised data range and press the Update Selection button to write the data to the database.  Since you have changed the hourly values you may wish to edit the daily value.  You can use the Excel auto sum (or average) of selected values displayed at the bottom of the Excel screen to obtain the new daily value.  When finished, close the Excel workbook and do not save the data, or you may save it with a different name if you wish.

 

 

 

Another provided Excel workbook, “DBretrieve.xls”, can be used stand-alone to access the database directly; with it you can both retrieve and edit the database values:

 

 

The connection string cell may need updating to something similar to:

 

Driver={Microsoft Access Driver (*.mdb)};DBQ=

C:\Program Files (x86)\GetRealtime\GetAccess\GetAccessHDB.mdb;Uid=Admin;Pwd=Me;

 

Or better yet: 

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=

C:\Program Files (x86)\GetRealtime\GetAccess\GetAccessHDB.mdb;User Id=admin;Password=;

 

To test the connection, press the id list button and a list of all the data sites and parameters will be displayed.

 

You might find this workbook helpful if you need to create a workbook for processing the real-time data further, such as Crop water use for several crops from ET.

 

You may also be interested in the simple VBA code behind the Retrieve Data button and the Update Selection button.  You can view the VBA code in Excel by using the Excel menu Tools/Macro/Visual Basic Editor.  It may be desirable to change your current Excel workbook by adding similar code.  If you are not familiar with Excel buttons and VBA code then a little Googling of Excel on the web should get you started.

 

There are several more options on the Graph/Table/HDB form:

 

'SAVE' will save values to text file or Excel.

 

'UPDATE database...' will load an Excel sheet for editing and then update the HDB database.

 

'VIEW' will view values as tables and can write Daily values to Monthlys and Yearlys 'Rtable's.

 

'PRINT' will print the values.

 

'DELETE Run...' maintains the 'Mtables'. Don't let the 'Mtables' get out of hand.

 

'DELETE zero...' will delete zero rainfall for the period selected but will not delete any in the last 7 days. GetRealtime can still process rainfall for runoff with the zero values missing so this just saves on the mdb file size.

 

'APPEND current...' will create and update an Access mdb database of the current database. This allows keeping a small current mdb of 10 days or so with an historical backup of all 'Rtable' data.

 

'DELETE data...' will delete all the 'Rtable' data older than so many days. Use this AFTER you 'Append current...'.

 

'GET from...' will retrieve data for all the 'Rtable's for the period selected for that DSID from the _History.mdb.

 

The last one, GET from... is in case you deleted your runoff starting conditions. And 'Rtables' mean ryear, rmonth, rday, rdaymax, rdaymin, rhour, runit, and runitshift. So if you add a NEW site on the current mdb then you need to add it to your *_History.mdb rsite table also.

 

For Azure websites I assume you could download the current MDB from Azure, open with GetAccess and it will create a file *_History.mdb. Then delete your downloaded MDB. Then on the Azure site you DELETE data older than 5 or 10 days depending on when your runoff startup is. Rinse & Repeat once a week. ALL COMPACT & REPAIRS are now automated so you should not have to worry about it ever again. The *\BackUp\*_History.mdb will be under your GetAccess folder. You could put another copy of GetAccess.exe with its setup file in this backup folder for viewing your historical data if ever need be.

 

You could also use 'APPEND' to merge multiple MDB files into the _History.mdb.



Creating Your Own Database… for your own data sites

 

Once you have become familiar with GetRealtime.exe, GetGraphs.exe, and GetAccess.exe with the example stations provided, you will then want to delete all the example data in the database and add your stations of interest.  Don’t delete the data in the table ExampleRsite because you may need to refer to it when adding info to your table Rsite for each type of data you decide you want to add.  Those using Excel as their database should open the GetAccessHDB2.xls file with Excel and read the 'ReadMe' worksheet to learn more about using Excel as a database. 

 

1)      Open GetAccess.exe.  We will be working with the GetAccessHDB.mdb file located in the GetRealtime directory that was used for the examples, so make sure you have a backup copy using Windows Explorer.

2)      We will proceed to delete all the data in the tables EXCEPT FOR THE TABLE EXAMPLERSITE.  You can copy and paste rows from ExampleRsite table to your new Rsite table and edit the appropriate fields if needed or right click on selected GetRealtime_setup.txt lines.

3)      For all the other tables follow this example for deleting all the data in Ryear.  Check the SQL checkbox and enter the SQL statement “DELETE * FROM ryear” and press GO.

 

 

Continue deleting data in all the database tables EXCEPT ExampleRsite.

 

DELETE * FROM ryear;

DELETE * FROM rmonth;

DELETE * FROM rday;

DELETE * FROM rhour;

DELETE * FROM runit;

DELETE * FROM rdaymax;

DELETE * FROM rdaymin;

DELETE * FROM runitshift;

DELETE * FROM rupdate;

DELETE * FROM rsite;

 

Tip: To add the GetRealtime_Setup.txt lines to table Rsite, select a line and then right click to ADD MOST info to table rsite or to update it.

 

The current version of the GetAccess database now includes additional tables for multiple trace modeling scenarios; these begin with the letter 'm'. These tables are the same as the 'r' real data tables you will be using but include 'trace' and 'run_id' fields. Ignore the 'm' tables but do not delete them. They have been included so that you will not have to worry about adding these modeling tables if they become needed.  Multi-scenario computations are described at the bottom of this page.

 

4)      You now need to locate the data you want to retrieve on the web to make sure it exists.  Each supported web source has its own Station IDs and Parameter Codes.  Once you determine the data exist you will also know the parameter code needed to retrieve it.  Here are the methods for finding the needed Station ID and Parameter Code for each supported data source:

 

List of SOURCE codes for the data sources used in the values tables:

0=Canada Wateroffice

1=Wunderground

2=USGS

3=CalDEC

4=USCS Snotel

5=WRITE PROTECT

6=USBR

7=COE

8=CIMIS

9=RADAR

10=ROUTE

11=COMPUTE

12=SNOWMELT

13=FILE

13=real radar rainfall values from GETNEXRAD.exe

14=storm track rainfall Nowcast from GETNEXRAD.exe or 6-hour edit

15=NWS QPF rainfall forecasts from GETNEXRAD.exe

16=SHELL (example Hec-Ras routing)

17=COMPUTE-Get (data from other database)

18=FORECAST-NWS (hourly QPF)

19=AFWS

20=LOOKUP

21=KISTERS

22=

23=MRMS-A2M radar rain rate 2

24=MRMS-HRRR Forecast

25=Spotwx's Forecast-RAP, HRRR, NAM, SREF, GFS, HRDPS, RDPS, GDPS, GEPS

26=PWSweather

27=MesoWest CWOP personal stations

28=NWS forecast of SWE and Melt Rate

 

 

Wunderground weather data https://www.wunderground.com/wundermap

 

Using your web browser, enter the web link.  Click on the WunderMap and you will be able to locate weather stations of interest to you.  You may need to zoom in or out to move around the country and find a weather station.  Once you locate the weather station, click on its symbol.

 

 

The latest version of GetRealtime.exe is using the 'OLD' parameter codes below with the URL query method.  Update 4/1/2020: Wunderground url retrieval is down again so download the scraping version and use the NEW parameter codes below in the table Rsite.

 

You can download a Wundergage station for today using GetRealtime.exe and look at the top line in the file wonderin.txt for the new available parameters. Hopefully they won't change, but should they you can check this file header line.

 

Common changes to Table Rsite parameters:

OLD >> NEW

dailyrainin >> precipTotal

TemperatureF >> tempAvg

Humidity >> humidityAvg

WindSpeedMPH >> windspeedAvg

WindDirection >> winddirAvg

GustSpeedMPH >> windgustAvg

DewpointF >> dewptAvg

PressureIn >> pressureMax

SolarRadiationWatts >> solarRadiationHigh
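As a quick reference, the renames above can be kept as a lookup table; this is a sketch in Python, not anything GetRealtime ships:

```python
# The OLD -> NEW Wunderground parameter renames listed above,
# usable when updating a parameter_code in table Rsite.
WU_PARAM_RENAMES = {
    "dailyrainin": "precipTotal",
    "TemperatureF": "tempAvg",
    "Humidity": "humidityAvg",
    "WindSpeedMPH": "windspeedAvg",
    "WindDirection": "winddirAvg",
    "GustSpeedMPH": "windgustAvg",
    "DewpointF": "dewptAvg",
    "PressureIn": "pressureMax",
    "SolarRadiationWatts": "solarRadiationHigh",
}

def new_param(old):
    """Map an OLD code to its NEW name; unknown codes pass through."""
    return WU_PARAM_RENAMES.get(old, old)

print(new_param("dailyrainin"))  # precipTotal
```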

 

Note July 2, 2018: The Wunderground Airports are no longer available in a url request so I have gone to the metars available at Iowa Mesonet (USA only) or use MesoWest:

 

https://mesonet.agron.iastate.edu/request/download.phtml

 

 Here are the AIRPORT parameter codes you can use in table 'Rsite':

NEW=OLD

tmpf =Temperature F

dwpf =Dewpoint F

relh =Relative Humidity

drct =Direction

sped =Speed MPH

mslp =Pressure MB

p01i =Precip Hourly Inch

vsby =Visibility

The airport date/time will be in your computer's time zone. This is different than non-airport Wunderground stations which are in the stations local time.

 

Also note that the rainfall accumulation parameter dailyrainin/precipTotal is preferred over HourlyPrecipIn because some gages report true hourly rates but others report a lagging moving 60-minute total, and some are really odd, so be safer than sorry and use dailyrainin/precipTotal. GetRealtime.exe was updated 7/2/2012 to convert the accumulating dailyrainin values to unit increments.
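The accumulating-to-increment conversion works roughly like this (a sketch, not GetRealtime's actual code): each unit value is the difference from the previous reading, and a drop back toward zero (the midnight reset of dailyrainin/precipTotal) restarts the accumulation.

```python
def to_increments(cumulative):
    """Convert accumulating daily rainfall readings to unit increments.
    A reading lower than its predecessor is treated as the daily reset."""
    increments = []
    prev = 0.0
    for total in cumulative:
        if total < prev:      # gage reset (new day): start over
            prev = 0.0
        increments.append(round(total - prev, 4))
        prev = total
    return increments

# 0.10 then 0.25 fell, the gage reset at midnight, then 0.05 more:
print(to_increments([0.00, 0.10, 0.35, 0.35, 0.00, 0.05]))
# [0.0, 0.1, 0.25, 0.0, 0.0, 0.05]
```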

 

Next, if GetRealtime.exe is set to Metric=True then use conversion formulas, since Wunderground stations report in Metric or English only.

  

******; *******; *******; ******* WUNDERGAGES ENGLISH***********

IONTARIO851; 11051; Rainfall; Georgetown (SW)

RSITE:

51; 11051; Georgetown(sw); 10; rainfall; in; inches; precipTotal; IONTARIO851; 0; ON

  

then....

  

******; *******; *******; ******* WUNDERGAGES METRIC ***********

COMPUTE-unit; 10051; Rainfall; Georgetown (SW); 0; 0; P1*25.4

RSITE:

51; 10051; Georgetown(sw); 10; rainfall; mm; millimeters; 11051; COMPUTE; 0; ON

 

=============================================

Or combine both steps above without first storing the English values:

******; *******; *******; *******1-Step WUNDERGAGES METRIC ***********

IONTARIO851; 10051; Rainfall; Georgetown (SW); 0; 0; P1*25.4

RSITE:

51; 10051; Georgetown(sw); 10; rainfall; in; inches; precipTotal; IONTARIO851; 0; ON

  

If all your computations are to be Metric then use 'Edit w/Notepad' button and near the top set:   Metric Units=True

And use dailyrainMM precipTotal in table rsite.  But now ALL computations like solar and ET require metric inputs EXCEPT runoff setup coefficients.  You can use checkbox 'Write download text' to save the raw retrieved data to see what the parameter codes being retrieved are named. Canadian Water Office rainfall can be intervals so nothing to change but sometimes Canada rainfall can be year to date totals so you have to use parameter code 9 in table rsite.

Try Code Updates number 137 should Wunderground gages stop downloading again.

See if you can use a MesoWest station before resorting to WU.

 

MesoWest weather data   https://mesowest.utah.edu/

 

Using your web browser, enter the web link.  Click on the Map and you will be able to locate weather stations of interest to you.  Select APRSWXNET/CWOP stations, locate the weather station, click on its symbol and download the graphed data to file to get the Station_ID...(NOT Station Name) and the lat/long/elev.

 

Your GetRealtime_Setup.txt file station line would start with 'MW-' and the station ID:

 

MW-F2665; 10356; Rainfall; Peacock Mtn, SD

 

And table Rsite parameter_codes (precipTotal) are the same as the NEW Wunderground parameters above. You can also view the top line in the file wonderin.txt for the available parameters. Hopefully they won't change, but should they you can check this file header line. You can use the same computation examples as shown for Wunderground data.

 

MesoWest stations from non CWOP stations having a 'precip_accum_one_hour' parameter must use the parameter codes as given, NOT the new Wunderground parameters. See your download file wonderin.txt for these parameters.

 

US Geological Survey stream flow and water quality data:  http://waterdata.usgs.gov/nwis/rt

 

Using your web browser, enter the web link.  Click on the state where the station is located.  Near the top of the screen locate the Predefined Displays.  Using the drop down list select  either Stream flow Table, Precip Table, Reservoir Table, or Water Quality Table.  Hit the Go button to get the list of available Stations.  Click on the desired Station ID number to get a list of the available Parameter Codes.

 

 

The available real-time Parameter Codes for Station ID 094196781 on Flamingo Wash are 65, 60, and 45.

Unit values for real-time data and historical data since October 1, 2007.  Only daily values are available before that.

 

Update 1/18/2014--To change a 15-minute USGS flow time step to 5-minute steps add ', 5-min' on to the USGS station number on the GetRealtime_setup.txt:

 

094196781, 5-min; 1211; Flow; Flamingo Wash at Nellis Blvd

 

This allows adding a forecast to the USGS gage which the USGS gage can overwrite as the observed data becomes available. Why?... because now you can route and combine gaged flows with other sites' 5-minute time-steps.
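How the 15-minute values get spread onto 5-minute steps isn't spelled out here; a plausible sketch (assumed linear interpolation, which may differ from GetRealtime's actual fill) is:

```python
def to_5min(flows_15min):
    """Linearly interpolate 15-minute instantaneous flows to 5-minute steps.
    (An assumption about the ', 5-min' option, for illustration only.)"""
    out = []
    for a, b in zip(flows_15min, flows_15min[1:]):
        for k in range(3):                 # 0, 5, 10 minutes past 'a'
            out.append(a + (b - a) * k / 3.0)
    out.append(flows_15min[-1])            # keep the final observed value
    return out

print(to_5min([30.0, 60.0]))  # [30.0, 40.0, 50.0, 60.0]
```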

  

To get a USGS stations current rating tables you can use for GetRealtime's LookUp's: http://waterdata.usgs.gov/nwisweb/data/ratings/exsa/USGS.02458300.exsa.rdb

You may have to delete some rows to get down to GetRealtime's 2000 row limit.


 

California Data Exchange Center stream flow, reservoir, snowpack, and meteorological:  http://cdec.water.ca.gov/cgi-progs/staSearch

 

Using your web browser, enter the web link.  Locate the desired Station by checking River Basin and selecting the basin of interest, in this case American R:

 

 

From the list of available Station IDs select FOL for Folsom Dam and you will get a list of available parameter names and time steps.  The available hourly time step parameters are:

 

RESERVOIR ELEVATION, feet

(hourly)

DATA XCHG-USBR

From 01/26/1993 to present.

RESERVOIR INFLOW, cfs

(hourly)

DATA XCHG-USBR

From 12/09/1993 to present.

RESERVOIR OUTFLOW, cfs

(hourly)

DATA XCHG-USBR

From 12/09/1993 to present.

RESERVOIR STORAGE, af

(hourly)

DATA XCHG-USBR

From 06/24/1994

 

But this still doesn’t get us the Parameter Codes.

 

 


This link will give you a list of Parameter Codes:  http://cdec.water.ca.gov/misc/senslist.html

 

The numeric Sensor No is the Parameter Code we are after:

 

For our example of Folsom Dam reservoir elevation, the Station ID is FOL and the Parameter Code is 6.

  

FOL; 3402; Elevation; Folsom Dam, Ca

402 3402 Folsom Dam, Ca 3 elevation ft feet 6 FOL 4 CA

Note: CDEC download data is in PST standard time (see 'Write download.txt'). CDEC interactive web page tables may erroneously show PST data as PDT, so watch out.

 

 


US Conservation Service SNOTEL snowpack, and meteorological data

  

SNOTEL Site List for StationID:

 https://wcc.sc.egov.usda.gov/nwcc/yearcount?network=sntl&counttype=statelist&state=

 

Using your web browser, enter the web link for a list of available Station IDs.  The numeric Site ID is the value we are after.  For Alexander Lake, AK, the Site ID is 1267 on site_name.

 

The available SNOTEL parameters are usually Snow Water Equivalent, Snow Depth, Precipitation, and Temperature.

  

Parameter Codes:

AIR TEMPERATURE OBSERVED = TOBS

PRECIPITATION ACCUMULATION = PREC

SNOW DEPTH = SNWD

SNOW WATER EQUIVALENT = WTEQ

 

Knowing the Station ID, state, and parameter codes, we can add them like this.

GetRealtime_setup.txt examples:

341; 17138; Temperature; Big Red Mountain, Uscs Snotel 6050 FT

341; 21138; Snow Depth; Big Red Mountain, Uscs Snotel

341; 23138; Snow Water Content; Big Red Mountain, Uscs Snotel

341; 10138; Precip; Big Red Mountain, Uscs Snotel

 

(NOTE: Use data_type 9 in rsite and Datatype_Site_ID 10xxx to convert Total to Increments)

  

The table rsite setups:

 

You can manually check your SNOTEL parameters and data here:

https://wcc.sc.egov.usda.gov/reportGenerator/

SNOTEL data are reported in PST, Pacific Standard Time (UTC-08:00), for sites in all western states except Alaska. Alaska SNOTEL sites report in YST, Yukon Standard Time (UTC-09:00).

 


US Bureau of Reclamation reservoir, stream flow, evapotranspiration and meteorological:   There are three USBR regions providing real-time data, Great Plains Region, Pacific Northwest Region, and Lower Colorado River. 

 

Starting with the Great Plains Region’s Hydromet:

http://www.usbr.gov/gp/hydromet/station_list_by_state.cfm

 

Using your web browser, enter the web link for a list of states covered.  Choose a state from the second list, Real-Time values (one value every 15, 30, or 60 minutes).  For this example North Dakota is chosen.

 


The available stations and Parameter Codes are listed for North Dakota:

 

 

For Jamestown Reservoir, the Parameter Code for Reservoir Elevation is FB, but we still need the Station ID.  Click on the Jamestown Reservoir link to find the Station ID:

 

In this example, the Station ID for Jamestown Reservoir is JAMR.

The USBR provides both 15-minute and 60-minute time steps.  Some values are only available as 60-minutes.  Using your web browser retrieve the data and see which time step you want and then the Parameter Code you will use is 15-FB or 60-FB for reservoir elevation.

 

 

The Pacific Northwest Region Hydromet and Agrimet:

http://www.usbr.gov/pn/hydromet/decod_params.html

 

 

Both the Station ID and available Parameter Codes are listed... but it’s a long list.  For air temperature at station Afton Wyoming, the Station ID is AFTY and Parameter Code is OB.  You may wish to look around on the USBR Hydromet and Agrimet web pages for easier ways to find stations.  Try retrieving the data to determine the time steps available.

 

The USBR provides both 15-minute and 60-minute time steps.  Some values are only available as 60-minutes.  Retrieve the data and see which time step you want and then the Parameter Code you will use is 15-OB or 60-OB.

 

USBR  Lower Colorado Basin Region

The USBR Lower Colorado River Region provides hourly values updated each hour at the sites below.

 GetRealtime_setup.txt could look like this:

 

USBRLC; 3312; Elevation; Senator Wash Reservoir

 

and table Rsite like this:

 

312; 3312; Senator Wash Reservoir; 3; elevation; ft; feet; elevation; Senator; 3; AZ; -1; 

 

Note that all USBRLC times are Arizona Mountain Standard Time (no daylight saving), so use state code AZ for sites in California and Nevada and adjust your Rtable time shift to suit your location each spring and fall.

 

USBR Lower Colorado JSON Real-time Data_Sites and Data_Types:

Select a partial Name and partial DataType that are unique (case matters):

example: Rtable site_id= Senator   and   Rtable parameter_code= elevation

 

Lake Mead,DataTypeName:storage, end of period reading

Lake Mead,DataTypeName:average power release

Lake Mead,DataTypeName:reservoir ws elevation, end of period primary reading

 

Lake Mohave,DataTypeName:storage, end of period reading

Lake Mohave,DataTypeName:average power release

Lake Mohave,DataTypeName:reservoir ws elevation, end of period primary reading

 

Lake Havasu,DataTypeName:storage, end of period reading

Lake Havasu,DataTypeName:average power release

Lake Havasu,DataTypeName:reservoir ws elevation, end of period primary reading

 

Senator Wash,DataTypeName:storage, end of period reading

Senator Wash,DataTypeName:reservoir ws elevation, end of period (secondary reading)

 

Imperial Dam,DataTypeName:reservoir ws elevation, end of period primary reading

 

Colorado River above Laguna Dam,DataTypeName:reservoir ws elevation, end of period

Colorado River Below Laguna,DataTypeName:average gage height

Colorado River below Laguna Dam,DataTypeName:average flow

 

Brock Reservoir,DataTypeName:storage, end of period reading

Brock Reservoir,DataTypeName:average total release (sum of all methods)

Brock Reservoir,DataTypeName:average inflow

 

Colorado River Below Big Bend,DataTypeName:average flow

Colorado River Below Big Bend,DataTypeName:average gage height

Colorado River Below Needles Bridge,DataTypeName:average flow

Colorado River Below Needles Bridge,DataTypeName:average gage height

Colorado River at River Section 41,DataTypeName:average flow

Colorado River at River Section 41,DataTypeName:average water level, as measured by

Colorado River at Parker Gage,DataTypeName:average flow

Colorado River at Parker Gage,DataTypeName:average gage height

Colorado River at Water Wheel,DataTypeName:average flow

Colorado River at Water Wheel,DataTypeName:average gage height

Colorado River Below Interstate Bridge,DataTypeName:average flow

Colorado River Below Interstate Bridge,DataTypeName:average gage height

Colorado River Below McIntyre Park,DataTypeName:average flow

Colorado River Below McIntyre Park,DataTypeName:average gage height

Colorado River at Taylor Ferry,DataTypeName:average flow

Colorado River at Taylor Ferry,DataTypeName:average gage height

Colorado River Below Oxbow Bridge,DataTypeName:average flow

Colorado River Below Oxbow Bridge,DataTypeName:average gage height

Colorado River at Cibola Gage,DataTypeName:average flow

Colorado River at Cibola Gage,DataTypeName:average gage height

Colorado River at Picacho Park,DataTypeName:average gage height

Colorado River at Picacho Park,DataTypeName:average flow

Colorado River at Martinez Gage,DataTypeName:average gage height

Colorado River at Martinez Gage,DataTypeName:average flow

Colorado River at Yuma Gage,DataTypeName:average gage height

Colorado River below Yuma Main Canal Wasteway - Yuma Gage - USGS,DataTypeName:average flow

Colorado River at NIB Gage,DataTypeName:average flow

Colorado River at NIB Gage,DataTypeName:average gage height

 

USBRLC Scheduled Releases can be downloaded from their USBR 'model=2' hydrologic database using Station Datatype ID (SDI): Example SDI's:

SDI 14646: HEADGATE ROCK DAM / LAKE MOOVALYA - AVERAGE BYPASS RELEASE in CFS

SDI 14650: HEADGATE ROCK DAM / LAKE MOOVALYA - AVERAGE POWER RELEASE in CFS

GetRealtime_setup.txt:

USBRLC-14646,14650; 1601; Scheduled Flow; Colorado R at Parker (headgate power+bypass)

GetAccess Rtable:

Schedules are assumed to use one of your existing history rtable lines (in my case here, 1601, the flow gage history). Retrieve schedules before histories.

If two or more SDI's are given GetRealtime will sum them. For additional SDI's contact Boulder Canyon Operations Office: bcoowaterops@usbr.gov.

 

US Army Corps of Engineers reservoirs, canals, streams and weather data:   http://www2.mvr.usace.army.mil/WaterControl/new/layout.cfm

 

 

 

Find your station of interest and display the data in order to determine the station ID code and parameter name:

For this example, you would take the Station ID from your web browser's URL address. The Station ID is highlighted above and is 'CCLK2'. The parameter code is highlighted at the top of the data fields table and is 'Stage', which is case sensitive. You will use these two codes in the GetRealtime_setup.txt and your GetAccess HDB rsite table:

  

 GetRealtime_setup.txt:  (Add COE- to Station ID)

 

GetAccess HDB rsite table:  (But not here, just CCLK2)

 

 

 

California Irrigation and Management Information System evapotranspiration and meteorological:   http://wwwcimis.water.ca.gov/cimis/infoStnMap.jsp

 

 

Find Station ID 80 in the San Joaquin area and click on it:

 

We have the Station ID 80.  The Parameter Codes are on the following table:

 

Parameter Code

Name

Units

1

Station Id

 

2

Date

 

3

Hour

 

4

Julian Date

 

6

Reference ETo

inches

8

Precipitation

inches

10

Solar Radiation

Langleys/day

12

Vapor Pressure

millibars

14

Air Temperature

F

16

Relative Humidity

%

18

Dew Point

F

20

Wind Speed

mph

22

Wind Direction

degrees

24

Soil Temperature

F

31

Wind Run

miles

 

ETo for the station Fresno has a Station ID of 80 and a Parameter Code of 6.

Getrealtime_setup.txt:

CIMIS-080; 27405; ET Grass; Fresno, CA

Table rsite:

405 27405 Fresno, CA 27 ET inches inches 6 CIMIS-080 4 CA

 

 

NWS AFWS Precip Gages -- The Automated Flood Warning System rain gages can be located on their map at:

   http://water.weather.gov/afws/

  

You will need the gage ID for your setup. For example the GetRealtime_Setup.txt line could be:

 

AFWS-wprn7-pdt; 10910; Precip; Westport Golf Course, NC

 

Where wprn7 is the gage ID and pdt is my time zone ID (Pacific Daylight Time), even though this gage in North Carolina is in the Eastern Time Zone. I'm not sure what happens when PDT turns to PST, so I will have to update this soon. I don't know how to get the Lat/Longs of the gages yet, but I will update this when I do.

 

The GetAccess table rsite example line is:

 

For AFWS sites with stage, then a stage setup could be:

 

AFWS-dscv2-pdt; 2911; Stage; Unnamed Trib at Dulles Station, VA

 

where the datatype_id= 2 and parameter_code= Stage.

 

KISTERS QueryServices - Request Information:  KISTERS database can be accessed by URL like: http://kiwis.grandriver.ca/KiWIS/KiWIS?service=kisters&type=queryServices&request=getTimeseriesValues&datasource=0&format=Ascii

&ts_id=12460042&from=2015-10-23&to=2015-10-24

 

GetRealtime.exe will replace the needed info in this URL example for a current retrieval.

GetRealtime_setup.txt:

KISTERS; 10049; Rainfall; Frostback Falls, ON

 

Note that some servers are http: and some https: so the whole server needs to be in table rsite's station_id like:  http://kiwis.grandriver.ca

GetAccess table Rsite:

Note the ts_id goes in parameter_code and http://kiwis.grandriver.ca goes in station_id.

If a username and password are needed, then the table rsite's parameter_code would be: 12460042,myname,mypassword

 

The Kisters Zulu time will be automatically converted to your PC's local time.

 

 

NOAA NEXRAD WSR-88D class radar imagery--These images are provided free on the web and are updated about every 5 minutes:   http://radar.weather.gov/

 

 New GetNexrad 3.6.0--Iowa State University Mesonet historical images can now be directly loaded for a loop of up to the past 24 hours for Ridge2.  Historical radar images beginning Jan 1, 2012 can be downloaded for 24 hours, 1 day at a time.

 

Update May 20, 2022 GetNexrad 4.4.1--The NWS has discontinued the N0Q product sent to Iowa Mesonet and replaced it with N0B. I don't see the point of this N0Q to N0B update but... Update your N0Q png image to N0B png image using GetNexrad.exe. If the world files *Q.pgw and *B.pgw in your \GetNexrad folder look the same, you can just rename all your Boundary and Point files *Q.txt to *B.txt. Or you can redo them with LatLongPixels.exe and LatLongPixelsFromFile.exe, which were updated May 20, 2022.

Like this:

NexradBoundaryYUX-10101Q.txt >>> NexradBoundaryYUX-10101B.txt
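The bulk rename can be scripted; a minimal Python sketch (it assumes the Q and B world files really are identical, as noted above, so the old boundary/point pixels still apply):

```python
import glob
import os

def rename_q_to_b(folder):
    """Rename all Nexrad boundary/point files *Q.txt to *B.txt in
    one folder.  Sketch of the manual rename described above; check
    the *Q.pgw and *B.pgw world files match before running."""
    renamed = []
    for path in glob.glob(os.path.join(folder, '*Q.txt')):
        new_path = path[:-5] + 'B.txt'   # swap the trailing Q for B
        os.rename(path, new_path)
        renamed.append(new_path)
    return renamed
```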

 

Iowa Mesonet N0B: https://mesonet.agron.iastate.edu/onsite/news.phtml?id=1431

The new high resolution US mosaics still use the N0Q name.

 

Typical GetRealtime_setup.txt line for more modern N0Q radar:

 

NEXRAD-ESX; -10363; Rainfall; Burro Creek Sub 3; 0

Where 0 is for convective dbz to rain.

Table rsite would look like this:

AND NEVER APPLY TIMESHIFT TO RADAR.  It's automatically set to your PC's time, and for history, daylight saving periods are adjusted.

 

Radar Images From File, like NOAA Weather and Climate Toolkit KMZ files, can be read using these steps. You may want to use GetNexrad first to see what you've got, and follow the GetNexrad help instructions for KMZ files.

 

Level II is 0.25 km x 0.5 degree x 0.5 dbz, and you would catch hell from the ROC for misstating this! This includes both reflectivity and velocities. N0Q averages the Level II data into 1 km grids, and from my comparisons N0Q may be a slightly better choice for adjusted rainfall due to hitting a gage from 20,000 ft with a 1 km vs 0.25 km shot.

 

To read radar images from file:

1) Be sure the radar image folder has the boundary and point file for the station or you can add the boundary and point file location to your GetRealtime_setup.txt file like this:

Radar Boundary File folder=C:\GetBMX\GetRealtime\N0R\

2) Select the radar station(s) from the GetRealtime station list.

3) Check 'Scheduled Batch' and select option 'Read radar from file'

4) Uncheck 'Scheduled Batch'

5) Click 'Start Retrievals' and select radar images to read.

Note that KMZ universal Zulu file times will be converted to your time and adjusted for daylight savings.

 

If you have more than 4 or 5 days of radar images to select, then you will have to create a file list of the radar png's. I would imagine you could process 1 to 2 million radar images at a time using this file list (about 10 to 20 years), but you should break the list at the daylight saving start and end dates because GetRealtime only checks the start date.  0_* source png's are already time zone and DST converted, so for those this doesn't matter.

 

 To create a file list of the radar names:

1) Open a console window.

2) Change directories to the radar folder.

3) Enter the command line: dir 0_* /b > filelist.txt

4) Exit.

Your radar file list name filelist.txt should look like this:

 

0_ESX_N0Q_201506160907_0.000.png

0_ESX_N0Q_201506160916_0.000.png

0_ESX_N0Q_201506160926_0.000.png

0_ESX_N0Q_201506160936_0.000.png

0_ESX_N0Q_201506160946_0.000.png

Use the 'Scheduled Batch' menu as above, also check 'File List', and then uncheck 'Scheduled Batch' to watch.  If you need to include multiple paths for your radar images, then you can include the full path with each file name, but you will have to sort the file list by date yourself.
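One way to date-sort a multi-path file list is to key on the timestamp embedded in the radar file names. A sketch, assuming names follow the 0_XXX_N0Q_YYYYMMDDHHMM_... pattern shown above:

```python
import os

def sort_filelist(paths):
    """Sort radar image paths by the YYYYMMDDHHMM stamp that is the
    4th underscore-separated field of the file name, so a file list
    built from several folders ends up in date order."""
    def stamp(p):
        return os.path.basename(p).split('_')[3]
    return sorted(paths, key=stamp)
```

Write the sorted list back out with one path per line, same as the dir /b output.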

 

The rsite table parameter N0Q-Ridge2 has to be correct for real-time downloads. It tells GetRealtime to download N0Q images, and Ridge2 means add png instead of gif to the image request. Not so (the rsite parameter is not used) when selecting files on disk or using a file list; everything is determined from the file name, whether saved files, KMZ, or other sources. The point file is then selected based on DSID, ESX, N0Q and is expected to be in the same folder as the radar images... unless you use 'Radar Boundary File folder=' in setup. And for huge file list runs, close the radar image view window for paint-blistering speed.

 

If your GetRealtime_setup.txt has the 'Radar Boundary File folder=...' then the real-time latest radar image will be saved to this boundary folder so you should point GetNexrad.exe to this boundary folder to view current radar with boundaries. And you can delete your boundary and point files in the GetRealtime.exe folder to clean things up.

  

The radar imagery provides for computation of area average rainfall amounts for any area in the USA.  This is a new addition and is described at the bottom of this page.

 

Canada Weather Radar:  The 31 Canadian C band radars have been added and are accessed just like NOAA radar data. The product code is "C0R" (that's C-zero-R, not C-oh-R) for 0.3 degree base reflectivity (10-minute, 1 km, 2.5 dbz). The radar images are available for the past 48 hours. Historical radar images can be downloaded by hand from http://climate.weather.gc.ca/radar/index_e.html and GetRealtime can read them from file (you may have to reformat the filename).  The Canada radar images also come in higher resolution at the lower reflectivities for snowfall. If you are interested in constant altitude precip scans called CAPPI, then you can use GetNexrad to load those, but I would stick to the base 0.3 reflectivity unless ground clutter near the radar site is a problem.

 

New Canada radar IDs Nov 22, 2020: RadarSites.txt

Canada Radar has 3 different reporting types:

1) New S band 6 minutes id= Sxx

2) Old C band 10 minutes id=Wxx, Xxx

3) Composite 10 minutes id= Sxx, Wxx, Xxx

 

For composite radar images, add '-comp' to the Station like NEXRAD-SFW-comp. One dumb thing about composite images is that they used the same red color for the range circles as used for the red rain color, not too bright hey! If your radar boundary intersects these dumb lines, then you can manually adjust your point file to omit them or write your congressman.

 

GetRealtime_setup.txt Canadian RAIN example:

NEXRAD-WHK; -10980; Rainfall; Edmonton, AB  

  

GetRealtime_setup.txt Canadian SNOW example:

NEXRAD-WHK; -10980; Rainfall; Edmonton, AB; 1

 

GetAccess table 'rsite' setup example:

 

If you want to mix Canada 10-minute radar with Nexrad 5-minute radar for the same runoff routing time step then add -5min to your setup like this:

NEXRAD-WHK-5min; -10980; Rainfall; Edmonton, AB

 

Why RAIN and SNOW radar products??? Beats me, but I bet it is similar to choosing Convective or Cool Season Z-R's in the US. You are living on the cutting edge up there, so figure it out yourself.  And yes, GetRealtime does snow accumulation and melt for you lucky guys up there.

  

 

 

Canada Streamflow:  The real-time streamflow and water level data collected at over 1600 stations across Canada have been added to GetRealtime.  I have automated the Wateroffice 'I Agree' button on their Disclaimer Screen, which says you must use their real-time stream data with caution at your own risk. In other words, it is provisional and subject to revision.  Winter ice in particular can affect water levels and cannot be accounted for in real time.

 

Update Mar 14, 2017: Canada full streamflow data is no longer supported. Permission denied!!!  Only parameters 46 stage and 47 flow are supported for the past 3 days.

 

Update 12/14/2014: The Water Office has changed some things, so you may have to use your web browser to manually access 1 gage to prime the cookies pump. If the cookie expires, you may have to repeat. Unlike everybody else in the water data business, Canada won't let just any taxpaying fool access their data without consent of their lawyer, and doles out cookies to us poor beggars.  Also, I believe the flow and G.H. parameter codes have changed to those below.

 

GetRealtime_setup.txt examples for streamflow and water level:

02HC025; 1984; Flow; HUMBER RIVER AT ELDER MILLS [ON]

02HC025; 2984; Gage Height; HUMBER RIVER AT ELDER MILLS [ON]

 

GetAccess table 'rsite' setup example:

 

So far I have found these parameter codes:

46=gage height (M)

47=flow (CMS)

1=air temperature (C)

5=water temperature (C)

18=total rainfall (MM)

19=increment rainfall (MM)

21=conductivity (S)

25=turbidity (JTU)

28=humidity (%)

34=wind direction (deg)

  

You can convert total rainfall parameter code 18 (datatype 9) to incremental rainfall by using datatype=10 but that can be risky if the total rainfall has diurnal fluctuations.  Wind direction is converted from degrees to 0-4.

  

The SpotWx forecast models are all now available.

SpotWx

Update Sep 16, 2020: SpotWx may require the user to manually save data to an html file and enter the file name in setup's formula1 field.  You could SHELL your web browser and then save the html to file.

 

****; **; **; **HRRR Forecast Spotwx Run Manually with Your Web Browser**

 

SHELL-"C:\Program Files\Internet Explorer\iexplore.exe" "https://spotwx.com/products/grib_index.php?model=hrrr_wrfprsf&lat=29.61&lon=-95.38&tz=America/Chicago&display=table"; 0; Webbrowser; Save Spotwx to File

 

Check the model types here: https://spotwx.com

 

Source code 25= Spotwx Forecast RAP, HRRR, NAM, SREF, GFS, HRDPS, RDPS, GDPS, GEPS

 

RAP and HRRR are updated hourly and will be checked every half hour, the other models will be checked every 6 hours.

 

GetRealtime_setup.txt:

FORECAST-RAP; -11348; Forecast Rain; Burro Cr; 0; 34.71,-113.17, 6

   or

FORECAST-RAP-10min; -11348; Forecast Rain; Burro Cr; 0; 34.71,-113.17, 6; C:\GetRealWunder\SpotWx Tabular model data.html

 

The forecast ending-hour values will be used to fill the preceding 5-minute steps, or your duration if added like -10min. The forecast is time shifted to your PC's time zone, so no time shift is needed. The ending 34.71,-113.17 are the lat/long for your forecast point.  If you have multiple subs within a few miles, use the same lat/long for all the subs so you don't have to download the same forecast multiple times.  The ending 6 is how many hours of the forecast to use. If missing or 0, then all forecast hours are included.

 

Because you will probably be writing the forecast to your adjusted radar DSID, no additional rsite table info is needed. If not, you need the usual rsite info with anything for parameter_code and station_id.

 

The datatype_id's are listed in GetRealtime_datatypes.txt:

TMP=17 DPT=16 RH=18 WS=28 WD=19 WG=20 APCP=10 or 11

  

I don't have a dog in this, but RAP is updated hourly while HRRR is always 2 hours behind. Make of it what you want.  In fact, I'm so happy with 15-minute HRRR radar images from Iowa and the NWS 7-day QPF that I don't use any of this stuff.  And of course I use my radar Nowcasts to replace hours 1-3.  Actually, SpotWx is so much faster than downloading all the HRRR images that I think I will start using SpotWx HRRR instead.

  

 

Your Personal Weather Station or Other Text Files--Text files from your weather station can be added to the Access database.  The text file must have date, time, and value columns and can be Space, Comma, SemiColon, or Tab delimited. Below is an example of a Davis weather station day file:

 

  

If your date format begins with the day, like dd/mm/yy as above, then you will need to add a date format to the GetRealtime_setup.txt file base1 field like this: (If your date does not begin with dd, uses '/', and is in column 1, then nothing is needed.)

 

The Station_ID cell contains 'FILE-n' where n is the data column to write to the database. The Date is column 1, Time is column 2, and so the first data column would be 3... etc. The number and content of the header lines are not used and can be whatever, or none at all, as long as the first non-space character is not numeric.  Also, if the error "No data for period" is returned, try FILE-2 or 1 less or more than you think, check that the values returned are the column you want, and adjust FILE-n accordingly.
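To see how the FILE-n column counting works, here is a rough Python sketch of the parsing rules just described (skip header lines whose first character is not numeric, split on any of the four delimiters, date in column 1, time in column 2, data column picked by n). The function name is made up for illustration:

```python
import re

def read_file_column(lines, n):
    """Return (date, time, value) tuples from a delimited text file,
    taking the value from column n (columns counted from 1, so the
    first data column is 3).  Splits on spaces, commas, semicolons,
    or tabs, matching the delimiters listed above."""
    rows = []
    for line in lines:
        parts = [p for p in re.split(r'[ ,;\t]+', line.strip()) if p]
        if not parts or not parts[0][:1].isdigit():
            continue  # header line: first character is not numeric
        rows.append((parts[0], parts[1], float(parts[n - 1])))
    return rows
```

If the values look wrong, nudge n up or down by one, just as the FILE-n advice above suggests.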

 

The GetAccess HDB database table 'rsite' below has the text file name in the 'parameter_code' and the 'station_id' is just 'FILE':

 

Example for precip accumulation total that resets or not and converts total to increments:

1) put ...\MyData.txt in rtable parameter_code

2) put 9 in rtable datatype_id and use 10xxx as your datatype_site_id.

3) GetRealtime_setup.txt line reads:

FILE-7; 10875; Rainfall; Test; m/d/yyyy

4) Set Days to 20 or whatever to cover your period of interest.

5) Run and now open the created file wunderin.txt with Excel (has no carriage return) to verify what column is being read.

 

 

Cumulus from Sandaysoft documentation says its monthly log file (Aug09log.txt) gets updated about every 10 minutes:

 http://wiki.sandaysoft.com/a/Monthly_log_files

 

So GetRealtime should be able to read their (;) or (,) delimited file as well... but remember the format is 'dd-mm-yy' or 'dd/mm/yy' and has to be added to the GetRealtime_setup.txt because of this screwy dating convention... just like Davis.

 

EasyWeather.dat file:

The EasyWeather data file has its date in the 3rd column, so the base1 cell in GetRealtime_setup.txt would look like this:  yyyy-mm-dd, 3

 

You may find that many of your weather data types are not listed below in the table of datatype_id's. You can select any datatype_id that suits your averaging/accumulation/totaling needs and then change the unit_name to whatever you would like.  You may need more than one site_id if you still need more datatype_id's or want to use the same datatype_id more than once.

 

MS Excel data:

Copy from Excel, paste, and save your text file with Notepad. As an example if only a date and 1 value column were being saved then:

 

Getrealtime_setup.txt:

FILE-1; 1101; flow; Test Using Excel Text File

In table Rsite you need to put the text file name in parameter_code and FILE in station_id just like above.

 

The GetAccess Excel files DBedit.xls, DBretrieve.xls, and the More Stuff GetFlowRecord.xls can also be used to manually add and retrieve GetAccess database data and can show beginners how to get started with VB database code using Excel.

 

 

HEC-DSS files using their MS Excel HEC-DSS Add-In--Write Excel files with HEC-DSS header info for storing to HEC-DSS files using the Excel HEC-DSS Add-In.  You can also write HEC-DSS data to GetAccess by using the MS Excel HEC-DSS Add-In to read the data into DBedit.xls and then writing the data to GetAccess.  Update 1/8/2014--GetRealtime can Put and Get HEC-DSS files directly. See below.

 

The MS Excel HEC-DSS Add-In can be downloaded for free here: http://www.hec.usace.army.mil/software/hec-dss/excel/

  

 

END OF SOURCES

 

====================================== 

5)      You are now ready to add your new station name and data type to the table Rsite, but first you will need to determine the new station’s Site_id and Datatype_id.  These 2 values will then be used to create a unique Datatype_Site_ID that is stored with each retrieved value.

 

Site_Id's= 0^00 where 0=area or reach, 00=count in reach.  Site_Id's are 0 to 999.

 

Datatype_id 1= flow, cfs

Datatype_id 10= rainfall, inches, etc,… (See table of Datatype_id’s below)

 

Datatype_Site_ID (DSID) = 00^000 where 00=datatype_id and 000=Site_id.

Valid DSID’s are integers in the range -32,768 to 32,767

 

Example: DSID 10212 = rainfall, area 2, 12th station

Example: DSID 1212 = flow, area 2, 12th station

Example: DSID -1212 = Computed flow, area 2, 12th station

 

Use negative DSID's for values that are computed by GetRealtime.exe to keep them separated from the web source reported values and also to note it has been computed.

 

You might use another method for creating the Site_id and associated unique DSID but remember the valid DSID integer range -32,768 to 32,767.
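The DSID arithmetic above can be sketched in a couple of Python helpers (illustration only; the function names are made up here):

```python
def make_dsid(datatype_id, site_id, computed=False):
    """Combine a datatype_id and a site_id (0 to 999) into the unique
    Datatype_Site_ID described above; negative marks a value computed
    by GetRealtime.exe.  Must stay in the range -32,768 to 32,767."""
    dsid = datatype_id * 1000 + site_id
    if computed:
        dsid = -dsid
    if not -32768 <= dsid <= 32767:
        raise ValueError('DSID out of 16-bit integer range')
    return dsid

def split_dsid(dsid):
    """Recover (datatype_id, site_id) from a DSID."""
    d = abs(dsid)
    return d // 1000, d % 1000
```

Note that high datatype_id / site_id combinations (e.g. datatype 32 with site 999) overflow the 16-bit range, which is exactly why the text above says to split a station across extra site_id's.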

 

Datatype_id’s are associated with formatting and averaging methods.  If more than 32 datatypes (64 if negatives are used) are needed, then you can break up your station into 2 or more site_id's.

If you are retrieving metric values or converting English to metric, then it's OK to change the unit_names to metric and, for that matter, even the datatype_name.  Just remember that datatype_id's have certain rounding and averaging methods.  Bearing this in mind, you could make your own datatype table with completely new parameters and units.  If anyone would like to share their alternative datatype_id table, just add it with the comments button below or email it to me and I will add it here.


                        Table of datatype_id’s

datatype_id datatype_name unit_name rounding averaging
1 flow cfs usgs avg
2 gage height ft 0.00 avg
3 elevation ft 0.00 ending
4 contents kaf usgs ending
5 contents % capacity 0.0 avg
6 specific conductance umhos usgs avg
7 water temperature °f 0.0 avg
8 ph std units usgs avg
9 total precip inches 0.00 ending
10 rainfall inches 0.00 increment
11 rainfall inches 0.00 increment
12 turbidity fnu usgs avg
13 dissolved oxygen % saturation usgs avg
14 tds mg/l usgs avg
15 pressure in hg 0.00 avg
16 dew point °f 0.0 avg
17 air temperature °f 0.0 avg
18 humidity % saturation usgs avg
19 wind direction 0.25 weighted
20 wind gust mph usgs avg
21 snow depth inches 0.0 avg
22 absolute humidity g/m3 usgs avg
23 snow water content inches 0.00 ending
24 inflow cfs usgs avg
25     NONE avg
26 cloud-amount % of avg 0.0 avg
27 ET inches 0.00 increment
28 wind speed mph usgs avg
29 solar radiation Langleys/day usgs avg
30 runoff cfs usgs avg
31    NONE avg
32 nexrad maximum what ever usgs avg

 

Always use 10xxx for rain gages, -10xxx for radar, -11xxx for adjusted radar and 11xxx for snowmelt. These datatype_id's are used by GetNexrad for proper display.

 

  10xxx = Rainfall

-10xxx = Radar

-11xxx = Adjusted Radar

  11xxx = Snowmelt

  

To convert total rainfall datatype_id=9 to incremental rainfall use datatype_site_id=10xxx with datatype_id=9.
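A sketch of that total-to-increment conversion in Python. The reset handling shown (a drop in the total is treated as a gage reset, with the new total taken as the increment) is an assumption for illustration; GetRealtime's own logic may differ:

```python
def totals_to_increments(totals):
    """Convert an accumulating precip trace (datatype_id=9) to
    incremental rainfall.  The first value gets 0.0 since the prior
    total is unknown; a drop in the total is assumed to be a reset
    to zero, so the new total becomes the increment."""
    increments = []
    prev = None
    for t in totals:
        if prev is None:
            increments.append(0.0)          # no prior total
        elif t < prev:
            increments.append(round(t, 5))  # assumed gage reset
        else:
            increments.append(round(t - prev, 5))
        prev = t
    return increments
```

As noted earlier for Canada parameter code 18, this kind of conversion is risky when the accumulating total has diurnal fluctuations.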

  

Important: Datatype_name 'ratio' is reserved for rainfall adjustment and usually associated with datatype_id=31.

Datatype_id's with special associated computations: 19=wind direction, 27=ET, 29=solar/long radiation, 30=runoff, 32=radar image maximum value.

For runoff datatype_id 30 computations, daily datatype_id's 4, -4, and 5 end of day conditions will be written secretly so be aware. Runoff datatype_ID's in rday: datatype_ID 4 = Soil Storage or Initial Loss. datatype_ID -4 = Precip or zero. datatype_ID 5 = Groundwater flow.

Note:  Datatype_id's 10 & 11 rainfall and 27 ET unit values are rounded as 0.00000 and hourlies as 0.0000

USGS rounding means:

 0

 0.001 - 0.099

 0.01 - 0.99

 1.0 - 9.9

 10.0 - 99.9

 100 - 999

 1010-999990

 

 

6)      Using GetAccess.exe, open the empty database file GetAccessHDB.mdb that has all the data in each table deleted.

7)      As an example we will add the Wunderground example above for Station ID = KLAS and Parameter Code = Dew PointF.  The Dew PointF associated datatype_id = 16.  Our first site_id will be area 1, station 1 or site_id = 101.  Our unique datatype_site_id = 16101.

8)      We are ready to update the table rsite with our new station data.  Click DB Tables.

 

 

Select rsite, check the Allow Edit check box, and then click GO.  The blue * indicates the row is available for adding new data by typing in fields on that row.


 

 

 

Fill the fields as in this example, then click on the blue * and the data will be stored and a new edit line * added: (use Alt-248 on the num pad for the ° degree symbol)

 

 

Let’s add the USGS station example for flow.  Site_id will be area 1, station 2, or 102.  Datatype_id for flow = 1.  The unique datatype_site_id is then 1102.  To add this data we can type at the blue *, or we can copy a row from the ExampleRsite table using the Edit Menu button (or right-click the selected row and the Edit Menu will open).  For now, just type in the new site data:

 

 

We now have 2 sites with the needed Station ID and Parameter Code for retrieving real-time data using GetRealtime.exe.  The GetRealtime.exe station list now needs to be revised with just these two new sites before data can be retrieved and stored.  Refer to the GetRealtime section below, Adding Sites to the GetRealtime Station List.

 

Program Use

 

GetRealtime.exe

(retrieval and computations of real-time web data)

 

 

The hourly and daily data should be updated for the past 7 days using GetRealtime.exe so that the examples can be seen, once you read this section on how to do that.

 

Setting the Database Connection String:

 

If GetRealtime.exe cannot make a connection to the Access database the following message will appear.  Set the connection string to the GetAccessHDB.mdb database using the Connection button on the Select Station form (control button 6 below).

 

 

 

The connection string should look similar to this depending on the path where GetRealtime.exe was installed:

 

Driver={Microsoft Access Driver (*.mdb)};DBQ=

C:\Program Files (x86)\GetRealtime\GetAccess\GetAccessHDB.mdb;Uid=Admin;Pwd=Me;

 

Or better yet: 

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=

C:\Program Files (x86)\GetRealtime\GetAccess\GetAccessHDB.mdb;User Id=admin;Password=;

 

 

For Excel, the connection string would look like this:

Driver={Microsoft Excel Driver (*.xls)};DBQ=

C:\Program Files (x86)\GetRealtime\GetAccess\GetAccessHDB2.xls

 

Those using Excel as their database should open the GetAccessHDB2.xls file with Excel and read the 'ReadMe' worksheet to learn more about using Excel as a database.

 

 

 

Description of the controls on the GetRealtime form:

 

1)      Days—Number of days to retrieve.  1 day means yesterday and today.  The unit values are retrieved and averaged for hourly and daily values.  One daily value for yesterday will be available for storage.

 

2)      Missing Values Allowed in Day—Percentage of missing unit values allowed in a day before a daily average will not be computed.

 

3)      Bad Value Check—Percent change in the unit values from one time step to the next before the value is raised as a possible error.  If in batch mode, the value will be listed in the GetRealtime_errorlog.txt file.  If in interactive mode, a message box will appear for how to handle it.

 

4)      DSID for 1 Station—DatatypeSite_ID, typically not used but can be entered for 1 station retrieval.  Selecting just the 1 station from the station list is easier.

 

5)      Station Name or range—Displays the selected station or range of stations from the station list if only 1 station or a range of stations is wanted.

 

6)      Select Station From List button—Displays the station list if only one station is to be retrieved.  For editing the station list see GetRealtime.exe Setup Section above.

 

7)      Write Dailys—Daily average computed from the retrieved unit values will be stored in the database.

 

8)      Write Hourly Avg—Hourly average computed from the retrieved unit values will be stored in the database.  The hourly value time stamp will be for the end of the averaging period.  For instance, six 10-minute unit values with the last value reported as 10:04 am will have the hourly time stamp of 10:00 am.

 

9)      Write Daily MaxMin—Daily Max and Mins of the retrieved unit values will be stored in the database.

 

10)    Write Unit Values—Retrieved unit values will be stored in the database.

 

Time steps for the unit values retrieved vary:

 

Wunderground unit time steps can range from 1-minute and 5-minute on up to 3- or 4-hour time steps.  5 and 10-minute time steps are the most common.  Airports at Wunderground usually have 30-minute and 1-hour time steps.

 

The USGS data are typically in 15-minute time steps, but can vary during events.

 

CalDEC data time steps are 60 minutes.

 

USCS Snotel data time steps are 60 minutes.

 

USBR data can be requested as either 15-minute or 60-minute.  Most USBR meteorological data time steps are available only as 60-minute.

 

CIMIS data time steps are 60 minutes.

 

11)    Write Unit Shifts—Each shift value and the equation number used will be stored in the database for checking and debugging computation equations if any.

 

12)    Overwrite Source 5—Each daily, hourly, and unit value has a Source field value stored: 1=Wunderground, 2=USGS, 3=CalDEC, 4=USCS Snotel, 5=WRITE PROTECT, 6=USBR, 7=COE, 8=CIMIS.  Setting the Source field to 5 by editing the database will protect the value from being overwritten.

 

13)    Bad Value Check—Check for possible bad values as defined by the Bad Value Check percentage change.

 

14)    Last Run's Storage--For runoff computations, the last run's starting basin storage space and rainfall sum will be used so that running a month or more is not needed. Now you can just start today if the runoff part of the hydrograph has reached zero. For a 12-hour unit graph's recession this may take a day or two.

 

15)    Scheduled Batch—GetRealtime.exe can be run in the background at user defined intervals.  The Wunderground stations will always be retrieved at the interval set.  The other sources only update their data each hour so GetRealtime.exe will only query their stations at half hour intervals.  The first time GetRealtime.exe is run in batch mode all stations will be retrieved for the number of days set.  After the first time, only values for the current day will be retrieved.

 

GetRealtime.exe can also be run in Batch Mode from a Desktop shortcut with the shortcut properties Target set as “…\GetRealtime.exe batch”.  GetRealtime.exe can also be run from the Windows Control Panel’s Scheduled Tasks in the same manner; be sure to include “batch”.

 

16)    Start Realtime Retrieval button—Start retrieving data.  If the 1 station retrieval box is not checked, then all of the stations on the station list will be retrieved.

 

17)  Historical button—Retrieve unit values or daily values based on a user set date range instead of the number of days to retrieve.  The periods for the availability of historical unit values can vary for each source and each station.  The Historical Button can be toggled on and off by repeated pressing.

 

It's important that the MS Access HDB database be routinely 'Compacted and Repaired' to speed up downloads. The size of the MS Access database doesn't seem to matter as much as how often data is being updated. I retrieve about 100 stations every 1/2 hour with 5 minute unit values (Nexrad). I find the download time will double in about 2 to 3 days so I 'Compact and Repair' using the GetAccess.exe 'Connection' button every day or two... so be smart like me. Look at your 'GetRealtime_errlog.txt' file to see how your download times vary over time.  GetRealtime in Scheduled Batch has a checkbox for 'HDB Compact' which will compact and repair each day at midnight while running in batch mode.

 

My MS Access database is 200 MB with about one year's worth of data. Wikipedia says MS Access can handle 1 to 2 GB. We shall see. One option to reduce the database size is to save the database file every year and then delete all the unit values and keep going, so historical unit values would still be available in separate files if ever really wanted.

 


Adding Sites to the GetRealtime Station List:

 

After you have deleted the example data from all the database tables, you need to delete all the sites on the GetRealtime station list as follows:

 

1)      Make a backup copy of the file GetRealtime_setup.txt for later reference. You will need it for reference!

2)      Open GetRealtime.exe and click on the Select Station from List button:

 

 

Click the Edit List button.  Right mouse click on the left column and the edit menu will appear.  Click Delete Record and the selected row will be deleted.  Continue row by row deleting all the records.  The last row cannot be deleted; edit the last row, then right mouse click the left column and Add New Record.  Edit the fields like this and click on Save:

 

Optionally, the setup file GetRealtime_setup.txt in the GetRealtime directory can be edited using Notepad provided with Windows in Start/All Programs/Accessories.

 

 

The stations on the list are now ready to be retrieved.

 


Make sure the DSID for 1 Station check box is not checked.  Click Start Realtime Retrieval to test the changes to the station list and the Access database.

 

 

The daily average values for yesterday were successfully computed and displayed indicating our changes to the GetRealtime Station List and the Access database were successfully made. Woohoo!!!

 

If not, recheck the values in the database table Rsite and retry.  If still unsuccessful check the values on the Station List.  The datatype_site_id must match in both the rsite table and station list and the parameter_code and station_id in table Rsite must be valid for that web source.

 

Change the Days text box to 7 days and repeat the retrieval so GetGraphs can graph the new data.  The final step is to edit the GetGraphs setup to display our 2 new sites.  If you open GetGraphs now you will see only the Web pages will have data.

 

See Adding Sites to the GetGraphs Setup below.

 

 

'Tools' command on the station list window:

 

The text file GetRealtime_tools.txt can be created for a list of commands that can be shelled. For example the file GetRealtime_tools.txt might read:

 

Notepad.exe c:/vbget/getxmore/getrealwunder/GetRealtime_errlog.txt

Notepad.exe c:\vbget\getxmore\GetGraphs\GetGraphs_setup.txt

C:\Program Files (x86)\HEC\HEC-RAS\5.0.3\Ras.exe

C:\Program Files (x86)\Microsoft Office\OFFICE11\Excel c:\vbget\runoffIn.txt

 

When each is selected, the program line is shelled. Shelling a program simply means executing the command line, which probably entails opening a new window. The new window remains open until the user closes it or the command self terminates.

 

  

Computation Examples on the GetRealtime Station List:

 

With the 5 equations discussed here, along with routing and conditional IF line execution, you can imagine how just about any irrigation and reservoir rules and demand scheduling can be handled, so even Grand Coulee is on the table—whether it be a short-term operations model, a long-term model with Monte Carlos, or mix and match.

 

Note that there are TWO ways to go about computing values for storing in your HDB (hydrologic database) database:

  

1) All parameters are being retrieved at one time and the parameters will not be saved to the HDB database (but of course could be separately). The datatype_site_id would probably have the same site_id as the parameters if they were stored separately.

  

2) All parameters are already stored in the HDB database and the Station_ID would be the more general method COMPUTE (or COMPUTE-UNIT), COMPUTE-Hour, or COMPUTE-Day. This allows for computation from mixed sites and datatype_site_ids. COMPUTE-2Hour can be used for Gage/Radar ratios. Also COMPUTE-xUnit can be used for input time steps other than 5-minutes. Likewise, COMPUTE-xHour and COMPUTE-xDay.

  

Special Station_ID's for computations discussed below:

1) COMPUTE

2) ROUTE

3) SNOWMELT

4) FILE

5) RUN

6) MRUN

7) NEXRAD

8) COMPUTE-Get

9) COMPUTE-Put

10) SHELL

11) FORECAST

12) LOOKUP

13) IF

  

COMPUTE and some data_type_id's  also have special built in computations in the HDB table rsite parameter_code:

1) ETshort  (data_type_id 27)

2) ETtall   (data_type_id 27)

3) Solar  (data_type_id 29)

4) Solar long  (same datatype_id 29)

5) Ratio (data_type_id 31). 31 is not a special computation ID; it has no rounding other than what the keyword "Ratio" applies here. You can still use datatype_id 31 however you choose.

6) data_type_id 30 will automatically compute runoff from the same site_id using datatype 10 for precip, or it will use another DSID (10xxx) of your choice for rainfall if you put a DSID in the parameter_code cell.

 

Now Wizards are available to give you a better idea of how to setup your data_types and datatype_site_id's so be sure to use the setup Wizards.  If there is no Wizard for what you want to do, look at the example setup.txt list that came with the installation and the GetAccess table 'ExampleRsite'.

  

All these special cases are discussed below in separate sections.

  

The Station List GetRealtime_setup.txt has fields for adding up to 5 sets of formula expressions, shifts, and bases.  The base of the equation is a value or expression that is evaluated to determine which formula expression will be used.  The shift is available to make changing the equation simpler by including the variable Shift in the equation.

 

The P1, P2, P3… variables represent the retrieved parameters in the order they were entered in the GetAccess database table Rsite Parameter Code field.

 

The variable D represents the DateTime value retrieved with the parameters.

 

The function Julian(D) returns the julian integer day of the year.

 

The variable N returns the number of parameters with data (missing parameter values are set to zero if N is included in your formula) for that time step.  The variable N is useful for logging database stats or more importantly for computing averages when some gages are missing.

 

Example of parameter averages with and without missing values:

(P1+P2+P3)/3 returns average or missing if any P is missing.

(P1+P2+P3)/N returns average of non-missing P's.
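The difference between the two averaging formulas can be sketched like this (a hypothetical illustration only; missing gages are represented here as None):

```python
def average_all(values):
    """(P1+P2+P3)/3 -- result is missing if any gage is missing."""
    if any(v is None for v in values):
        return None
    return sum(values) / len(values)

def average_nonmissing(values):
    """(P1+P2+P3)/N -- missing values are treated as zero and N
    counts only the gages that reported, per the N variable above."""
    reported = [v for v in values if v is not None]
    if not reported:
        return None
    return sum(reported) / len(reported)
```

With P2 missing, average_all([0.10, None, 0.30]) comes back missing, while average_nonmissing([0.10, None, 0.30]) returns 0.20.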

 

To fill the last computed value for the future forecast days add 'FILL' to the 'shift1' column like: COMPUTE-Hour; -32307; Fraction; AAC Drop 2 % Ratio Max Flow; 0; FILL; P1/6400

 

Your formulas (P1^2+P2/P3) are evaluated using VBScript.

VBscript math functions:   http://msdn.microsoft.com/en-us/library/72ca82az.aspx

VBscript other functions:   http://msdn.microsoft.com/en-us/library/3ca8tfek.aspx

 

To replace a runoff forecast with USGS gage history one could use D and vbscript Now() like this:

COMPUTE; -1299; RUNOFF; Replace Runoff With USGS History; D=Now(); 0; P1; D>=Now(); 0; P2

 

Where P1 is runoff and P2 is USGS as the table Rsite parameter_codes.

 

 

Example of converting Gage Height to Elevation:

05054000; -3710; Elevation; Red River Of The North at Fargo; 0; 0; P1+861.8

 

Notice the Datatype_Site_ID -3710 is a negative integer to keep calculated values separate from reported values and to point out it was computed.

 

In the above example the Base1=0 will convert any Gage Height below 0 to 0.

 

The Shift1=0 is not used.

 

The Formula1 P1+861.8 adds 861.8 to the retrieved Gage Height and is stored in the database.  The Gage Height retrieved is discarded.

 

Example of converting Temperature C to Temperature F:

09380000; -7510; Temperature; Colorado R at Lees Ferry; 0; 0; 32+P1*9/5

 

Base1=0 will convert any retrieved Temp C below 0 to 0 F.  Probably not a good example but ok for this station's water temperature.

 

Shift1 not used.

 

Formula1 32+P1*9/5 converts the retrieved Temp C to  Temp F.

 

Example of converting Temperature C to Temperature F with Shift:

05054000; -7710; Temperature; Red River Of The North at Fargo; -20; 0.5; 32+P1*9/5+shift

Base1=-20 will convert any retrieved Temp C below -20 to 0 F.

 

Shift1=0.5

 

Formula1 32+P1*9/5+shift converts the retrieved Temp C to  Temp F and increases the value by 0.5 degrees F.

Example of converting Temperature & Dissolved Oxygen to Percent Saturation:

14211010; -13910; Dissolved Oxygen; Clackamas River near Oregon City, Or; 0; 1.015; 100*P2/(shift*(14.55-0.3940*P1+0.00718*P1^2-0.0000611*P1^3))

 

Base1=0 will convert any P1=TempC retrieved value below 0C  to 0% saturation.

 

Shift=1.015 is the adjustment for station elevation and is computed as Shift= 0.0000364*ELV+0.000000000563*ELV^2

 

Formula1=100*P2/(shift*(14.55-0.3940*P1+0.00718*P1^2-0.0000611*P1^3))

 

Where P1 and P2 represent the Parameter Codes in order in the database Rsite table of:

10,300 where P1=10= water temperature °C and P2=300=D.O. mg/l.

 

The dissolved oxygen saturation formula example above is a curve fit to USGS Weiss table at 760 mmHg or sea level.
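As a sketch of the percent-saturation computation (the coefficients are taken straight from the example setup line above; the default shift of 1.015 is the elevation adjustment for this station):

```python
def do_percent_saturation(temp_c, do_mg_l, shift=1.015):
    """100*P2/(shift*(14.55 - 0.394*P1 + 0.00718*P1^2 - 0.0000611*P1^3))
    where P1 = water temperature (C) and P2 = dissolved oxygen (mg/l)."""
    saturation_mg_l = shift * (14.55 - 0.3940 * temp_c
                               + 0.00718 * temp_c ** 2
                               - 0.0000611 * temp_c ** 3)
    return 100.0 * do_mg_l / saturation_mg_l
```

At 20 C the Weiss-table saturation is roughly 9 mg/l, so do_percent_saturation(20.0, 9.0) comes out near 100 percent.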

 

Example of converting Gage Height to Flow:

(note: you could also use 'Lookup' for ratings.)

 

09419800; -1013; Flow; Las Vegas Wash above Lake Mead;

3.9; .05; 401.8*(P1+shift-3.9)^1.502; D<cdate("2008-11-26"); int(100*(.05+(.10-.05)*(cdate("2008-11-26")-int(D))/(cdate("2008-11-26")-cdate("2008-11-01")))+.5)/100; 401.8*(P1+shift-3.9)^1.502; 4.7; 0.10-0.10*(P1-4.7)/(4.8-4.7); 401.8*(P1+shift-3.9)^1.502; 4.8; +0.00; 602.2*(P1+shift-3.9)^1.502

 

This conversion uses 4 sets of bases, shifts, and formulas.  Beginning with set 4:

 

Set 4=  4.8; +0.00; 602.2*(P1+shift-3.9)^1.502

 

Base4=4.8, should the Gage Height P1 be greater than 4.8 ft the associated shift and formula will be used.

 

Set 3=  4.7; 0.10-0.10*(P1-4.7)/(4.8-4.7); 401.8*(P1+shift-3.9)^1.502;

 

Base3=4.7 feet.  Should P1 gage height be >=4.7 ft and <4.8 ft the associated shift and formula will be used.

 

Shift3= 0.10-0.10*(P1-4.7)/(4.8-4.7) is shifting with stage where the shift will evaluate to 0.10 ft at P1=4.7 ft to 0 ft at P1=4.8 ft.

 

Formula3=401.8*(P1+shift-3.9)^1.502

 


Set 2= D<cdate("2008-11-26"); int(100*(.05+(.10-.05)*(cdate("2008-11-26")-int(D))/(cdate("2008-11-26")-cdate("2008-11-01")))+.5)/100; 401.8*(P1+shift-3.9)^1.502;

 

Base2= D<cdate("2008-11-26").  Should P1 gage height be <4.7 ft, then Base2 is evaluated.  If the retrieved DateTime < 2008-11-26 00:00 AM then the associated shift and formula will be used.

 

Shift2=  int(100*(.05+(.10-.05)*(cdate("2008-11-26")-int(D))/(cdate("2008-11-26")-cdate("2008-11-01")))+.5)/100 is shifting with time, where the shift will evaluate to 0.10 ft on D=Nov 1, decreasing to 0.05 ft on D=Nov 26.

 

Formula2=401.8*(P1+shift-3.9)^1.502

 

 

Set1=  3.9; .05; 401.8*(P1+shift-3.9)^1.502;

 

Base1=3.9: should the Gage Height P1 be less than 4.7 ft and D greater than Nov 26, the associated shift and formula will be used.  If the Gage Height P1 is less than 3.9 ft then the computed flow will be 0 cfs.

 

Shift1= .05

 

Formula1= 401.8*(P1+shift-3.9)^1.502
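An illustrative sketch of the base/shift/formula mechanism for this rating (the highest-set-first evaluation order is my assumption, and the time-dependent Set 2 is omitted to keep the sketch short):

```python
def rate_flow(gh):
    """Convert gage height (ft) to flow (cfs) per the example rating."""
    if gh >= 4.8:                       # Set 4: upper rating curve
        shift = 0.00
        return 602.2 * (gh + shift - 3.9) ** 1.502
    if gh >= 4.7:                       # Set 3: shift tapers 0.10 -> 0
        shift = 0.10 - 0.10 * (gh - 4.7) / (4.8 - 4.7)
        return 401.8 * (gh + shift - 3.9) ** 1.502
    if gh >= 3.9:                       # Set 1: constant 0.05 ft shift
        shift = 0.05
        return 401.8 * (gh + shift - 3.9) ** 1.502
    return 0.0                          # below Base1: zero flow
```

So a gage height below 3.9 ft rates as zero flow, and heights above 4.8 ft switch over to the 602.2 coefficient.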

 

 

Note:  Turn ON the “Write Unit Shifts” on the GetRealtime retrieval to see what shifts are being computed and what formula is being used in the computation.

 

 

Example of converting Gate Opening to Flow:

 

09429000; -1310; Flow; Palo Verde Canal near Blythe, Ca; 81.0; 0.00; 3.17*70*(P1+shift-40.15)^1.5; P2>81; -0.00; 3.17*70*(P1+shift-P2)^1.5; P1>(P3+77); 0; 0.6214*70*P3*(64.32*(P1+shift-P2))^.51

 

This computation uses 3 sets of bases, shifts, and formulas depending on the flow type: submerged orifice flow, submerged weir, or free-flowing weir.  Beginning with set 3:

 

Set 3= P1>(P3+77); 0; 0.6214*70*P3*(64.32*(P1+shift-P2))^.51

 

Base3= P1>(P3+77) where P1=upstream GH, P3=gate opening.  If Base3 evaluates true, then submerged orifice flow will be used.

 

Shift3=0

Formula3=0.6214*70*P3*(64.32*(P1+shift-P2))^.51 submerged flow equation.

Set2= P2>81; -0.00; 3.17*70*(P1+shift-P2)^1.5;

 

Base2= P2>81 where P2=downstream GH and 81 is the elev of the bottom of the gate for submerged weir flow.

 

Shift2=-0.00.

 

Formula2=3.17*70*(P1+shift-P2)^1.5 where P1= upstream GH and P2 = downstream GH.

 

Set1= 81.0; 0.00; 3.17*70*(P1+shift-40.15)^1.5

 

Base1=81.0 or if P1>81.0 then use free weir flow formula1 else zero flow.

 

Shift1= 0.00

 

Formula1= 3.17*70*(P1+shift-40.15)^1.5 where P1=upstream GH.

 

 

Example of diurnal Side Inflow from a Sewer Plant:

  

Compute-unit; -24332; Inflow; Bham Sewer Plant; 0; 0; 30-27.11659*Hour(D)/24+78.76657*(Hour(D)/24)^2-51.46933*(Hour(D)/24)^3

OR

Compute-unit; -24332; Inflow; Bham Sewer Plant; 0; 0; 30-27.11659*(D-int(D))+78.76657*(D-int(D))^2-51.46933*(D-int(D))^3

 

Here you assume a constant 30 cfs daily average and then vary it a few cfs over the day. The rsite table needs any unit-value DSID put in 'parameter_code'; it is not used... but you could use it and change the constant 30 cfs to who knows what.
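A quick check (a sketch, not part of GetRealtime) that the diurnal cubic above does average out near the assumed 30 cfs daily mean while swinging a few cfs over the day:

```python
def diurnal_inflow(frac_of_day):
    """Evaluate the cubic at a fraction of the day (Hour(D)/24)."""
    t = frac_of_day
    return 30 - 27.11659 * t + 78.76657 * t ** 2 - 51.46933 * t ** 3

# Average over 5-minute steps across one day:
values = [diurnal_inflow(i / 288) for i in range(288)]
mean = sum(values) / len(values)
print(round(mean, 2))  # close to 30
```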

 

 

Example of a pumped waste ON/OFF for high flow dilution:

 

COMPUTE-unit; 24326; Inflow; Acme Pump Waste; 0; 0; 0; P1>50; 0; 3.5; D>cdate("2016-03-19 13:00"); 0; 0

  

The datatype_ID = 24 for inflow and 3 sets of formulas are used. If P1 flow is greater than 50 cfs then turn on the 3.5 cfs pump. If the date is after March 19 at 13:00 then turn off the pump at any flow.

 

 

Soil Flood Potential Computation:

  

Because floods can occur with comparatively little rainfall on wet soils, I started wondering about CN ranges and what fraction of rainfall would run off as the CN went up and down...

From the SCS CN method: Q=(P-0.2S)^2/(P+0.8S)

 

For a GetRealtime setup:

COMPUTE-day; -6303; Soil Runoff Fraction; Farley Soil Flood Potential; 0; 0; 0; P2>0.2*P1; 0; 100*(P2-0.2*P1)^2/(P2+0.8*P1)/P2

 

P1 is S as DSID 4xxx and P2 is P as DSID -4xxx and are automatically saved at the end of each day for each runoff comp. You could use COMPUTE-UNIT if you have set 'Write unit CN P,S' on your basin runoff setup for unit values of P and S to be written (also rain/melt Excess in/hr as 3xxx is written).

 

Table Rsite would be:

303; -6303; Farley Gage; 4; precip fraction; %; percent; 4303,-4303; COMPUTE; 3; AL; 0;

 

The idea is to be on your toes with wet soils and just what are wet soils. The forecast rain will be wrong too so don't rely on what the runoff says when your soils are wet.
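The flood-potential formula above can be sketched in a few lines (an illustration of the same SCS CN arithmetic, not GetRealtime's internal code):

```python
def runoff_fraction_percent(S, P):
    """Percent of rainfall P that runs off, given soil storage S.
    P1 = S (soil storage, inches) and P2 = P (rainfall, inches),
    per the COMPUTE-day setup line above."""
    if P <= 0.2 * S:        # below the initial abstraction: no runoff
        return 0.0
    Q = (P - 0.2 * S) ** 2 / (P + 0.8 * S)   # SCS CN runoff depth
    return 100.0 * Q / P
```

Wet soil (small S) turns most of an inch of rain into runoff, while dry soil (large S) turns almost none of it: runoff_fraction_percent(0.5, 1.0) is roughly 58 percent, but runoff_fraction_percent(5.0, 1.0) is zero.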


Example of Computing Evapotranspiration Reference ET without Solar:

 

Computation of reference ET is done internally by GetRealtime using methods recommended by the ASCE standard Penman Monteith at an hourly time step or your unit time-step.  Both the tall (alfalfa) and short (grass) computations can be made.  If you are interested in the ASCE methods, the following url will get you started:

  

The ET coefficients can now be checked and entered by using the 'Wizards' button on the Select Station List. Just click on your station, if present, and then click 'Wizards'. You will see a complete breakout of all your coefficients for editing.

 

http://www.kimberly.uidaho.edu/water/asceewri/ASCE_Standardized_Ref_ET_Eqn_Phoenix2000.pdf

 

The solar radiation used in the PM equations is computed internally using regressions of hourly Solar Radiation versus Temperature, Humidity, Zenith Angle, and Elevation in different regions of California. The hourly regressions for a year have an R^2 of 0.92 and a standard error of 160 Langleys/day for the hourly time step.  At the time I didn't know ASCE had a standard method for missing solar too, oh well, they're regressions. Update 5/1/2014--The solar computation has been refined for humidity above 80% and I like it much better.  

 

MOVRC1; -27407; ET Grass; Bishop, Ca Wunderground; 0; 37.384,-118.422,-120,4183,0.75; P4 this P4 is no longer used after Feb 8, 2022

 

Base1=0 is not used.

 

Shift1= 37.384,-118.422,-120,4183,0.75 is made up of 5 parameters separated with a comma that describe the site and are:

1)      Site Latitude=37.384

2)      Site Longitude=-118.422 and is negative for longitude west.

3)      Standard Time Meridian=-120 and is negative for longitude west.  Pacific Time=-120, Mountain Time=-105, Central Time=-90, Eastern Time=-75.

4)      Site Elevation in feet= 4183 feet.

5)      Windspeed adjustment=0.75.  Windspeed used in the PM computations is to be at the 2 meter height.  Standard weather stations make their windspeed measurements at the 10 meter height, so the adjustment from 10 to 2 meters is 0.75.  Wunderground weather stations at airports and Madis stations should use the 0.75 adjustment.  An airport has a Station_ID beginning with K and is 4 letters long.  Madis stations begin with the letter M….  All other stations should use a windspeed adjustment of 1.  The difference between using 1 or 0.75 will normally result in about a 5% difference in ET.

 

Formula1= P4.  The Parameter_Codes in the database table Rsite are entered in this order (use P1 if a formula is wanted, like an adjustment P1*1.2):

TemperatureF,Humidity,WindSpeedMPH, Clouds, ETshort

 

where 'clouds' is optional and is dsid= 26xxx from NWS 7-day forecast cloud-amount.

 

ETshort is used to compute a grass ETo, ETtall is used to compute an alfalfa ETo.

 

 

Example of Computing Evapotranspiration Reference ET with Solar Radiation:

 

If the weather station reports reliable solar radiation then the reference ET computation can be improved on low solar radiation days by using the reported solar radiation as follows:

 

MOVRC1; 27407; ET Grass; Bishop, Ca Wunderground; 0; 37.384,-118.422,-120,4183,0.75; P5 this P5 is no longer used after Feb 8, 2022

 

…note the positive DSID of 27407 to distinguish it from the computation at this site without solar radiation above.

 

The Shift1 and Base1 are the same as in the without-solar-radiation example above.

 

Formula1= P5 (different from P4).  Use P1 if a formula is wanted, like an adjustment P1*1.2.

 

 TemperatureF,Humidity,WindSpeedMPH,SolarRadiationWatts/m^2,ETshort

 

DSID's would be 17xxx,18xxx,28xxx,26xxx or 29xxx, ETshort

where 26xxx cloud cover will compute solar better than leaving it blank.

 

If reported solar is missing or for a forecast period then solar is computed for the missing values. 

 

 

Example of Computing Solar Radiation (for Longwave Radiation see SnowMelt below):

 

 Solar radiation is computed internally using regressions of hourly Solar Radiation versus Temperature, Humidity, Zenith Angle, and Elevation in different regions of California.  The hourly regressions for a year have an R^2 of 0.92 and a standard error of 160 Langleys/day for the hourly time step.

 

The Solar coefficients can now be checked and entered by using the 'Wizards' button on the Select Station List. Just click on your station, if present, and then click 'Wizards'. You will see a complete breakout of all your coefficients for editing.

 

KCAHELEN4; -29408; Solar Radiation; Helendale, Ca Wunderground; 0; 34.775,-117.330,-120,2470;  P3 this P3 is no longer used after Feb 8, 2022

 

Base1=0 is not used.

 

Shift1=34.775,-117.330,-120,2470 is made up of 4 parameters separated with a comma that describe the site and are:

1)      Site Latitude=34.775.

2)      Site Longitude=-117.330 and is negative for longitude west.

3)      Standard Time Meridian=-120 and is negative for longitude west.  Pacific Time=-120, Mountain Time=-105, Central Time=-90, Eastern Time=-75.

4)      Site Elevation in feet= 2470 feet.

 

Formula1= P3 is no longer used.  Use P1 if a formula is wanted, like an adjustment P1*1.2.

 

 

The Parameter_Codes in the database table Rsite are entered in this order:

TemperatureF,Humidity,Solar

 

DSID's would be 17xxx,18xxx,26xxx,Solar

where 26xxx cloud cover is optional but will compute solar better than without it.

 

Also, for HDB computations for Solar, Long Radiation, and Evapotranspiration that require special internal GetRealtime subroutines, here is an example GetRealtime_setup.txt for them:

 

 

  

Look at the GetAccess table 'exampleRsite' for examples of setups for these computations from the database.  Note the formula Px is not needed unless you were transforming the output with P1.

 

Recommended Solar, Long Radiation, and Evapotranspiration rsite parameter_codes:

Solar>> 17xxx,18xxx,26xxx,Solar

Long>> 17xxx,18xxx,28xxx,26xxx,Solar long

ET>>    17xxx,18xxx,28xxx,26xxx or 29xxx, ET short or ET tall

 

The dsid 26xxx is cloud cover and comes from the NWS 7-day forecast of cloud-amount. No history is available, so for history the old forecasted values are used.  26xxx and 29xxx can be omitted but that is not recommended; use either 26 or 29 but not both.  Cloud-amount is not a valid parameter if you use a solar input and vice versa.

  


Computation of Wind Direction:

 

Wind direction is retrieved as either text directions such as NNW or in degrees such as 270.  Both are converted internally to a value from 0 to 4 in steps of 0.25, where 0 and 4 both equal direction North, 1=East, 2=South, 3=West, 0.25=NNE, 0.5=NE, 0.75=ENE, etc.

 

Hourly averages and daily averages are computed from the retrieved vectors and weighted with the Wind Speed if available as:

 

Avg Wind Direction = ATAN2(sum cos*speed, sum sin*speed)
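A sketch of the speed-weighted vector average on the 0-4 scale described above (an illustration with assumed details, not GetRealtime's internal code; note the manual's ATAN2(x, y) follows the Excel argument order, while Python's math.atan2 takes (y, x)):

```python
import math

def average_direction(dirs_0_to_4, speeds):
    """Speed-weighted vector average of wind directions on the 0-4
    scale (0/4 = N, 1 = E, 2 = S, 3 = W), returned on the same scale."""
    sum_cos = sum(math.cos(d * math.pi / 2) * s
                  for d, s in zip(dirs_0_to_4, speeds))
    sum_sin = sum(math.sin(d * math.pi / 2) * s
                  for d, s in zip(dirs_0_to_4, speeds))
    angle = math.atan2(sum_sin, sum_cos)      # radians
    return (angle * 2 / math.pi) % 4          # map back to 0-4
```

For example, NNE (0.25) and ENE (0.75) at equal speeds average to NE (0.5), and the vector math handles the wrap-around at North that a plain arithmetic average of 3.75 and 0.25 would get wrong.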

 

Computation from Database Values (version 2.2.2):

 

Update 8/20/2012: GetRealtime can get data from other databases in real-time along with your other real-time data. This is done with the station_ID 'COMPUTE-GET' . A file name is entered in the GetAccess parameter code that contains the other database connection string and SQL statement.

 

GetRealtime_setup.txt line:

COMPUTE-Get; 10035; Rainfall; Las Vegas Rainfall from other database; 0

 

GetAccess 'rsite' table line:

 35; 10035; Las Vegas Creek above Nowhere, NV; 10; rainfall; inches; inches; GetHourRainVegasCreek.txt; COMPUTE-GET; 1; NV; 0

 

A filename is entered in the 'rsite' parameter_code that contains the other database connection string and SQL statement. The 'COMPUTE-GET' control file GetHourRainVegasCreek.txt would look like this:

 

 ---------GetHourRainVegasCreek.txt--------------

 

Las Vegas Creek rainfall from my other database

Put table: rhour

Get database_connection: Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\VBGET\GetRealtime\GetAccessHDB2.mdb;User Id=admin;Password=;

Get sql: SELECT date_time, value FROM rhour WHERE datatype_site_id=1024 AND date_time BETWEEN #DAY1# AND #DAY2# ORDER BY date_time;

END

 

Notes (in the control file):

put table is the GetRealtime GetAccess table rday, rhour, or runit.

database_connection must be an ODBC database like MS Access or Excel.

Your sql string must return the date_time and value in that order. DAY1 and DAY2 are placeholders that are replaced when run.

You can retrieve and compute new values at the same time, but I wouldn't because you may want to use GetAccess and GetGraphs for display.
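The placeholder substitution presumably amounts to something like this (a hypothetical sketch only; the exact date literal format GetRealtime fills in is an assumption here, based on MS Access SQL wrapping dates in # marks):

```python
# The control file's SQL with the #DAY1#/#DAY2# placeholders:
sql_template = ("SELECT date_time, value FROM rhour "
                "WHERE datatype_site_id=1024 "
                "AND date_time BETWEEN #DAY1# AND #DAY2# "
                "ORDER BY date_time;")

def fill_window(template, day1, day2):
    """Substitute the retrieval window dates into the placeholders,
    leaving the surrounding # marks as Access-style date literals."""
    return template.replace("DAY1", day1).replace("DAY2", day2)

print(fill_window(sql_template, "2024-04-01", "2024-04-08"))
```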

 

------------------------------------------------

   

Computations from database values retrieved and stored can be performed in real-time by using the Station ID = COMPUTE-Unit, COMPUTE-Hour, or COMPUTE-Day. For example, to average 3 Wunderground rainfall stations the GetRealtime_setup.txt line could read:

 

COMPUTE-Hour; 10035; Rainfall; Average Las Vegas Rainfall; 0; 0; (P1+P2+P3)/3

Remember to place the COMPUTE setup line BELOW the stations you will be computing values from.

 

Hour was used here because Wunderground stations rarely have the same time steps in the unit values table. The GetAccess Rsite table could look like this where the Station_ID = COMPUTE and the Parameter_Code would have the Datatype_Site_Id's to be the P1, P2, and P3 values in your computation, 10030,10031,10032.

 

If you would like to Lag a parameter so many minutes then the parameter in the table rsite could be 4044Lag60. For example computation of inflow-outflow with storage could be 1043,1044,4044,4044LAG60. And the formula in the station list could be P1-P2-(P3-P4)*12.  Update Feb 28, 2021:  You could use Lag to convert total rainfall to increments but now to convert total rainfall datatype_id=9 to incremental rainfall use datatype_site_id=10xxx with datatype_id=9.
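To illustrate the Lag mechanism (a hypothetical sketch, not GetRealtime's internal code): P4 = 4044LAG60 is just the storage series shifted back 60 minutes, so P3 - P4 is the storage change over the past hour. With storage in acre-ft, multiplying by roughly 12 converts acre-ft/hour to cfs (1 acre-ft/hour is about 12.1 cfs), which is presumably where the *12 in the formula comes from.

```python
def lagged(series, lag_steps):
    """Shift a list of values back by lag_steps time steps; the first
    lag_steps values have no lagged counterpart and come back as None."""
    return [None] * lag_steps + series[:-lag_steps]

storage = [100.0, 100.5, 101.2, 101.8, 102.1]   # hourly storage, acre-ft
lag1 = lagged(storage, 1)                       # the 60-minute lag
change_cfs = [None if p is None else (s - p) * 12
              for s, p in zip(storage, lag1)]   # the (P3-P4)*12 term
```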

   

Likewise, to compute runoff from rainfall use Station_ID= COMPUTE-Unit like this:

COMPUTE-Unit; 30755; Runoff; Prairie Creek, Dallas, Tx; 0; 2,2.5,0.15,0.2,9.03; P1

The database table RSITE would have the Parameter_Code= 10753 and Station_Id= COMPUTE.

 

GetRealtime 2.4.4 allows multiple computations and storage on the same DSID by appending 'Run-1-1-' to the front of the Station_ID, i.e. 'Run-2-5-Compute-unit' would be stored in the 'm' tables under run_ID=2 and trace=5. See Multiple Scenarios at the bottom of this page.

 

NEW!!! Example of Rainfall-Runoff Computations (SEE END OF THIS PAGE):
 


Program Use

 

GetGraphs.exe

(displaying the real-time data and web screens)

 

 

Setting the Database Connection String:

 

If GetGraphs.exe cannot make a connection to the Access database the following message will appear:

 

 

Notepad will automatically be loaded with the setup file GetGraphs_setup.txt.

Update Mar 14, 2014--For easier page insertions GetGraphs.exe 2.2.4 has removed the leading page numbers in the GetGraphs_setup.txt file as shown below:

 

The Database Connection String should look similar to this depending on the path where GetAccessHDB.mdb file was installed:

 

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=

C:\Program Files (x86)\GetRealtime\GetAccess\GetAccessHDB.mdb;User Id=admin;Password=;

 

For Excel, the connection string would look like this:

Driver={Microsoft Excel Driver (*.xls)};DBQ=

C:\Program Files (x86)\GetRealtime\GetAccess\GetAccessHDB2.xls

 

Those using Excel as their database should open the GetAccessHDB2.xls file with Excel and read the 'ReadMe' worksheet to learn more about using Excel as a database.

 

 

After connecting and loading the real-time data and web screens, then left mouse click on any of the graphs to page through the 15 pages in the provided setup.  Right mouse click on a graph or web screen to bring up a menu for changing the appearance and setting properties of the graphs.

 

Most of the menu choices will be self evident. Watch Levels and Setup are described below.

 

The Watch Level example was used to set the Humidity, % Saturation levels; should they be exceeded in the past 24 hours, the graph title will start blinking.

 

The setup menu is used to add and delete pages, graphs, and web screens.

The Datatype_Site_ID for a graph can be scrolled through for any of the data in the GetAccess database.  The Datatype_Site_ID should be left blank when adding a web screen.  A DSID of -1 will be added automatically if blank, and a web screen address is assumed for the Station Name.

 

Changes made to the setup will not be made permanent until Save all to GetGraphs_setup.txt is selected or prompted when quitting.  To quit, simply close the current page screen.

 

The 'Setup File' button will open the GetGraphs_setup.txt file for direct editing. The 'Access' button will open GetAccess.exe for data edits when needed.  With GetGraphs version 4.1 notes or winter setup lines can be kept below the END statement.

 


Adding Sites to the GetGraphs Setup:

 

If you have deleted and added the two new data sites to the GetAccess database and GetRealtime Station List, then you are ready to delete all the example graphs and web screens in the GetGraphs setup file GetGraphs_setup.txt.  Make a copy of the GetGraphs_setup.txt file for reference if needed later.

 

Use the right mouse click on a screen to bring up a menu and select Setup, then select Delete Current Page and click Ok.

 

 

Continue deleting all pages.

 

Then use the Setup menu Add to Current Page to add the two new sites that were added to the GetAccess database and GetRealtime Station List examples above.

 

Use the Setup menu Save all to GetGraphs_setup.txt to save the setup for the 2 new sites.

 

Now that you are familiar with the steps involved in adding new sites to the 3 programs GetAccess, GetRealtime, and GetGraphs, you may repeat the process with your sites of interest.

 

 

Update 3/8/2012 GetGraphs 2.2.1 can now add a second data series to a graph.

Add your second series dsid like this to your GetGraphs_setup.txt line:

  

   4 ;-30311,-1313 ;Runoff;Diamond Cr Can, AZ.....etc.

  

Note that the two dsid's are comma separated, not semicolon. For this example the 2nd series is -1313 and will plot in violet. You have to use Notepad to add it because GetGraphs cannot.  You can now add a 3rd series dsid but then you cannot also use RUN below.

 

 

Add M-table Trace: And remember you can use Run-FORECAST-NWS to save forecasts to the 'M' tables for post event analysis of where you went wrong and use GetGraph's menu 'Add Trace' to compare. To view the previous RUN on GetGraphs startup automatically, add ',RUN' to your GetGraphs_setup.txt line for that DSID:

 

 4 ;-11301 ,-10301 , Run ;Rainfall; Sub1 Adj & UnAdj & Last Forecast; 0 ; .....

 

This means plot the last run for yesterday from Mtable values for dsid= -11301.

 4 ;-11301 ,-10301 , Run-3 ;Rainfall; Sub1 Adj & UnAdj & Last Forecast; 0 ; .....

 

This means plot the trace 3 runs ago from Mtable values for dsid= -11301.

 

 

Update 6/22/2015: Use UNITS_OFF on the GetGraphs_setup.txt page line to turn on plotting of HOURLY values on that page. This only matters if "Plot Unit Values= True". This is handy for plotting hourly rainfall and unit value flows. The %MAE's will then be computable with erratic rain gage time steps.

 

And if you don't want to FTP a particular page use FTP_OFF on the GetGraphs_setup.txt page line.

 

Update 7/4/2015: To display flooding color bands you can use a '4' in the frequency GetGraphs_setup.txt column followed by minor, moderate, and major flood stages:

 

1 ;-2329,2609 ;Stage;Village Cr at 24th St; 0 ; 2 ; 0 ; 9352151 ; 15724527 ; 65280 ; 255 ;12.01;;0; 0 ; 4 ; 8 ; 10 ; 12

 

In this case I have included the high 'Watch Level' of 12.01', set the frequency column to 4 with 8', 10', and 12' as minor, moderate, and major flood stages. If the Watch Level had been set to 7 below the Minor Level then an 'Action' yellow color band would be displayed below the Minor Level.


Working with Web Screens:

 

Web Screens are added using the Setup Menu shown above.  The Datatype_Site_ID and Parameter Name should be left blank and the Web Screen URL address entered in the Station Name text box. 

 

Web Screen Example:

http://waterdata.usgs.gov/nwis/rt

 

Enter the above example URL into your web browser and this screen will appear:

 

The above URL or the same URL taken from the web browser address box could be used in the GetGraphs Setup Menu Station Name text box.

 

If just the US map picture is wanted, then the scroll bars in the GetGraphs display can be used to center the US map…. Or a better method is to get the URL of just the US Map gif by right mouse clicking the US map in your web browser.  A menu will appear and the Properties can be selected to display the URL Address of just the US map gif:

 

The gif URL is: http://waterdata.usgs.gov/nwisweb/icons/waterwatch/images/real/us/real.gif

 

 

Web Screens can also be files on disk such as pictures or text where the URL is the filename such as C:\mydata\myfile.jpg or C:\mydata\myfile.txt.  Update 12/30/2014: You can use the setup checkbox 'Resize' to fit images to the size of each webbrowser.

 

When working with web screens it is best NOT to turn on the Setup check box "Allow navigation for this site" unless needed. If it is on, you will need to use the right mouse click menu to turn the page of your web screen. You may also note that some web content reacts differently to mouse clicks. When using the right mouse click, be sure to note whether the menu appears; it is sometimes hard to notice in full screen mode.

 

If you want your web screens to be refreshed at regular intervals, like every 5 or 10 minutes, you must have GetRealtime.exe running as a scheduled task downloading at least 1 data site, such as a Wunderground temperature. You do not need to include the real-time data graph. Also, Auto Paging must be turned on.

 

Update 5/6/09—GetRealtime.exe version 1.0.3 has been updated to compute real-time rainfall-runoff from Wunderground or the other supported real-time rainfall sources.  Below is an example of editing the HDB database table Rsite using GetAccess.exe.  The datatype_id for runoff is 30.

 

Wunderground rainfall gages may have erratic time steps (especially airports) and so caution should be used for time-steps less than 1 hour.

 

Quick Overview:  Here are the GetRealtime setup steps for one NEXRAD rainfall boundary (More info on GetNexradHelp):

1) Use GetMapArea.exe to digitize your BoundaryFile as N0Q image pixels (or use LatLongPixelsFromFile if you already have a Lat/Long boundary file).  Think about always digitizing in lat/long instead of pixels and using LatLongPixelsFromFile to generate the pixel file; lat/longs can provide boundary KML visualization on Google Earth.  Alternatively, you might try GoogleKML2Text.exe to read Google Earth Pro created polygon KML files and convert them to text boundary lat-longs for use by LatLongPixelsFromFile.exe, but you can still use GetMapArea to load Google Earth with the EPA Waters KMZ units and catchments and use Google Earth's 'Add', 'Polygon' and save as KML.  With Google Earth Pro, if you click on the EPA Waters boundaries, you can save them to KML without actually digitizing anything... whoo hoo!

NexradBoundaryESX10001Q.txt  (pixel's x,y)

2) Use GetNexrad.exe to convert the BoundaryFile pixels to a PointFile. Inspect the Point file (fill in the missing 1's if needed) and copy both files to your GetRealtime.exe directory.

NEXRADPOINTESX10001Q.txt   (0's and 1's)

3) Add your basin's line to the GetRealtime_setup.txt using GetRealtime.exe.

NEXRAD-ESX; 10001; Rainfall; Village Creek above Nowhere, NV; 0

4) Add your basin's line to the GetAccess table 'rsite' using GetAccess.exe.

1; 10001; Village Creek above Nowhere, NV; 10; rainfall; inches; inches; N0Q-Ridge2; NEXRAD-ESX; 1; NV; 0

5) Run GetRealtime.exe and download the past 2 hours of N0Q radar images for the ESX radar.

Voila! You will see the 2 hours of 5-minute rainfall in your GetAccess database.  The Boundary Setup is described in more detail below. For areas of sparse radar coverage, blockage, or winter overshoot you can try Iowa Mesonet's composite N0Q of all nearby radars, which I term NCQ. You will have to convert your N0Q boundaries to NCQ using LatLongPixelsFromFile.exe or start from scratch. NCQ tends to return much higher reflectivity values than N0Q. Just use NCQ instead of N0Q in the examples above.

 

THE COOKBOOK:  Concise Steps to a GetRealtime Radar Rainfall Runoff Setup.

 

GetRealtime.exe uses the SCS Triangular Unit Hydrograph, Linear Reservoir Runoff Hydrograph, Clark synthetic, or Dimensionless Unit Hydrographs from USBR Design of Small Dams. Rainfall loss methods are initial loss (or SCS Curve Number CN), constant loss, interception, and percent of basin impervious with fraction connected. With the Triangular Unit Hydrograph you can custom fit the peak, interflow, and recession by adding 2 additional triangular unit graphs, just by setting what fraction of the runoff belongs in each and what recession ratio of time to peak each uses.  The GetRealtime.exe setup file is edited using GetRealtime.exe as shown below.

  

Update 9/17/2012--The runoff coefficients can now be checked and entered by using the 'Wizards' button on the Select Station List. Just click on your station, if present, and then click 'Wizards'. You will see a complete breakout of all your coefficients for editing.

 

Base1=0 optional base flow to add (new 6/22/2012)

 

Shift 1 contains 5 parameters separated by commas, in this case 1, 0.25, 0.13, 5, 99.3.

These values are as follows:

1.0 is the basin Lag time in hours.

0.25 is the Initial Loss in inches. (or 10-100 SCS Curve Number CN)

0.13 is the Constant Loss in inches/hour.

5 is the Percent of Basin Impervious.

99.3 is the Basin Area in square miles.  

(Optionally, the SCS triangular unit graph recession ratio may be added and defaults to 1.67 if not included.)

(Also, the rate factor at which the initial loss or CN recovers may be added and defaults to 0.2 if not included.)

Formula1 contains the resulting runoff as P1.

 

 

 

If there is no rainfall the Initial Loss will begin being reset at the rate of 0.2*Constant Loss. In this case the Initial Loss will have returned to its initial value of 0.25 after 9.6 hours.  Note: This simple idea has been expanded and improved with new updates below.
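The recovery arithmetic above can be sketched in a few lines (illustrative only; the function name is mine, not GetRealtime's):

```python
def recovery_hours(initial_loss, constant_loss, recovery_factor=0.2):
    """Hours for the initial loss to fully recover with no rain,
    recovering at recovery_factor * constant_loss inches/hour."""
    return initial_loss / (recovery_factor * constant_loss)

# The example values above: 0.25" initial loss, 0.13 in/hr constant loss
hours = recovery_hours(0.25, 0.13)  # about 9.6 hours
```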

 

If there is rainfall, the computed and stored runoff will be carried out to the peak discharge. GetGraphs.exe can then display the predicted future runoff hydrograph up to the peak flow that the rainfall up to the last retrieval has produced.  If the computed peak flow will occur on a future day, like tomorrow, because of a large lag time, then that future peak value will be shown as the midnight value of the current day. This way GetGraphs.exe will at least be able to plot what the future peak will be.  Update: GetGraphs allows setting 'Days' and 'Future Days' so you can view more of the future hydrograph on larger basins.

 

For basins having significant indirect runoff that need better definition of the recession, you may optionally add 2 additional coefficients to describe the recession in addition to the Tp/Tr ratio (1.67). Here is an example of a GetRealtime setup data line for an extended recession.

 

NEXRAD-DAX-hour; 30425; Runoff; Big Cr nr Groveland, Ca W/adj Rain; 60; 8,0.55,0.10,1.0,16.3,1.67,0.2,0.3,3; P1

 

Where:

-hour=optional rainfall time-step hour or unit, defaults to unit if omitted (new 6/28/2012)

Base1=60 and optional base flow to add (new 6/22/2012)

8=basin lag time, hours

0.55=initial abstraction, inches

0.10=constant loss rate, inches/hour

1.0=percent impervious, %

16.3=basin area, sq.miles

------optional Tr/Tp and recession values-----

1.67=Tr/Tp

0.2=initial loss recovery rate factor. Recovery rate will be 0.2 * constant loss rate

0.3=fraction of rain excess applied to recession and removed from peak

3=factor used to multiply Tr by

------- 2nd Optional Seepage factors for a base flow (NOT SHOWN)-----

0.2=fraction of rain excess applied to recession and removed from peak

10=factor used to multiply Tr by

 

As shown in the figure below, version 2.0.1 now has the optional recession Tp located at the same time as the peak triangle. I made this method up and I think it works great... besides, I could not figure out what a gamma function is.  The optional recession can allow the recession to recede for days if needed. I don't know how true this is about interflow and baseflow. It's probably more like a quick and easy way to combine storage routing with the runoff unit graph computation, so be aware that if flow goes into the flood plain you may want to include a Modpul routing for that.

You may wish to use the free GetMapArea (More Stuff) to quickly evaluate the effects of lag, losses, and percent impervious, especially if your rainfall gage has a recording streamflow gage for calibration of the rainfall record.  With the new GetRealtime runoff setup wizard, it is even easier to view your parameters, make changes, then click and go to graphically compare effects on the hydrograph.  Calibrating actual runoff is most educational... just find a USGS streamflow gage and use Wunderground's Wundermap to see if a rain gage is available for honing your runoff skills using GetRealtime or GetMapArea.

 

 

 

Video how to for determining triangular unit graph coefficients:

 

Simulate SCS Curve Number Loss and add coefficients to Setup file and Database :

 

Update 1/2/2013--A Linear Reservoir Runoff Hydrograph method has been added. So in addition to the 3-Triangle Unit Graph method, this LRH method allows shaping of the hydrograph also. The Triangle Unit Graph and this Linear Reservoir are the only 2 unit graphs that let you dramatically shape your runoff hydrograph without changing peak lag time (or by selecting different DGF's below, but just a bit).

 

Linear Reservoir Hydrograph Equation:

Q2 = Q1 exp { −K (T2 − T1) }  +   RainExcess  [ 1 − exp { −K (T2 − T1) } ]

 

where the K factor can be adjusted to shape your basin's response. This is not quite as versatile as my 3-T unit graph method, but it requires no convolution of each excess increment and is quite simple to implement. The current time step runoff is based on the last runoff flow and rainfall excess, and should even allow erratic time steps.  So you could easily do runoff in an Excel worksheet sans convolution.  From the equation above you can see it is two competing exponential curves working against each other. One is concave up and one is concave down. Who thinks up this stuff???

 

I have included a lag time offset so that you can still get the peak at the right time using your usual lag time method of choice, but the K shape factor, not the lag time, determines the peak runoff and recession shape.  The shaping K-factor to mimic the SCS dimensionless unit-graph can be calculated as K = 1.129 * lag ^ -1.10391 and the lag time offset factor as lagAdj = 0.592 + 0.0281 * lag. You can use GetMapArea.exe to compare it to many unit-graph types and be impressed at how well it does for shorter lag times. And because there is no convolution into the future, this LRH method can pick up right where it left off without having to start the simulation earlier if runoff is still occurring.  Boy, where was I when they passed out this method!
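The LRH recursion and the K-factor fit above can be sketched as follows (a minimal illustration; variable names and the sample excess series are mine, not GetRealtime's):

```python
import math

def lrh_step(q_prev, rain_excess, k, dt_hours=1.0):
    """One step of the linear-reservoir hydrograph:
    Q2 = Q1*exp(-K*dT) + RainExcess*(1 - exp(-K*dT))."""
    decay = math.exp(-k * dt_hours)
    return q_prev * decay + rain_excess * (1.0 - decay)

lag = 2.0                           # basin lag time, hours
k = 1.129 * lag ** -1.10391         # shape factor mimicking the SCS unit graph
lag_adj = 0.592 + 0.0281 * lag      # lag time offset factor

# Step through a made-up excess series; no convolution needed,
# each value depends only on the previous flow and current excess.
q = 0.0
for excess in [0.0, 100.0, 300.0, 50.0, 0.0, 0.0]:
    q = lrh_step(q, excess, k)
```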

 

Use the Runoff Wizard for the setup. The Lag and K factor can also be 'What If'ed with the Wizard 'What If' button to see how it shapes the hydrograph... but I'm sticking to the 3-T unit graph myself.  The Tri-Linear Reservoir will also use the triangular unit graphs adjustment factors for combining 3 linear reservoirs.

 


  

Update 7/5/2013--Several Dimensionless Unit Hydrographs from the USBR Design of Small Dams or their Flood Hydrology Manual have been added. For larger basins you may find you need to construct your own dimensionless unit graphs to fit odd complex runoff patterns that even triangular unit graphs cannot handle. A dimensionless unit graph can easily be constructed from a USGS hydrograph. You just take the hydrograph as is, or some 1-hourly or 2-hourly interpolation to get around 100 values, remove the base flow, and multiply the hydrograph by 484.36 divided by the sum of the hydrograph values in cfs.  The final DGF ordinate COUNT at 50% volume accumulation is 242.18, with the DGF unit duration as 100/count... not time. See my DGF files for the format. Actually, if you did a DGF for any basin with a USGS gage, you wouldn't need my 3 triangle unit graphs. And remember that GetRealtime can compute the base flow to remove in its wizard's runoff output text file. Everyone should try making their own DGF just to say they have.  And basins with lag times (peak rain to peak flow) greater than 33 hours (5.5*6=33) are getting pretty big and you should think about subdividing.  Forget the 33-hour lag limit if your basin has different interior regions of odd contributing shapes and losses like urban areas; then even 6 hours is pretty risky, so use your head.
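The DGF construction recipe above (remove base flow, scale by 484.36 over the sum) can be sketched like this; the function name and sample hydrograph are mine for illustration:

```python
def make_dgf(hydrograph_cfs, base_flow=0.0):
    """Build dimensionless graph ordinates from an observed hydrograph:
    subtract base flow, then multiply each ordinate by
    484.36 / sum of the direct-runoff hydrograph values in cfs."""
    direct = [max(q - base_flow, 0.0) for q in hydrograph_cfs]
    scale = 484.36 / sum(direct)
    return [q * scale for q in direct]

# A toy 7-ordinate hydrograph with 10 cfs of base flow removed
dgf = make_dgf([10, 50, 200, 120, 60, 30, 15], base_flow=10)
# By construction the scaled ordinates sum to 484.36
```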

 

 

Update 10/7/2014--Clark Unit Graph Method:

The Clark time-area histogram uses the synthetic time-area curve from COE's Hec-1 for an ellipse:

t < 0.5: area = 1.414 * t ^ 1.5

t>= 0.5: area = 1 - 1.414 * (1 - t) ^ 1.5

The Clark time of concentration Tc is your Lag time in hours. Your Clark routing factor C (default 0.75) is used to provide a Muskingum routing travel time K as K=C*Tc. The Clark Unit Graph has Tc/Unit_Duration steps of t=0 to 1 above.

 

Excess rain is convoluted with the unit graph and then the current runoff value is routed with the Muskingum method using K with X=0. This Clark guy was pretty damn clever. All you provide is 2 inputs, Lag time and C, thus providing your hydrograph timing of the peak and shape of the recession. What a sweet deal!
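The Clark pieces described above (the HEC-1 elliptical time-area curve plus Muskingum routing with X=0) can be sketched as below. This is my illustration under stated assumptions, not GetRealtime's code; a 1-hour unit duration is assumed:

```python
def clark_time_area(n_steps):
    """HEC-1 synthetic time-area curve for an ellipse, returned as
    incremental areas for t = 0..1 in n_steps equal steps."""
    def cum(t):
        return 1.414 * t ** 1.5 if t < 0.5 else 1.0 - 1.414 * (1.0 - t) ** 1.5
    cums = [cum((i + 1) / n_steps) for i in range(n_steps)]
    return [cums[0]] + [cums[i] - cums[i - 1] for i in range(1, n_steps)]

def muskingum_route(inflows, k_hours, dt_hours=1.0, x=0.0):
    """Muskingum routing; with X=0 this is pure linear-reservoir storage."""
    denom = k_hours - k_hours * x + 0.5 * dt_hours
    c0 = (0.5 * dt_hours - k_hours * x) / denom
    c1 = (0.5 * dt_hours + k_hours * x) / denom
    c2 = (k_hours - k_hours * x - 0.5 * dt_hours) / denom
    out, o_prev, i_prev = [], 0.0, 0.0
    for i_now in inflows:
        o_prev = c0 * i_now + c1 * i_prev + c2 * o_prev
        i_prev = i_now
        out.append(o_prev)
    return out

tc = 4.0                    # time of concentration = lag time, hours
c = 0.75                    # Clark routing factor (the default above)
k = c * tc                  # Muskingum travel time K = C * Tc
areas = clark_time_area(int(tc / 1.0))   # Tc / unit-duration steps
routed = muskingum_route(areas, k)       # the routed unit response
```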

 

I thought this was such a sweet deal I thought I could affect hopefully the peak shape or at least the recession beyond just lagging. So I tried maintaining a time-area histogram sum of 1 but varying the 1.5 power above from 0.5 to 2, with power=1 a rectangular basin and 0.5 a figure 8. Not a lot of difference. So then I tried taking the power=1.5 unit graph and resorting it for peaks at the 1/3 and 2/3 points. Not much difference again. So then I tried a 1.5 power from 0 all the way to 1 with not much difference, so then 1 to zero. Uh oh, that was a mistake because half the runoff vanished. So in conclusion, if you want to reshape a unit graph either use 3 parallel linear reservoirs on smaller basins, 3 triangles on larger, or create your own dimensionless graph, because there are no free lunches at Clark's diner. And always start with the SCS dimensionless graph before wasting any more time with foolish ideas. Ok, maybe Clark with only somewhat different recessions with the same peak is somewhat helpful. A really smart fellow would subdivide and route but that is blasphemous.  Update 11-7-2014, not so fast. I just put some basins together in the limestone area of northern Alabama with very low dry curve numbers of 40. The runoff has a lot of interflow and recession with smaller peaking. The Clark method came through perfectly but with a routing factor of 6 times the lag time. Hey, that Clark guy really was clever!!!


 

 

Update 7/15/2012 V2.3.0 now supports SCS Curve Numbers CN 10-100 with CN recovery when it is not raining.  My new CN recovery method makes possible the use of this popular loss method, and because it recovers during dry periods the CN method can now be used in GetRealtime continuous rainfall-runoff simulations.  It seems to work even better than the 'initial loss' method and is easier to calibrate (the TR-55 urban manual is a must read also).  Although the CN loss method is an empirical relation between total rainfall and total runoff developed for pretty restrictive conditions, we will simply MAKE IT WORK by using additional loss factors for infiltration into S and return factors for recession flows for S recovery in equation 1 below.  S is treated as an actual physical property, as inches of water stored in the soil. Both a soil water return and a groundwater reservoir return can be used.

 

The CN runoff  *EVENT*  equations for CN 10 to 100:

Q = \frac{(P-I_a)^2}{(P-I_a)+S}

For continuous simulation when raining, P is increased by each time step's rainfall amount until the S saturation value is reached (rare). S is increased by the infiltration amount until S saturation is reached.  Infiltration is the time-step rainfall minus the computed runoff.  Everything is in inches.

 

For continuous simulation when it is not raining, P is decreased each time step by the recovery rate based on soil factor and full ET. S recovers by the recovery rate but ET is reduced by S to soil storage range ratio so that saturated soil uses full ET and dry soil has zero ET.
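The event equation driving this scheme can be sketched as below. This is only the incremental-runoff bookkeeping for the raining case (my illustration; GetRealtime's actual S accounting and recovery are more involved, as described above):

```python
def cn_s(cn):
    """Potential retention S (inches) for a curve number: S = 1000/CN - 10."""
    return 1000.0 / cn - 10.0

def cn_q(p, s, ia_factor=0.2):
    """SCS event runoff Q = (P - Ia)^2 / ((P - Ia) + S), Ia = ia_factor * S."""
    ia = ia_factor * s
    return 0.0 if p <= ia else (p - ia) ** 2 / ((p - ia) + s)

s = cn_s(70)                  # about 4.29 inches for CN = 70
p = q_prev = 0.0
for rain in [0.2, 0.5, 0.8, 0.3]:      # inches per time step
    p += rain                          # accumulate rainfall while raining
    q = cn_q(p, s)                     # cumulative runoff to date
    excess = q - q_prev                # this step's incremental runoff
    infiltration = rain - excess       # what soaks in this step
    q_prev = q
```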

 

Update 3/18/2015:  You can vary the Initial Abstraction Ia to adjust early runoff. Nothing more than using a different factor in the runoff equation is done, for example Ia=0.05: Q=(P-0.05*S)^2/(P+0.95*S).  In this case, an initial abstraction of 0.05 is used instead of the original SCS 0.2, with nothing changed about S. These differences are quickly lost after the initial abstraction period, unlike using equivalent CN's, which really do seem to change things.

 

To simulate vegetation and canopy interception, infiltration (rain-excess; see below) does not begin until P>0.05*S for months April-November and 0.01*S for the other winter months. This helps on very low curve numbers used in forests. You may have to see it to believe it, but for CNdry=40, Sdry=18", then 0.9" gets stuck up in the trees, moss, and leaf litter. Another 2.7" infiltrates before runoff begins. Only the 2.7" and above needs recovery due to its much slower recovery rate.

 

Update 3/29/2013: An 'Interception Factor' for the growing months has been added so that the 0.05 factor used above can now be set by the user; it defaults to 0.03 if not given and to 1/3 the factor Dec-Mar (see Runoff Wizard). This interception is included in the CN initial 0.2*S abstraction. This interception can also be used with the IL initial loss method but would affect the initial loss, so you might want to set the factor to zero.  Here is my guess at 'Interception Factors' you might try: 0.05=Forested, 0.03=Grasslands, 0.01=Urban... or maybe they are all 0.05 due to the varying CN used???

  

And it would be nice for someone to come up with a Recovery Rate versus Curve Number but alas, that will have to be you.  I'm speculating here but this might help estimate soil recovery Factors.  Assume a 0.1 in/hr constant loss rate to go with these recovery factors to get you started.

 

Conductivity values suggested by Rawls, Brakensiek and Miller (1983)


*Note: See tip #1 below and also K recession index method below.

 

Loam has a conductivity of 0.134 in/hr, which sounds good for a constant loss rate. Conductivity seems to imply that the recovery factor should be 1.0 to match the saturated constant loss rate which I don't think will work. I have been looking at recovery rate factors of 0.05 to 0.5 times the constant loss rate, but who knows. I thought somebody had figured this all out back in Horton's day. Ah ha! The soil can't be saturated if it is recovering! Geez, sometimes I amaze myself. ... but still begs a solution to the recovery factors.  As long as I am this far out on my limb, I propose the recovery factor to be related to suction head h times conductivity C, for loam is 3.5*0.134 or 0.469. I wanted the recovery rate to be 0.05 so my fudge factor is 0.1 as shown in the above table.  Case solved. I got your science right here! Oye Vey...

 

Some of my calibrated CN values and recoveries you might try for ungaged basins:


Note: The Arizona basins were so huge that I would not rely on their values.

  

Update 7/21/2012: Giving this some more thought, the soil initial loss recovery is probably not so much a drainage problem but an Evapotranspiration problem. I still think the above recovery factors are ok for the constant loss (ETo=0.05 to 0.3 in/day versus conductivity 0.1 in/hr ???), but the recovery should be patterned as hourly ET. I looked at some summer and winter hourly ETo's I have been computing for a Wunderground gage near Colby, KS. My ET's were standard ASCE Penman-Monteith reference ET for tall alfalfa. What is important is the 24-hour distribution of ET. I selected July 16, 2012 for the hourly pattern, about typical. GetRealtime and GetMapArea now calculate the unit time-step recovery based on this distribution. Update: you can now use a nearby ETo record as shown below, and this is recommended.

  

Update 7/22/2012: After sleeping on it, I decided the best solution to the initial loss recovery between events was to go back to the constant recovery rate but add to this the mean daily Eto for Colby, KS with the July 16 hourly pattern below.  Here are the Eto hourly factors and the daily mean ETo calculation:

 ETo=0.17+0.12*SIN(0.017214192 * (284 + Date))

  


 

7/1/2011 to 6/30/2012 Kansas 1 year total ET: 60 inches.

 

Hourly Eto Factors (0-23 hr):

"0.010397326, 0.005198663, 0.003713331, 0.002970665, 0, 0.001485332, 0.00037133, 0,
0.00965466, 0.027478648, 0.048644634, 0.070181953, 0.088377274, 0.10731526, 0.11548459, 0.122539918,
0.107686595, 0.093575938, 0.083921277, 0.059784627, 0.030449313, 0.008169328, 0.001113999, 0.000742666"

 

Monthly moFactors applied to your KcFactor (see Wizard below): "0.50,0.50,0.63,0.74,0.79,0.84,1.00,1.00,0.84,0.79,0.74,0.50"
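Putting the pieces above together, an hourly ETo can be sketched as the daily sine-curve ETo times the hourly factor, the monthly factor, and the Kc. This is my paraphrase for illustration; the function names are mine, and the 0.8 Kc default is the one discussed below:

```python
import math

# The 24 hourly ETo factors quoted above (they sum to ~1.0)
HOURLY_FACTORS = [0.010397326, 0.005198663, 0.003713331, 0.002970665, 0,
                  0.001485332, 0.00037133, 0, 0.00965466, 0.027478648,
                  0.048644634, 0.070181953, 0.088377274, 0.10731526,
                  0.11548459, 0.122539918, 0.107686595, 0.093575938,
                  0.083921277, 0.059784627, 0.030449313, 0.008169328,
                  0.001113999, 0.000742666]

# Monthly factors applied to the Kc factor, Jan-Dec
MONTHLY_FACTORS = [0.50, 0.50, 0.63, 0.74, 0.79, 0.84,
                   1.00, 1.00, 0.84, 0.79, 0.74, 0.50]

def daily_eto(day_of_year):
    """Mean daily ETo: ETo = 0.17 + 0.12*sin(0.017214192*(284 + day))."""
    return 0.17 + 0.12 * math.sin(0.017214192 * (284 + day_of_year))

def hourly_eto(day_of_year, month, hour, kc=0.8):
    """Distribute the daily ETo to one hour of the day, inches."""
    return (daily_eto(day_of_year) * HOURLY_FACTORS[hour]
            * MONTHLY_FACTORS[month - 1] * kc)
```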

 


  

To review, there is no need for the user to input any information about ET. Hourly ET will be computed based on the above factors. If the user wishes to improve calibrations, then the first step might be to adjust the 0.8 KcFactor. If further ET refinement is needed, then an ETo record can be calculated at a nearby Wundergage. Contrary to what I would have first thought, it probably won't be the wetter areas that need ET refinement but the desert regions, where the Kc crop factor for desert plants can take what they get.  And do not forget that the soil recovery ET is again factored by the current basin soil storage to simulate stressed conditions.

 

  

Update 7/24/2012 I think I got it:

BIG tip of the hat to "A Parsimonious Watershed Model" !!!

(After looking it up, I was into parsimonious when parsimonious wasn't cool.)

 

The idea here is to keep track of Storage for computing precip loss and runoff as:

 

Storage = Storage0 + Infiltration - ET - GroundwaterOutflow    (what a novel concept, sheesh!)

 

Where:

Infiltration = Rainfall - Runoff

and

GWoutflow = ConstantLoss * RecoveryFactor

and

ET and GWoutflow are scaled by:

Factor=(StorageDry - Storage)/(StorageDry - StorageWet)

...but the top soil (RainSum or P above) recovers at full ET and full GWoutflow.

GWoutflow (soil water) contributing to runoff can be reduced by Soilwater_Loss_Factor.
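One bookkeeping step of the storage accounting above can be sketched as follows. This is my paraphrase under the stated equations, treating Storage as water stored in the soil between dry (low) and wet (high) limits, with ET and GW outflow scaled by the storage-range factor; it is not GetRealtime's actual code:

```python
def storage_step(storage, rain, runoff, et_full, gw_full,
                 storage_dry, storage_wet):
    """Storage = Storage0 + Infiltration - ET - GWoutflow,
    where ET and GWoutflow are scaled by
    Factor = (StorageDry - Storage) / (StorageDry - StorageWet),
    so saturated soil gets full ET and dry soil gets zero."""
    infiltration = rain - runoff
    factor = (storage_dry - storage) / (storage_dry - storage_wet)
    et = et_full * factor
    gw_out = gw_full * factor
    return storage + infiltration - et - gw_out, gw_out

# Toy numbers: 2.0" stored, 1.0" rain, 0.3" runoff, limits 0.5" dry / 5.0" wet
new_storage, gw_out = storage_step(2.0, 1.0, 0.3, 0.2, 0.1, 0.5, 5.0)
```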

 

Storage for the SCS CN method is CN Storage and for the Initial Loss Method is Initial Loss. This storage is limited to wet and dry condition values you enter or my automated values:

 

For CN method, if you do not enter the parameters CNwet and CNdry then:

CNwet=CN/(0.4036+0.0059*CN)

CNdry=CN/(2.334+0.01334*CN)

  

For Initial Loss method, If you do not enter the parameters ILwet and ILdry then:

ILdry=InitialLoss

ILwet=0 
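The automated CN limit formulas above, together with the standard S = 1000/CN - 10 conversion, give the storage range GetRealtime works between. A minimal sketch (function names are mine):

```python
def cn_limits(cn):
    """Automated wet/dry CN limits from the formulas above."""
    cn_wet = cn / (0.4036 + 0.0059 * cn)
    cn_dry = cn / (2.334 + 0.01334 * cn)
    return cn_wet, cn_dry

def cn_to_s(cn):
    """Potential retention S in inches for a given curve number."""
    return 1000.0 / cn - 10.0

# For an average CN of 70, the automated limits work out to roughly
# CNwet 86 and CNdry 21, i.e. a storage range of about 1.7" to 37"
wet, dry = cn_limits(70)
s_wet, s_dry = cn_to_s(wet), cn_to_s(dry)
```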

 

Add the last parameter code on the end as -1 to create the file 'Runoff.txt' and try graphing the output info to see what is going on. GetMapArea and GetRealtime's runoff wizard have both been updated to easily show graphically the effects of parameter changes.

 

GetRealtime_setup.txt shift cell parameters examples:

Initial Loss Method:

6.5,1.1,0.1,2.4,90.8,1.0,0.2,0.4,3,0.3,6,-1 <<<auto Loss limits and -1 means create info file 'Runoff.txt'

6.5,1.1,0.1,2.4,90.8,1.0,0.2,0.4,3,0.3,6,1.4,0.5,0.5,-1 <<<has initial loss limits and SoilLossFactor

  

SCS Curve Number Method:

6.5,70,0.1,2.4,90.8,1.0,0.015,0.4,3,0.3,6,-1 <<<auto CN limits

6.5,70,0.1,2.4,90.8,1.0,0.015,0.4,3,0.3,6,83,60,0.5,-1 <<<has CN limits and SoilLossFactor

  

Do not forget to remove the -1 'create file Runoff.txt' parameter when you do not need it, and use the Runoff Wizard for the full monty.

  

Example of optional runoff coefficients for best results:

Update 1/6/13: You may be wondering where the 'Parsimonious' went with all these options, and I do too... At least remember this>>> See the 'Compare DSID' box on the runoff 'What If' screen above. Here is where you can compare your calibration run to the USGS runoff graphically. Enter the USGS DSID here and hit Cancel if not 'What If'ing, and it will add the hourly USGS graph to your run's graph. This is something not to forget.

  

The parsimonious model cited here simulates 2 storages, the unsaturated zone soil moisture using the CN method, and the saturated zone using a constant outflow parameter (but no direct return from the soil). I have combined the 2 into one storage where groundwater outflow is scaled with the amount of CN storage. When storage hits CNdry, there is no groundwater outflow (even less... uh more parsimonious!). Perhaps at a later date I may see the error in my ways and use their 2 storage method instead but for now I like what I see. 

 

Update 11/23/2012: I have seen the error of my ways and added a groundwater table storage component using the linear reservoir method (see runoff unit graphs above). Although the single soil storage provided by the SCS curve number or Initial Loss method works very well for wetter climates, it was found desirable to add a groundwater table out west and possibly for droughts. Setting the new Groundwater Loss Factor to 1.0 for wetter climates will provide the same results as before. The new groundwater table storage receives channel recharge as 20% (hardly noticeable) of runoff OR as saturated soil when infiltration is above the max of the soil storage range OR as unreturned soil infiltration Soil Loss Factor * Infiltration * soil storage range  Factor above.  The recovery rate and groundwater outflow K-factor is Constant_Loss * Recovery_Factor for the soil above as in/hr.  The daily starting Groundwater Table outflow uses the datatype_ID of 5 and no GetAccess setup is needed unless you wish to view them.  Why groundwater table recharge during runoff and not just above maximum soil storage??? For arid lands, about the only time there is water table recharge is when there is flow in channels, not because of saturated soils (capillary action is always back up for these blood sucking cactus and they don't call it caliche for nothing). And what if you over estimated max soil CN storage, well I just fixed it for you, parsimoniously.  Unreturned soil infiltration in most cases will be far the largest inflow component to the ground water table but you can juggle all these factors to fit and by no means is this a mass balance, but after doing a few basins you may start to see some rhyme and reason to these factors. Let me know if you do. ;-)   So far I have found that the GW Table returns are used for the faster runoff recession and the Unsaturated Soil returns are used for the longer duration base flow recession but I guess you could reverse this (?).  
In a snowmelt area with bogs in New York I reversed this and use a very slow 0.05 Ground Water Recovery Adj.  The GW Table returns can also be lagged by setting the GW Lag value in hours for a Muskingum routing of the linear reservoir return with a Muskingum X=0.  The Muskingum routing overcomes the linear reservoir's flashy filling, which seems unavoidable but is very acceptable for most fast interflow needs for those not making their own dimensionless unit graph. Just another tool that probably no one needs.
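The Muskingum lag of the linear reservoir return can be sketched like this. This is my own illustrative Python, not GetRealtime's code; the function name, time step, and initial condition are assumptions. With X=0 the routing behaves as a pure linear-reservoir lag with K taken as the GW Lag in hours:

```python
def muskingum_route(inflows, K, dt=1.0, X=0.0):
    """Route a flow series through Muskingum storage.
    K = lag (hours), X = weighting (0 gives linear-reservoir behavior)."""
    denom = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / denom
    c1 = (dt + 2.0 * K * X) / denom
    c2 = (2.0 * K * (1.0 - X) - dt) / denom
    out = [inflows[0]]                     # assume outflow starts equal to inflow
    for i in range(1, len(inflows)):
        out.append(c0 * inflows[i] + c1 * inflows[i - 1] + c2 * out[-1])
    return out
```

Note the coefficients sum to 1, so total volume is conserved while the peak is attenuated and delayed, which is exactly why it smooths the reservoir's flashy filling.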

 

A Groundwater Riparian Factor can be used for seasonal stream loss or even gain by applying this factor to a sine wave (peaking at +1/-1 on Aug 1, crossing 0 on Nov 1) times the unsaturated zone soil returns (not the groundwater table, but including Base Flow if any): Qsoiladj = Qsoil + GRF * Qsoil * Sine(date). Why a sine wave and not adjust based on ET???  Solar radiation is a sine wave peaking on June 21, crop use factors lag it into August, and the Riparian Factor fudges the streamside area and Kc, so it is ET.  Update 9/19/2013: now includes the Base Flow in the riparian adjustment.
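A minimal sketch of that adjustment. The cosine centered on Aug 1 matches the description (peak Aug 1, zero about Nov 1); the exact phase and the 365-day year are my assumptions, and the function names are illustrative:

```python
import math
from datetime import date

def riparian_sine(d):
    """Seasonal factor: +1 near Aug 1, ~0 near Nov 1, -1 near Feb 1 (assumed phase)."""
    aug1 = date(d.year, 8, 1).timetuple().tm_yday
    return math.cos(2.0 * math.pi * (d.timetuple().tm_yday - aug1) / 365.0)

def riparian_adjust(q_soil, grf, d):
    """Qsoiladj = Qsoil + GRF * Qsoil * Sine(date), per the formula above."""
    return q_soil * (1.0 + grf * riparian_sine(d))
```

So a GRF of 0.2 raises the soil return 20% at the Aug 1 peak and lowers it 20% at the Feb 1 trough.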

 

GetRealtime's flow accountings:

  

Although "A Parsimonious Watershed Model" cited used a daily time step, my intent at first was that GetRealtime be used only for real-time 5-minute Nexrad Radar rainfall events of a day to a week in duration. But in actually doing some calibration over a two-month period, GetRealtime also would work just as well for a year-long or greater period at 5-minute, hourly, or daily time steps, with much more sophisticated snowmelt and ET than proposed by the model cited, without introducing any additional calibration parameters... just wouldn't be as cool as real-time though. I had not planned on it, but the cited study authors make such a good case that I foresee adding examples for longer duration studies also, just to see if it really works, so check back often.

 

To check for issues for long simulations I ran an HOURLY runoff computation for 100 years on my Windows XP Home Edition with 1.28 GB memory. It takes 7 minutes to complete the computations with 328 MB of peak memory use. I also ran a 5-MINUTE runoff computation for 30 years and it took 8 minutes for the computation and another 1 minute to write the hourly values and 5 minutes for unit values to the GetAccess database with 902 MB peak memory.  I tried 200 years of hourly history but something limits the start to 1900, so I tried 200 years into the future and it ran fine with 666 MB of peak memory. So I tried 300 years into the future and used 938 MB ok. 400 years???... finally "Out of string space!"... out there somewhere... beyond the beyond... something is controlling transmission... there is nothing wrong with your computer... you have reached... The Outer Limits!!!  (Actually I did run 500 years but used 1400 MB of virtual memory. Don't get your hopes up, 32-bits can only use 2000 MB.)  I did not know that GetRealtime would run into the future, but it does.  It could project future streamflow recession assuming there is no rainfall.  Hmmm... somebody clever enough should look into how to read the NWS's QPF rain forecasts and create a future hourly rain record. I know they do it somehow.  (I have, see below.)

  

If anyone is actually running 100 year simulations, something to consider for speed is that during EXCESS PERIODS a regular dimensionless UHG will be 3 times faster than a three triangular UHG and a linear reservoir UHG will be 100 times faster than a dimensionless graph due to convolutions... in theory.

  

After giving my all to get this working, I just learned HEC-1 was updated in 1998 to handle continuous simulations as HEC-HMS... dang!!! Just when I thought I had invented a new mouse trap. Where have I been?!?  But if it's any consolation to your confidence in me, I also just learned while reading up on HEC-HMS that Leo Beard wrote the original HEC-1 Fortran programs.  Hey Leo!  I worked with and learned from Leo in the mid 80's on flood frequencies, but he never let on he wrote HEC-1, although he had Fortran code for everything else to share.  But I see the HEC-HMS Soil Moisture Accounting uses the infamous Green-Ampt; not very parsimonious of them. So give my methods a try and compare. You're going down Leo. ;-)  Don't worry.

 

Example of a 5-minute calibration for a 32 day period for an urban area with real-time radar adjustment:

  

Example of a groundwater table with the Base Flow Riparian Factor used to increase streamflow.

 

Hydrograph Recession Slope (specific yield as RecoveryFactor):

 

Did you know that you can estimate the SOIL groundwater recharge rate (ConstantLoss*RecoveryFactor) from the hydrograph recession slope, and vice versa? Well, now you do.  Where this could come in handy is where you know all the streams in your region have a similar base flow recession hydrograph slope, measured as days per log cycle on semi-log paper. If you think you know the slope, then knowing the soil type you can get Carson's Recovery Factor for 0.1 in/hr Constant Loss from the table above. The table below will then give you the CNdry... or any combo of the 3.

 

For example, your recession slope index is 18 days; knowing the soil type is Silt-Loam B gives a Recovery Factor of 0.17 for Const Loss 0.1 in/hr, which is CL*RF = 0.017, and then from the table below the CNdry=65. If you have more faith in your CNdry than your soil map, say you think CNdry is 65; then the table gives CL*RF = 0.02, so Recovery Factor = 0.02/ConstLoss = 0.02/0.1 = 0.2.
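The arithmetic of that worked example, both directions (the 0.17 and 0.02 values come from the tables referenced above; variable names here are just for illustration):

```python
# Direction 1: soil map -> recharge rate
const_loss = 0.1            # in/hr, from the soil table above
recovery_factor = 0.17      # Silt-Loam B at an 18 day/log-cycle recession
cl_rf = const_loss * recovery_factor    # groundwater recharge rate, in/hr

# Direction 2: trusted CNdry -> Recovery Factor
# the CNdry=65 table entry gives CL*RF = 0.02, so back out the factor:
rf_from_cn = 0.02 / const_loss
```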

 

  

For the Linear Reservoir method try K factor = −ln (Q2/Q1)
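As a sketch of that K estimate, assuming Q1 and Q2 are flows one time step apart on the recession (the per-step form is my reading of the note; dt defaults to one step):

```python
import math

def linear_reservoir_k(q1, q2, dt=1.0):
    """K = -ln(Q2/Q1) over one recession time step, per the note above."""
    return -math.log(q2 / q1) / dt
```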

 

 

Update 7/25/2013: GetRealtime 3.1.2 includes a variable lag time that made a very significant improvement in some peak flow comparisons for both bias and %MAE. My variable lag is computed as Lag = Lag0 - PrecipSum / SoilSpace * Lag0 / 2.  For the Initial Loss method, Lag = Lag0 - PrecipSum / (3*InitialLoss) * Lag0 / 2.  This reduces the Lag0 entered for dry conditions to 1/2 Lag0 when saturated. The lag time and all unit graphs are updated at startup from soilspace or precipsum, and whenever the computed Lag changes by 5 minutes. So as rainfall increases, soil space is reduced, and the Lag time is reduced. Under dry conditions soil space increases and Lag time increases. Pretty clever eh!
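The variable lag formulas above can be sketched as follows. The clamping of the saturation fraction to [0, 1] is my assumption (so Lag never drops below Lag0/2), and the function name is illustrative:

```python
def variable_lag(lag0, precip_sum, soil_space=None, initial_loss=None):
    """Lag = Lag0 - frac * Lag0/2, where frac is PrecipSum/SoilSpace
    for the CN method or PrecipSum/(3*InitialLoss) for the Initial Loss method."""
    if initial_loss is not None:
        frac = precip_sum / (3.0 * initial_loss)
    else:
        frac = precip_sum / soil_space
    frac = min(max(frac, 0.0), 1.0)    # assumed clamp: dry -> lag0, saturated -> lag0/2
    return lag0 - frac * lag0 / 2.0
```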

 

I highly recommend the variable lag, so use it when you first start calibrating your basins. I never have, so when a big peak comes along I end up checking variable lag and just hoping it fixes the lagging peak. Include it when you start out calibrating even the smaller peaks. I usually end up increasing my original smaller peaks' lag time by 20% with variable lag once a monster peak comes along and I have to use it, so be smart and be ready already.

  

  

Update 8/3/2012: Use your own ET record for any Wundergage with Temp, Humidity, and Wind:

To use your own computed ETo from a nearby Wunderground gage, simply include the ET datatype_site_id in the table 'rsite' parameter code, like 10812,-27811, on your runoff record for either station_id COMPUTE or NEXRAD-ESX as shown below. See the GetRealtime computation examples for setting up your Wunderground ET computation.  Any missing ETo values in your record will use the default values above.

 

 

 

Update 10/13/2013: Added the 'Directly Connected Impervious Fraction' as cascading planes.  This allows the fraction of the % Impervious not directly connected to be removed from the excess and added to infiltration, but limited by the Curve Number loss method. A tip of the hat to Ben Urbonas and his "Stormwater Runoff Modeling: Is it as Accurate as We Think?"

  

From the Colorado Urban Drainage and Flood Control District for urban catchments.  The Levels in the graph are for the amount of Low Impact Development added between the impervious areas and collecting channels.  For 38% impervious, the graph shows 80% directly connected for minimal low impact development (Level 0).  This was a very important addition and badly needed for urban areas with large %Impervious.

Level 0: No improvements, Beaver Cleaver's neighborhood.

 

Level 1: The primary intent is to direct runoff from impervious surfaces to flow over grass-covered areas and/or permeable pavement, and to provide sufficient travel time to facilitate the removal of suspended solids before runoff leaves the site, enters a curb and gutter system, or enters another stormwater collection system. Thus, at Level 1, to the extent practical, impervious surfaces are designed to drain over grass buffer strips or other pervious surfaces before reaching a stormwater conveyance system. Houses like mine with clogged gutters.

 

Level 2: As an enhancement to Level 1, Level 2 replaces solid street curb and gutter systems with no curb or slotted curbing, low-velocity grass-lined swales and pervious street shoulders, including pervious rock-lined swales. Conveyance systems and storm sewer inlets will still be needed to collect runoff at downstream intersections and crossings where stormwater flow rates exceed the capacity of the swales. Small culverts will be needed at street crossings and at individual driveways until inlets are provided to convey the flow to the storm sewer. The primary difference between Levels 1 and 2 is that for Level 2, a pervious conveyance system (i.e., swales) is provided rather than storm sewer. Disconnection of roof drains and other lot-level impervious areas is essentially the same for both Levels 1 and 2.  Town of Bedrock, Fred Flintstone's neighborhood.

 

Suggested adjusting for Composite CN with unconnected impervious area (TR-55):

 

%Impervious>30: CNc= CNp+(Pimp/100)*(98-CNp)  not recommended.

%Impervious<30: CNc= CNp+(Pimp/100)*(98-CNp)*(1-0.5R)

where

CNc = composite runoff curve number

CNp = pervious runoff curve number

Pimp = percent imperviousness

R = ratio of unconnected impervious area to total impervious area.

 

Example:

1/4 acre residential Table 2.2a for Soil B gives 38% impervious and CN=75.

Connected Impervious Area from the Colorado figure above for Level 0 low impact development CF=0.80 for directly connected.

Composite CNc= CNp+(Pimp/100)*(98-CNp)*(1-0.5R) = 75+(38/100)*(98-75)*(1-0.5*0.2) CNc= 82.9

(or CNc=83.7 for %Impervious > 30% but throws out all of Colorado's work.)
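The TR-55 example above can be checked with a few lines; this is just the published formula, with names of my own choosing:

```python
def composite_cn(cn_p, pimp, r):
    """TR-55 composite CN with unconnected impervious adjustment
    (the %Impervious < 30 form; set r=0 for fully connected)."""
    return cn_p + (pimp / 100.0) * (98.0 - cn_p) * (1.0 - 0.5 * r)
```

With CNp=75, Pimp=38, and R=0.2 (from the Level 0 curve, 80% connected so 20% unconnected) this returns 82.87, matching the CNc=82.9 above; with R=0 it returns 83.74, the not-recommended fully connected value.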

 

  

Update 12/18/2013: Runoff computations done with datatype_id=30, when run by the number of past days, now check for a recession runoff date on or after the start-of-computations date; if there is one, the starting day is moved to the day of the starting runoff plus 1 day. This insures you are not cutting off a previous recession. The start and end of runoff are now stored in the table 'rupdate'. The earliest starting day for the run is then used for all subsequent routings, including the HEC-Ras and HEC-Hms shell routings. Runoff computed by historical dates does not store anything in the table 'rupdate' and does not self-adjust the starting date.  If you wish to override or clear the recession dates from the GetAccess table 'rupdate', just use the SQL checkbox and statement 'DELETE * FROM rupdate;'.

  

  

As I gain experience re-running past studies I will update any tips I might discover and perhaps add some comparative graphs somewhere. With only the 3 parameters CNdry, ConstantLoss, and RecoveryFactor to play around with, it's really not too bad, especially getting started by graphing the daily mean flows for calibration. I have already seen for the CN method that ConstantLoss and the Recovery Factor only move the hydrograph up and down, so they are just somewhat of a base flow gizmo, and that only leaves CN. WHAT COULD BE EASIER!!! (Let's see it work first, happy jack.)

 

Here are some things to remember about the continuous CN method:

0) At startup, S, P, and gwQ are read from HDB.

   datatype_ID 4 = Soil Storage Space S at end of day.

   datatype_ID -4 = Precip residual P at end of day.

   datatype_ID 5 = Groundwater flow rate gwQ at end of day.

  

   S is soil SPACE between say CNdry=10 inches and CNwet=1 inch.

   Srange= 10 inch - 1 inch= 9 inch used for soil recovery adjust speed S/Srange.

  

   Usually CN is only used to set the dry and wet limits of S. CN is not used after that but can be computed for display as CN=1000/(S+10).

  

Now the rainfall increments loop:

  

1) ===========WHEN RAINING============

   P=P+Rain increment

   S2 and S8 not changed

   abstraction = P - S2

   qp = (abstraction * abstraction) / (P + S8)

   loss = Rain - (qp - oldqp)

   excess = Rain - loss

   Infiltration= Rain - excess

   S = S - Infil (reduce SPACE)

   Sgw = Sgw + excess * 0.2 (this is magic groundwater made available if needed)

  

2) ===========WHEN NOT RAINING========

   P is reduced by evap and dGW (both by a factor based on P/Srange for fast recovery).

   S = S + evap (increase space)

   S2 = S * 0.2

   S8 = S * 0.8

  

3)============ALL THE TIME============

   S = S + dGW (increase space) where dGW is soil rate recession slope factor * S/Srange (slow recovery).

    

   Send Excess to unit graph.

   Send dGW to base flow

   Send linear reservoir routed Sgw to base flow if wanted.

End of loop
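The loop above can be put into a loose, runnable Python sketch. The state names, the max() guards, and the handling of S2/S8 (frozen during rain, refreshed between storms) are my simplifications; GetRealtime's real code surely differs:

```python
def cn_step(st, rain, evap, clrf, srange):
    """One increment of the continuous CN loop. st holds S (soil SPACE),
    S2/S8 (0.2*S and 0.8*S frozen at rain start), P (precip residual),
    oldqp (cumulative runoff), and Sgw (GW table storage)."""
    excess = 0.0
    if rain > 0:                              # 1) WHEN RAINING (S2, S8 frozen)
        st["P"] += rain
        abstraction = max(st["P"] - st["S2"], 0.0)
        qp = abstraction * abstraction / (st["P"] + st["S8"])
        excess = max(qp - st["oldqp"], 0.0)   # incremental runoff
        st["S"] -= rain - excess              # infiltration shrinks soil SPACE
        st["Sgw"] += 0.2 * excess             # channel recharge to the GW table
        st["oldqp"] = qp
    else:                                     # 2) WHEN NOT RAINING
        st["S"] += evap                       # ET opens up space
        st["S2"], st["S8"] = 0.2 * st["S"], 0.8 * st["S"]
    dgw = clrf * st["S"] / srange             # 3) ALL THE TIME: slow soil return
    st["S"] += dgw                            # returned water also opens space
    return excess, dgw                        # excess -> unit graph, dgw -> base flow
```

Driving it with a few 0.5-inch increments shows excess starting at zero until the initial abstraction (S2) is satisfied, then growing, with soil space shrinking all the while.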

 

 

My recipe for getting started at a new site now is:

Go here first, it's much simpler:  http://getmyrealtime.com/HarvardGultchExample.aspx

  

1--Use SCS dimensionless graph to start.

2--Set the lag time.

3--Adjust Initial CN = Dry CN + 1 to get the first peak.

4--Adjust Recovery Factor to get the next peak.

5--Set the % impervious to get the minor peaks below CN excesses.

6--Set Ground Water Loss Factor=1.

7--Play with the Soil Loss Factor to get the base flow.

8--Lastly try mixing up the Ground Water and Soil Water Loss Factors for base flows and try the Triangle Unit Graph or Linear Reservoir to better shape peaks.

9--Tweak the whole period as the months go by.

 

And it may help to step back and take a look at all this and remember where the water went. Always remember water is only lost by (a) Runoff and (b) Evapotranspiration:

 

1-Starting with Rainfall.

2-The CN loss method creates the Runoff.

3-Infiltration is the starting Rainfall minus Runoff (minus what gets stuck in the trees).

4-Infiltration goes into the CN storage.

5-When it stops raining, ET will reduce the CN storage.

6-Now there are no more losses beyond this point.

7-CN storage is reduced at the max rate set by constantLoss*recoveryFactor and returned to runoff.

8-The adjusted rate at which CN storage is returned depends on the current space (Sdry-S)/(Sdry-Swet).

9-The fraction returned in step 7 is set by Soilwater_Loss_Factor. (ok, the fraction not returned is lost unless the GW table is used.)

 

To start, look up your Soil Type and cover and get a CN. The Wizard will give you a corresponding CNdry. For your first runoff event, hold the CNdry constant and vary the starting CN (this varying CN is what the model will keep track of after the first event based on the recovery factor). If not satisfied, or after more events, adjust CNdry or recoveryFactor to improve peaks, ETfactor to flatten recession, and Soilwater_Loss_Factor to move it all up and down. If groundwater return is giving you a problem, then try the optional GW table.  Remember also that the hydrograph log-linear recession slope is also the constantLoss * recoveryFactor (see above graph).

 

Calibrations, here is my latest thinking 1/1/2016:

 

If you got things halfway working then skip steps 1, 2, 3, 4, 5.

1) Set Ground Water Loss Factor to 1.

2) Set Soil Loss Factor to 0.9

3) Set CN value to CNdry+1.

4) Set CNdry and leave CNwet at auto.

5) Try different DGF's: a) VillageCr, b) LittleTallapoosa, c) Maurice, d) Tallapoosa; the other DGF's seem about all the same. Then Clark, then Linear Reservoirs, then Triangles, then make your own DGF.

 ------------------------------------------

6) Set LONG TERM (a month) recession slope using MOST Important Soil Recovery Factor

7) Set Soil Loss Factor to give longer term flow or raise Base Flow.

8) Set Ground Water Loss factor and speed to get peak shape. Sometimes you may want a very slow gw return (0.05) if boggy and you need your soils drying out faster for CN. This reverses the Soil and GW roles.

9) Here you can go to steps 1-5 or continue below.

--------------------This here is optional------------------

10) Try changing Initial Abstraction (almost like changing CN but easier)

11) Try raising interception factor (.05) for forests.

12) Change %Impervious and Direct Connect for fitting small events.

13) Repeat 10,000 times until hair is gone... then do some thinking now that you see what each factor can do.

 

The runoff coefficients can now be checked and entered by using the 'Wizards' button on the Select Station List. Just click on your station, if present, and then click 'Wizards'. You will see a complete breakout of all your coefficients for editing.

  

  

 

Wizard CN selections Nov 23, 2020:  I've updated the Runoff Wizard CN automated dry and wet defaults when the CN value is changed. Previously the wet and dry CN's were computed from the entered CN being the middle of the range. Over the years I usually select a CNdry value as CN-1 and enter that in the CNdry text box. A lot of basins dry out within a few days to the CNdry value, so that's really the starting CN you want to calibrate for. Also, when starting a new basin, uncheck 'Last Run's Storage' to use the middle CN value; otherwise the CNdry is used for missing storage. For continuous modeling with 'Last Run's Storage' checked, the middle CN is not used, so all this is academic within a few days. Historical calibrations probably would use CN-1 for dry starting also ('Last Run's Storage' unchecked).

 

Old way:

CNdry% = cn! / (2.334 - 0.01334 * cn!)

CNwet% = cn! / (0.4036 + 0.0059 * cn!)

New way:

CNdry% = cn! - 1

CNwet% = 27.542 * Log(CNdry%) - 23.032
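Both conversions, side by side, in Python (taking VB's Log() as natural log; the function names are mine). The old way is the familiar AMC I/III conversion, so CN=75 gives roughly CNdry 56 and CNwet 89, while the new way gives CNdry 74 and a wetter CNwet of about 95.5:

```python
import math

def cn_limits_old(cn):
    """Old Wizard defaults: entered CN treated as middle of the range."""
    cn_dry = cn / (2.334 - 0.01334 * cn)
    cn_wet = cn / (0.4036 + 0.0059 * cn)
    return cn_dry, cn_wet

def cn_limits_new(cn):
    """New Wizard defaults: CNdry = CN - 1, CNwet from a log fit."""
    cn_dry = cn - 1
    cn_wet = 27.542 * math.log(cn_dry) - 23.032
    return cn_dry, cn_wet
```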

 

Getting all 20 of your CALIBRATED subs back on track (3/19/2015): Note that your base flow rises from the out-of-the-box dry condition with each rain event to eventually match the USGS gage flow. You could uncheck 'Last Run's Storage' to start with the wetter conditions of the initial CN and watch the base flow drop to eventually match the USGS gage flow, or try it both ways. But after the first day remember to recheck the 'Last Run's Storage' checkbox. This checkbox is only unchecked for startups and should always be checked thereafter for continuous modeling. For event modeling it would be unchecked.

 

The best way to quickly ADJUST ALL SUB'S base flow recessions and get back on track or up to speed is the 'Base Flow Adjust' on the 'Scheduled Batch' menu (uncheck Batch after setting). Try changing the '0.25' precip value to 1.25 and run and compare to the USGS flow and adjust as necessary. So if your adjusted radar or snowmelt really screws up, you can use this 'Base Flow Adjust' to get things back on track and ready for the next event.

 

Besides 'Base Flow Adjust', another easy way to adjust runoff is to use a factor on the Gage/Radar ratio like 1.3:

 

COMPUTE-hour; -31251; Ratio; Avg G/R Ratio Oneida Cr; 0; .02,1.2, 5; 1.3*P1/P2

 

Using a factor to adjust losses was tested and works very well but it screws up the base flow so was not added to GetRealtime code (except snowmelt).

 

If you would like to save multiple scenarios and variations on any of the computations, you can by simply attaching 'Run-' to the GetRealtime_setup.txt file's 'Station_ID', like 'RUN-COMPUTE-unit'.  Multi-scenario computations are described at the bottom of this page.

  

 

Radar Rainfall Adjustment

THE VERY MOST IMPORTANT THING OF ALL!!!

  

I have finally given up thinking radar rainfall could just be used as is for rainfall-runoff computations for sub-regional/local use. I had thought that adjustment of the runoff coefficients would be sufficient to overcome radar random errors and certainly seasonal bias. Wishful thinking. So a 'Radar Adjust Wizard' has been added to help you setup this much needed adjustment in GetRealtime. The above calibration example used this radar basin rainfall adjustment method described here using 2 gages. My Las Vegas Valley radar and runoff study uses single and average ratios for a more in depth look at how hourly radar adjustments improve point rainfall and runoff computation as well as do all of my current radar studies listed on my SiteMap.

 

The adjustment is based on hourly (or 2-hour or daily; forget unit values) ratios of a rainfall Gage or gages and the radar point rainfall at the Gage or gages.

 

Ratio = (G1+G2+G3...)/N / (R1+R2+R3...)/N

 

Note that for multiple Gages, the method used here is the ratio Gage_Average / Radar_Average.

 

Three Gage/Radar ratio methods came to mind when using more than one Gage:

1) (G1+G2) / (R1+R2)   ...Ratio of the sums.

2) (G1+G2)/N / (R1+R2)/N  ...Ratio of the averages.

3) (G1/R1 + G2/R2)/N  ...Average of the ratios.

  

Methods 1 and 2 are equal when all Gages are available, but method 2 was selected because it was not a trivial task to perform a conditional sum and remove the paired missing gage and radar in real-time. Method 2 will remove the missing Gage from the averaging setup correctly, but alas, both R1 and R2 will still be averaged for the denominator. If you have a bad apple Gage, remove it yourself.

 

Method 3 seemed like the best choice to overcome missing Gages until in actual practice it was found to be averaging the default ratio 1 whenever rainfall was not present and so was rejected.
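Method 2's behavior with a missing gage can be sketched like this (my own illustration; the None-for-missing convention and default ratio of 1 are assumptions based on the discussion above):

```python
def ratio_method2(gages, radars):
    """Ratio of averages: missing gages (None) drop out of the numerator,
    but all radar points stay in the denominator, as described above."""
    live = [g for g in gages if g is not None]
    if not live or sum(radars) == 0:
        return 1.0                       # fall back to the default ratio
    g_avg = sum(live) / len(live)
    r_avg = sum(radars) / len(radars)
    return g_avg / r_avg
```

Note how a missing G1 shifts the numerator to the surviving gages while both R1 and R2 are still averaged, which is the "bad apple" caveat above.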

 

In your GetRealtime Gages and Radars averaging setups, be sure to use the variable 'N' and not the actual number of gages, i.e.:   (P1+P2)/N   not   (P1+P2)/2

 

COMPUTE-Hour; 10818; Rainfall; Watauga Basin Avg of 3 Rain Gages; 0; 0; (P1+P2+P3)/N

And your GetAccess table 'rsite' parameter_code could be 10811,10814,10816 with station_id COMPUTE.

(similarly for your radar points)

COMPUTE-Hour; -10818; Rainfall; Watauga Basin Avg of 3 Radar Points; 0; 0; (P1+P2+P3)/N

And your GetAccess table 'rsite' parameter_code could be -10811,-10814,-10816 with station_id COMPUTE. 

(Try COMPUTE-2Hour and COMPUTE-Day for long distances or winter.)

  

If (P1+P2)/2 is used then if one Gage is missing, a missing value is computed. How you come up with the Numerator and the Denominator is up to you... actually the Ratio record itself can be up to you, but for real-time computations within GetRealtime the above method seems the best approach so far.

 

These hourly ratios will then be applied to the 5-minute basin average radar rainfall record. The 'Radar Adjust Wizard' shows you how to construct this ratio setup. It also shows you the setup for adjusting the radar record with this ratio record.

 

Update 12/21/2012: If you run out of datatype_id 10 for precip/melt then you can now use datatype_id 11 interchangeably with 10 so you may wish to use dsid 11xxx instead of 10xxx to show that it is an adjusted rainfall. The GetNexrad boundary display option 'Show current list values' expects rain gage values to be datatype_id 10, radar to be -10, and adjusted radar to be 11 or -11. GetNexrad will then display the gaged values at the rain gages and adjusted radar for the subbasins.

 

Update 12/12/2013: Here is a tip for dealing with erratic rain gage records. Say there are 4 Wundergages in your area of interest. You have been using gages 2, 3, and 4 but not 1 for a 3-Gage average. Now say gages 3 and 4 start to under-report rainfall and you would rather use gages 1 and 2 but not 3 and 4.

 

Here is the tip, in the GetAccess HDB table 'rsite' parameter code put all 4 gage dsid's in the proper order as P1, P2, P3, P4,...etc.

For the 'rsite' table parameter codes: 10612,10613,10614,10617

P1= 10612

P2= 10613

P3= 10614

P4= 10617

With all 4 dsid's now in 'rsite' you can change the formula in GetRealtime_setup.txt as needed without changing the table 'rsite'. So to use just rain gages 1 and 2, the formula is now (P1+P2)/N. If you want to change back to gages 2, 3, and 4, then (P2+P3+P4)/N. And don't forget to change your 'rsite' table radar parameters and GetRealtime_setup.txt formula to average the proper radar also.  This update requires GetRealtime.exe as of 12/12/2013.

 

Update 4/18/2014: You can now use the GetGraphs.exe menu 'Gage Off/On' to toggle the use of any of the gages used in GetRealtime's (P1+P2+P3+P4)/N computation. This allows all the gages in the area to remain listed in both 'rsite' and GetRealtime_setup.txt, using the GetGraphs menu to turn off offending rain gages and radar at the gages. GetGraphs creates the file 'GetGraphs_OnOff.txt', which lists the offending gages and is read by GetRealtime.exe. GetRealtime assumes the on/off file will be in the GetGraphs folder, but if not it will look next wherever your HDB database file is located. The GetGraphs plots will have a yellow background when OFF. Be careful when doing historical studies to remember what gages were historically usable for your historical period, and to reset your offenders for current use.  'GetGraphs_OnOff.txt' is re-read every time GetRealtime runs so that even in batch mode any changes will take effect immediately.

  

To help you evaluate bad rainfall gages you can add a 3 on the end of your GetGraphs_setup.txt line like this:

.....; 65535 ; 16711680 ; 255 ;;;; 2 ; 3

For this example, the first 2 means cumulative plot, and the 3 means you want to display the Mean Absolute Error between the gage and radar. The differences between gage and radar are only calculated when either value is greater than 0.05 in/hr. The n value is the number of errors averaged. To make use of these MAE's you could compare them to each other. If several are near 50% and one is 80%, then the 80% gage could be turned off. Likewise, if most are 80% and one is 10%, turn off the 10% gage. The gage-weighted MAE = Sum G*|G-R| / Sum G*G. You can even use this MAE on your flow graphs to get a 2nd opinion of your eyeballs.
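The gage-weighted MAE above looks like this in code (a sketch using the formula and the 0.05 in/hr screening as given; the function name and pair-list input are my own):

```python
def gage_weighted_mae(pairs, threshold=0.05):
    """Gage-weighted MAE = Sum G*|G-R| / Sum G*G over (gage, radar) pairs,
    counting only pairs where either value exceeds the threshold (in/hr)."""
    num = den = 0.0
    for g, r in pairs:
        if g > threshold or r > threshold:
            num += g * abs(g - r)
            den += g * g
    return num / den if den else 0.0
```

Weighting by G means a 0.05 miss on a 1-inch hour counts far more than the same miss on a drizzle hour, which is what you want when ranking gages.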

 

In the following example MAE's, I always know that P8 Nucor Steel is the best rain gage ever, so I would turn OFF the gages that strayed from its MAE:

  

Remember and NEVER forget that radar adjustment is the most important thing of all. The more thought and inspection you give your ratioing scheme the better; it cannot be stressed enough. So instead of averaging gages to ratio as in the example above, it may be better to subdivide the basin by how many rain gages you have to work with. I'm no expert, so you should put more thought into this than I have.

 

Here are some Ratio tips that I have also included in the 'Wizard Notes':

1) The current hour adjustment will be the previous hour's ratio until 30 minutes of radar values are available and the difference between gage time and radar time is 10 minutes or less.

  

2) Don't use Inversion Suppression when adjusting radar rainfall; use the Ratio's Gage Minimum value of exactly 0.01 near radars and the radar will be zeroed when the Gage is zero.

 

3) Setting a high 'Gage Minimum' of say 0.05 will better insure a viable ratio is computed... but you will have to suppress the inversion layer near radars yourself. The above calibration used 0.01 'Gage Minimum' to suppress the inversion layer and trust to luck for the computed ratios.

 

4) Using 1 good rain gage for 1,000 square miles is better than half a dozen bad ones.  The assumption made here is that radar is spatially perfect for your region. Given just one of its point rainfall values, your region of the radar screen is then known... and wishes were fishes.

 

5) Never use a Wunderground airport (KABC) rainfall record for ratioing because it is impossible to tell which hour the rainfall fell in. Airports reset their rainfall at hh:53, hh:55, or hh:56 and should be avoided.  This is no longer true: because of the need for good data, GetRealtime now makes better use of airport gages by assuming hh:5x is the end of the hour and bases the hourly rainfall at this time. Not perfect, but maybe better than most other rain gages, so I would include NWS Airports when there is not much choice (winter).

  

6) For radar overshoot in winter you could adjust the radar when 0 by using the HOURLY average of the rain gages and adjust the radar like this:

Get radar:

NEXRAD-BGM; -10301; Radar Rain; Oneida Cr Sub1, NY

Get gage average:

COMPUTE-hour; 10251; Rainfall; Avg Rain Gages Oneida Cr; 0; 0; (P1+P2+P3)/N

Replace radar 5-minute values with gage average/12:

COMPUTE-unit; -10901; Radar Rain; Set Radar 0 to gage; P1=0; 0; P2/12; P1>0; 0; P1

 

where P1 and P2 are: -10301, 10251 in the table 'rsite' parameter_codes, and radar must be first because of the 5-minute values. Continue with the radar average and computation of the G/R average ratio.

 

It was found that in larger basins where a hailstorm can be present and no rainfall at the ratio gage, the default ratio value of 1 was a very very poor default. So using the last computed ratio worked exceptionally well in this particular case; the last computed ratio was 4 hours old.  Only ratios based on rainfall at the gage greater than 0.20 inches will be carried forward.  The ratio will be reset to the default ratio after 6 hours of no radar rainfall at the gage if it was greater than the default or after 1 hour if less than the default and again will probably only matter to large basins.

  

And again it is imperative to evaluate the rain gage somehow. I only use Wunderground rain gages available free in realtime (parameter_code = dailyrainin and NOT HourlyPrecipIn). One way to evaluate a gage is to compare the total rainfall for the year to date with other nearby gages; usually the best ones have the most rain, but some may have just been down for some reason. Then you should look for gages that have a fast rise and fall, because half of them are plugged with leaves. And the ones with the best 5-minute time steps are preferred, because getting a good reliable time step seems to require some sort of operator genius who will also properly maintain and calibrate the tipping bucket.

It is also nice to have a rain gage located upwind and outside the basin so that a ratioing value will be available, instead of a default, as the storm approaches. If you use 1 gage it should be near the basin centroid, and if 2 gages then they should be at opposite ends of the basin. And if no gages are in the area, maybe make a distance-vs-rainfall relation for daily values at what gages you can find and just use a daily distance-corrected factor?... but do something at least every day!

And check a high-speed radar loop of your area of interest for beam blockage; if there is any, your rain gages should have some relation to the beam-blocked areas. The blocked areas will appear as static radial lines or pie slivers of a slightly different color radiating from the radar center as the rapidly moving rain images pass through them, and they are much more common than you might think. And in winter be sure your rain gages are heated or you will get rainfall every day for the next week beginning at 8:00am.

  

I would test and retest this section because... It's the very most important thing of all!!!

  

Note the order of the 31229,-10230 parameters in table 'rsite'. Do not reverse.

  

  

Beam-blocked areas can also be revealed when a heavy inversion layer causes ground clutter, like this BMX radar image for Birmingham, AL. Of all the places for a radar to be blocked, it's over its city namesake. Birmingham requires a default ratio of 1.5 to overcome the beam blockage.  Ground clutter is removed on the NCQ and A2M radar images.

For comparison, here is Huntsville, HTX unblocked image for the same period:

  

Update 12/5/2012:

(See the much improved second Nowcast method update 12/16/2015 just below after reading how to set a track here.)

 

Use GetNexrad's radar tracking with 'Storm track forecasting' to provide up to 1 to 3 hours lead time based on storm type and tracking speed. Your basin or point of interest is offset upwind to tracking point #1 (Eulerian frame of reference) for rainfall computation, where distance and storm speed will compute the nowcast lead time. Write the nowcast unit values to GetAccess for real-time rainfall-runoff, and GetRealtime will overwrite them as true radar basin rainfall comes in. 

 

I have not had much experience with this yet, but some experimenting with how to add multiple forecasts, and in which order they overwrite into the GetAccess database, should get you going. Meaning if you had a 2-hour radar loop going, you could start a track out at 100 miles, write its forecast values to the database, then a track at 50 miles, then one at 20 miles... or pick up and move (convective) every 10 minutes, so you can see why some experience will help.  You be the judge.  And copy/rename the file RainLoop.txt for later evaluation of these forecasts.  Use GetAccess to add multiple runoffs graphically as you go.  Ain't science fun!!!

  

And to kick it up a notch, add the NWS 6 hour to 72 hour Quantitative Precipitation Forecasts.  Just use GetNexrad's Surface Obs' QPF option to set the start and duration and eyeball your basin's QPF precip and away you go.  (Update: This is old school. Just use FORECAST-NWS in GetRealtime_setup.txt per below.)  The QPF future precip will be reduced for actual radar history and nowcast values.  Your GetAccess database values will be 'source' coded as 13 for real radar values, 14 for tracking nowcast values, and 15 for QPF values.

  

====================================

One Way to do a GetNexrad Nowcast:

 (Eulerian frame of reference)

1) Using Iowa Mesonet get a 1-hour radar loop going (1-hour can vary) with a single subbasin boundary (not the Show List boundaries) with GetNexrad.

2) Zoom in some to your area and then double click the radar image to clear any mouse clicks.

3) When the radar loop reaches the end of the loop, click the top center text box to start stepping mode.

4) Click on the leading edge of the approaching storm cell for a set point, now back step to the beginning of the loop, and then forward step 2 to 4 steps and click the leading edge of the storm again.

5) With your 2 set points now set, click on the top center text box again to let the loop start looping again.

6) Check that the yellow circle on your tracking marches in lockstep with the leading storm edge. It may take a few tries so repeat steps by double clicking the radar image to clear tracking and start over at step 3.

7) With a nice tracking click the menu 'Overlays' button (not check box), then optionally uncheck the 'Storm track show set points' for better viewing, and check the 'Storm track forecasting' check box.

8) Your subbasin will be displayed at your set point near the start of the loop. Note that you didn't use the first loop step because the storm would already have been halfway through your basin.

9) Note the rainfall 'Sum' on the windows caption. It will show you if this Nowcast is worth using.

10) To use this Nowcast, click the main menu 'Save' button, check 'Save Radar Rainfall Loop', and click OK.

11) On the next menu check 'Send to GetAccess HDB', AND IMPORTANTLY change the 'DataType_Site_Ids' from the radar -10 ID to an adjusted radar ID of -11 or 11 depending on your setup, and click OK.

12) When the loop reaches the end, a message will tell you how many values were sent to the database.

13) Close GetNexrad.exe and open GetRealtime.exe and from the Select Station(s) list, select the -11xxx adjusted radar or NWS forecast station for the sub you used and click the 'Wizard' button.

14) From the 'NWS Forecast and Values Edit Wizard' window, on the lower left corner change the value 6 for Hr's to 2. Don't use 1 until I fix it for 1. Don't use 6 or the NWS forecast cannot update them. Now click the 'Edit 6-Hour Forecast' button and the next 2 hours of forecast will be displayed.

15) Note the date_times and also the leading 14 and 18, where 14 is your Nowcast and 18 is the NWS forecast. Check that the hourly values are what you want. You can edit the values if you like (or even skip the whole Nowcast and just edit these).

16) When satisfied, check the 'Write to all forecasts', and click the 'Save' button.

17) That's it, these 2 hours are now marked Source=14 so they will not be overwritten, and all -11 dsid forecasts are updated for all subs.

18) The radar will overwrite the source 14 values as it comes in, or you can overwrite manually with the NWS Forecast again by checking 'Overwrite Forecast' on the 'Scheduled Batch' menu and running the NWS forecast stations to replace the Nowcast 14 values with 18 NWS Forecast values.

19) Repeat your Nowcasts or edits as needed for longer storms and practice helps. You don't want to be reading all this when the lights go out.

 

And if you forget to save the Nowcast as a -11 or 11 forecast dsid, you can use the Forecast Wizard to bring up the -10 radar dsid with source 14 Nowcast values and copy them to your clipboard, retrieve a real 11 forecast, and paste the Nowcast values on the same hours. You could even check your last radar adjustment ratio to adjust these values while you're at it.

====================================

Update 12/16/2015: A much better Nowcast has been added to GetRealtime to AUTOMATICALLY provide maybe(?) up to 3 hours of forecast based on the LAST radar image GetRealtime downloads. Unless a low pressure center moves near the basin (update: or a squall line forms), the storm speed and direction are pretty constant. So using GetNexrad similar to above, you can save your storm track info to a file like:

 

NowCast_DAX_N0Q.txt that reads:

 

2015-12-17 19:39 :Date

2015-12-18 01:39 :Expires

0.6264341 :X pixels per minute east

-0.2337041 :Y pixels per minute south

24 MPH ENE 69°

 

GetRealtime can read the 2 east and south pixel speeds and, on the last radar image, move each of your subbasins upwind every 5 minutes to compute rainfall at that location (Lagrangian frame of reference). So for a Nowcast of 3 hours, 36 upwind locations of rainfall can be added in less than a second. The nowcasts will have source code 14.  The G/R ratio used for adjusting the nowcast period will be the average of the last ratio and the default ratio.
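The upwind stepping can be sketched like this; reading the two pixel speeds from the NowCast file and stepping 5 minutes at a time is from the text above, while the parsing, sign conventions, and function name are my assumptions:

```python
# Sketch: offset a basin's pixel location upwind for each 5-minute
# nowcast step, using the X/Y pixel-per-minute speeds from the
# NowCast_DAX_N0Q.txt example above. Signs assumed: +X east, +Y south.
def upwind_offsets(x_px_per_min, y_px_per_min, hours=3, step_min=5):
    offsets = []
    for n in range(1, hours * 60 // step_min + 1):   # 36 steps for 3 hours
        t = n * step_min
        # Upwind is opposite the storm motion, so subtract the travel.
        offsets.append((round(-x_px_per_min * t), round(-y_px_per_min * t)))
    return offsets

steps = upwind_offsets(0.6264341, -0.2337041)
print(len(steps), steps[0], steps[-1])
```

Each offset is where the basin boundary gets sampled on the last radar image for that future 5-minute slot.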

 

You will have to add your GetNexrad storm track folder to the GetRealtime_setup.txt like:

 

 NowCast Info folder=C:\GETPROJECTS\GETBIGCREEK\GETNEXRAD\

 

... and on your radar setup line you add 'nowcast 3' like:

 

NEXRAD-DAX; -10420; Rainfall; Big Creek Basin, Ca; 0; nowcast 3

or save to mtables:

Run-NEXRAD-DAX; -10420; Rainfall; Big Creek Basin, Ca; 0; nowcast 3

 

AND... to adjust hour 2 values as average of hour 2 and hour 3 use:

 

NEXRAD-DAX; -10420; Rainfall; Big Creek Basin, Ca; 0; nowcast 3, adjust type 1

 

Adjusting hour 2 is no longer recommended but you can try it if your hour 2 seems to need it.

  

Adjust type 2:  Adjusts hour 3 values for decay only, as a proration of the adjust factor from 0.6 to 0.4 over minutes 120 to 180, if the 5-min hour 3 rain count > 0.05" is greater than 3.  Assumes any big storm always gets smaller.  Remember we don't have to be good at hour 3, just better than the NWS always being 80% low, but without predicting doom.
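A sketch of the type 2 hour-3 decay proration; the 0.6-to-0.4 taper over minutes 120 to 180 and the count-greater-than-3 trigger are from the text, while the names and how the factor is applied are my guess:

```python
# Sketch: prorate an hour-3 decay factor from 0.6 at minute 120
# down to 0.4 at minute 180 (adjust type 2), applied only when the
# 5-min rain count > 0.05" exceeds the threshold of 3.
def hour3_decay(minute, count_gt_005):
    if count_gt_005 <= 3 or not 120 <= minute <= 180:
        return 1.0                       # no decay applied
    return 0.6 + (0.4 - 0.6) * (minute - 120) / 60.0

for m in (120, 150, 180):
    print(m, round(hour3_decay(m, count_gt_005=5), 2))
```

The factor would then multiply that step's nowcast rainfall, shrinking the far end of the forecast the most.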

 

Adjust type 3:  In addition to the hour 3 decay adjustment of type 2, type 3 will decay hour 2 only, with a prorated 0.8 to 0.6 factor, if the 5-min hour 2 rain count > 0.05" decreases by more than 3 from the last run's hour 3.  Hour 1 will decay only with a prorated 1 to 0.7 factor if the hour 1 count is 3 less than the old hour 2.   Hour 2 and 3 nowcasts are probably a fool's errand... so I'm up for it.

 

Actually I'm still figuring this out and I'll clarify all this eventually... hey, it takes a fool.  To see the storm growth counts and differences, put the main sub basin line first in your radar section of the GetRealtime_setup.txt because only one sub will be reported on. To see these values as counts, look at the end of the radar rainfall section text scroll on the GetRealtime view. To view the storm growth count differences when in scheduled batch mode, right click the GetRealtime icon in the system tray and select 'View radar' from the popup menu.  To view counts and reductions, use 'View Storm Count' on the GetRealtime maximized screen, or if in batch right click the system tray icon and select 'View Storm Count' there; it looks like "0 0 0 NNN" if there are no counts by hour or reduction. "-4 -3 4 YNY" would have had reductions in hour 1 (-4 diff) and hour 3 (4 count) if your scheduled batch interval was 60 minutes. If 30 minutes, the check difference would be -2; at a 15-minute or shorter interval the diff check would be -1. The hour 3 count check is always 4. An hour count of 4 out of 12 being greater than 0.05" usually indicates about 0.5 inches of raw radar rain for that hour.

 

NEXRAD-DAX; -10420; Rainfall; Big Creek Basin, Ca; 0; nowcast 3, adjust type 2

 

You can use GetNexrad loops with the 'Overlays' checkbox 'Stormtrack from file' to view a field of tracking points based on your nowcast speed and direction, to verify other points are consistent or still viable.

 

That's it. Once your storm track file expires the Nowcasts will no longer be added. And if you have some way to automate storm speed and direction, then you can update the above tracking file yourself or shell your program from GetRealtime.  My first time out with this Nowcast I was tracking storm cells continuously coming from the southwest. I had also seen a frontal line of storms early in the day and thought, so what. Then as the front approached, a squall line rapidly descended from the northwest (while general storm cell movement did not change) that my Nowcast was blind to; it aligned with my track and extremely overestimated hours 2 and 3 of rainfall as the squall passed. So when it really starts raining you better see where it's headed.

 

This is all I have to show so far: (I've started a gallery of nowcasts here.)

 

Update 1/3/2016: Automated tracking using NWS winds aloft. Once your manual tracking from above has expired, you can use automated speed and direction from the NWS winds aloft (the same Lagrangian frame of reference method as above, but automated).   I think this is the input to the NWS big kahuna forecast model that updates each hour.

 

http://rucsoundings.noaa.gov/

 

The Lat/Long is that of your basin. After watching a few storms and being frustrated with the wind direction, I hard wired the altitude for direction at 18000 ft. You don't have to change anything in your WindsAloft file; just know you set your altitude for speed, and direction now automatically uses 18000 ft. Update: optionally, you can add your own altitude for direction (>= your speed altitude) if you don't like 18000', like this in your WindsAloft...txt :

 

OP40 : title and not used

12000 :Altitude ft msl  speed

17000 :Altitude ft msl direction optional

33.56 :Lat

-86.73 :Long

30R75 :Optional supercell direction/speed adjust or 10R85 for fronts/squalls

 

Save to your tracking folder (GetNexrad) with radar code and product code:

 

    WindsAloft_GWX_N0Q.txt

 

Now the NOAA winds aloft will be checked up to every half hour and the storm track info file NowCast_DAX_N0Q.txt example above will automatically be updated.
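Turning a winds-aloft speed and bearing into the two pixel speeds in the NowCast file could look like the sketch below. The pixels-per-mile scale and the bearing convention (degrees toward, clockwise from north) are assumptions inferred from the example numbers above:

```python
import math

# Sketch: storm speed (mph) and bearing toward (degrees clockwise from
# north) to X-east / Y-south pixel speeds per minute. PX_PER_MILE is an
# assumed image scale, roughly right for a ~1 km resolution product.
PX_PER_MILE = 1.67

def pixel_speeds(speed_mph, bearing_deg):
    px_per_min = speed_mph / 60.0 * PX_PER_MILE
    rad = math.radians(bearing_deg)
    x_east = px_per_min * math.sin(rad)    # eastward component
    y_south = -px_per_min * math.cos(rad)  # south is positive, so negate north
    return round(x_east, 4), round(y_south, 4)

print(pixel_speeds(24, 69))
```

With the example's 24 MPH at 69°, this lands close to the 0.6264 east / -0.2337 south pixel speeds in the NowCast file above, which is how the assumed scale and convention were checked.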

 

If you haven't followed all the steps above, here are the files that need to be in the GetNexrad folder for GetRealtime to automate Nowcasts:

WindsAloft_HGX_N0Q.txt

NowCast_HGX_N0Q.txt

NowCast_HGX_N0Q_RECENT.txt    <<this just needs a title line

 

The "30R75" Method:

Not long after formation, instead of just "going with the flow", the dynamics affecting most supercell thunderstorms result in a modification of their course to the right of the mean cloud-layer wind. In 1976, R.A. Maddox formulated the 30R75 method as a relatively easy way to calculate the direction of supercell movement. The 30R75 method suggests that a supercell will move 30° to the right of the mean wind direction at 75% of the mean wind speed.

 

The values in 30R75 can be whatever you wish. I recommend trying 10R85 for fronts and squall lines. In fact, better safe than sorry: try a standard 30R75 as standard practice in the Southeast, where supercells and squalls are more common than not.  You don't have to be exact with these speeds and directions. You just don't want a squall line sitting right down your wind direction giving 3 hours of rain when maybe 15 minutes is more like it.  You can edit the WindsAloft.txt from the GetRealtime Station List screen radar Wizard button.
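The 30R75 rule is simple enough to sketch; the only assumption here is that "right of the mean wind" means adding degrees clockwise to the bearing:

```python
# Sketch: apply a Maddox-style "30R75" adjustment: veer the mean wind
# direction 30 degrees to the right and take 75% of the mean speed.
def apply_r_adjust(direction_deg, speed, code="30R75"):
    right, pct = code.upper().split("R")      # "30R75" -> 30 deg, 75 percent
    new_dir = (direction_deg + float(right)) % 360
    new_speed = round(speed * float(pct) / 100.0, 4)
    return new_dir, new_speed

print(apply_r_adjust(240, 40, "30R75"))   # mean wind toward 240 deg at 40 kt
print(apply_r_adjust(240, 40, "10R85"))   # gentler front/squall setting
```

The same parser handles any "nRm" code, which is why the values "can be whatever you wish."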

 

The above image is saved with the 'Winds Aloft Graphic' checkbox. I wanted something for GetGraphs to show the status of the winds for nowcasting. I had forgotten that GetGraphs could just as easily have shown the text file without creating a graphic. Oh well, I learned how it can be done if I ever really need to. Also note the line 'Reusing Windsaloft'. This means the NWS OP40 website returned a column of all 99999's for some reason. This happens from time to time. The Nowcast will then be based on the last speed and direction available.

 

If you want to work on historical radar then download the historical radar images and then use GetRealtime 'Read radar from file' on the 'Scheduled Batch' menu and then uncheck 'Scheduled Batch'. Use GetNexrad.exe to create your storm track info file and be sure the expiration date is after the historical radar period.

 

Like all things about radar, better nowcasting comes through experience with your basin and getting what you can with what the radar gives you on each and every storm. So Nowcast 1, or 2, or 3, and when to 30R75 it... or do a manual tracking... comes with experience. And as luck would have it, any nowcast is better than nearly all NWS forecasts for the next 3 hours, so it's only up from here.  The greatest error in the convective nowcasts seems to be in the later hours due to random chaotic storm growth and decay. The next largest error is probably guessing at the radar rain G/R adjustment, and the least error is usually in storm speed and direction. Now we need to experience some really big storms to see how crazy these methods are when it really counts.  And don't embarrass yourself like I did by using a winds aloft storm direction that lines right up with a squall line instead of quickly crossing it.

 

 

Update 3/4/2013--GetGraphs can now lookup the rainfall frequencies for unit-values, hourlys, and daily's for display. The needed rainfall frequency table 'GetGraphs_Frequency.txt' example is in the current GetRealtime_setup.exe download. The format is easy if you use NOAA Atlas 14:

http://hdsc.nws.noaa.gov/hdsc/pfds/pfds_map_cont.html?bkmrk=al

 

Use the submit button for 'Estimates from the table in csv format:' and open in Excel and save as tab delimited text file as shown here:

 

The GetGraphs_Setup.txt needs to be edited to look like this, with an ending '-1' for just the unit or hourly values and daily, or an ending '-2' for a full frequency curve of the actual maximum depths for the durations:

 

5 ;Frequencies; 10 ; 10 ;-1

 1 ; 10612 ;Rainfall Frequency;Hueytown, AL; 0;1;0; 6335701 ; 9684176 ; 16711680 ; 255 ;;;; 0 ;-2

2 ; 10613 ;Rainfall Frequency;Bham Airport; 0;1; 0; 6335701 ; 9684176 ; 16711680 ; 255 ;;;; 0 ;-1

 3 ;

4 ;

5 ;

6 ;

7 ;

8 ;

9 ;

10 ;

11 ;

12 ;   

 

The graphs below show the difference between a full -2 and a -1.

 

 

 

Rainfall frequencies are usually pretty linear on linear Rainfall Depth vs log Frequency, so the interpolation is a linear lookup of the duration depth and that fraction then of the log(years)... except for depths less than the 1-year depth, which are linear. Frequencies less than 1 year are the rainfall/1YR-Rainfall fraction. Wundergage unit duration values are rounded to 5, 10, 15, 20, 30, 40, 50 and 60 minutes for lookups, so you have to think about your table construction; otherwise 5, 60, and 1440 should do it. And once again, airports are a wild guess. The -2 assumes 5-minute unit values like radar and most Wundergages. If the record time step is erratic or more than 5 minutes, then the values will be divided into 5-minute increments.
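The depth-to-frequency interpolation described above (linear in depth against log years, with sub-1-year frequencies as a plain fraction) might look like this sketch; the table values are hypothetical, not from Atlas 14:

```python
import math

# Sketch: interpolate a recurrence interval in years from a depth,
# linear in depth vs log10(years), per the lookup described above.
# Hypothetical table: (years, depth in inches) for one duration.
TABLE = [(1, 1.0), (2, 1.3), (5, 1.8), (10, 2.2), (25, 2.8), (100, 3.6)]

def depth_to_years(depth):
    if depth <= TABLE[0][1]:                 # below the 1-year depth:
        return depth / TABLE[0][1]           # frequency as a plain fraction
    for (y1, d1), (y2, d2) in zip(TABLE, TABLE[1:]):
        if depth <= d2:
            f = (depth - d1) / (d2 - d1)     # linear in depth...
            logy = math.log10(y1) + f * (math.log10(y2) - math.log10(y1))
            return 10 ** logy                # ...against log10 of years
    return TABLE[-1][0]                      # cap at the table maximum

print(round(depth_to_years(0.5), 2))   # below the 1-year depth
print(round(depth_to_years(2.0), 2))   # between the 5- and 10-year depths
```

A depth halfway between two table depths lands at the geometric mean of the two recurrence intervals, which is what "linear on log Frequency" buys you.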

 

 

Update 6/2/2014--GetGraphs can compute the Pearson Type III recurrence interval, either Log or Normal, for the maximum value on each graph. The recurrence interval in years will be displayed on the second title line as P=X yrs. Don't hold your breath waiting for anything greater than P=1 yrs, as P=2 years is the median or middle value in your annual peak values assuming an annual peak flow series.  Once you get to P=1.1 yrs you have reached the lower 10% of your theoretical annual peaks, which is nothing to sneeze at.  Frequencies less than 1 year are the flow/1YR-flow fraction where the 1YR-Flow is P=.999.

 

You can use my free Excel menu add-in to compute your Pearson mean, std dev, and skew stats and display the probability distribution.  Another source for these Pearson stats is USGS studies published based on area size and even stream width relations.  If you can get the 2-yr (mean), 10-yr, and 100-yr points from USGS regressions, then just use my Excel add-in to trial-and-error the std dev and skew. I'm not sure what to do about this, but urban culverts and flood plain spreading can really negatively skew these distributions so inflow is not outflow (use skew=-.5 in urbans, or zero skew with unlogged stats???)

 

The GetGraphs setup text for Log10 stats at the end could be:

 1 ; 1612 ;Flow;Ensley, AL; 0;1;0; 6335701 ; 9684176 ; 16711680 ; 255 ;;;; 0 ;1;3.392;0.2255;-0.204

where:

 ;1; =Log10 stats

 ;3.392; = mean log10

 ;0.2255; =std deviation log10

 ;-0.204 =skew

 To use unlogged mean and std deviations change the ;1; to ;2;
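As a sketch of what those Log10 Pearson stats imply, the standard frequency-factor (Kite) approximation below turns the example mean, std dev, and skew into a flow for a chosen recurrence interval. This is a textbook approximation, not necessarily how GetGraphs computes P=X yrs:

```python
from statistics import NormalDist

# Sketch: Log-Pearson Type III flow from the example log10 stats above,
# via the standard frequency-factor approximation (k = skew/6).
MEAN, STD, SKEW = 3.392, 0.2255, -0.204

def lp3_flow(return_period_yrs):
    z = NormalDist().inv_cdf(1 - 1 / return_period_yrs)  # standard normal
    k = SKEW / 6.0
    kt = (z + (z * z - 1) * k + (z**3 - 6 * z) * k * k / 3
          - (z * z - 1) * k**3 + z * k**4 + k**5 / 3)
    return 10 ** (MEAN + kt * STD)

print(round(lp3_flow(2)))     # roughly the median annual peak, cfs
print(round(lp3_flow(100)))   # the 100-year peak, cfs
```

Running the stats backwards the same way (from a flow to a probability) is what puts the P=X yrs on the graph title.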

 

To test your stats you can use the GetGraphs menu 'Cumulative' to sum your low flows; if the total is a reasonable annual peak value, the P=X yrs displayed should match your Excel probability graph... but I've seen a zillion years displayed.

 

 

Update 6/9/09—GetRealtime.exe version 1.3.2 has been updated with computation of average area rainfall amounts using NOAA's WSR-88D radar imagery.  These images are provided free on the web and are updated about every 5 minutes.  The NCR files have been relocated. You will need the latest version of GetRealtime.exe 1.3.2 as of 6/9/09.  You may download just the GetRealtime.exe here and replace the old version in your C:\Program Files\GetRealtime directory.  Requires a display with 32-bit colors.

 

 

So what you may say?!!  Well, listen up and learn…

 

Traditional area average rainfall is obtained by rainfall gages located in or near the area of interest and averaging the rain gage amounts using one of many methods.  If you’re lucky you might find a rain gage nearby, or you would have to install and maintain a network of rain gages yourself.   The tools presented here will allow you to create point rainfall rates ANYWHERE in the USA and greatly improve area averaged amounts and best of all IT’S FREE!!!

 

What use can I make of this you may ask?!! 

 

1)—Getrealtime can convert the average area rainfall to runoff and route and combine with other areas to generate a flow record at any point in the USA in real time (uh, maybe).

 

2)—If you’re an agricultural irrigation district and dress like one, or even a small farmer that don't, then you can outline your district boundary and compute the 5-minute, hourly, and daily averaged rainfall on your district and adjust your water order accordingly in real time without the cost of maintaining a network of rain gages.  Combine this with GetRealtime’s Penman-Monteith standard evapotranspiration computation and you are in like Flint.

 

3)—How about Local, County, and Federal fire fighters wanting to maintain a record of recent rainfall of small to large regions for estimating fire hazard potential in real time.

 

4)—Or anyone wanting to maintain a real time rainfall record for anywhere in the USA for whatever reason.

 

To do this you will need 2 new text files for your area of interest, a boundary file and a point file.  The steps for generating these 2 files are outlined here:

 

www.GetMyRealtime.com/GetNexradHelp.aspx

 

You will have to download and install GetNexrad.exe separately from your GetRealtime download.

 

As noted the file naming convention used by GetRealtime is crucial.  These two file names consist of the file type, radar site id, and datatype_site_id.  For example the radar site id is ESX and the DSID is 10215.  The radar site id is displayed on the radar image screen in the center of the radar site navigation arrows.

 

Boundary File Name Example:  NexradBoundaryESX10215.txt

 

Point File Name Example:  NexradPointESX10215.txt

 

These 2 files are placed in your Getrealtime file directory.

 

Once you have generated these two files then you have to add site information to the GetAccess HDB database and to the GetRealtime setup file.

 

The Radar coefficients can now be checked and entered by using the 'Wizards' button on the Select Station List. Just click on your station, if present, and then click 'Wizards'. You will see a complete breakout of all your coefficients for editing.

 

Here is an example of the data required in the GetAccess HDB table Rsite for both runoff (30) and rainfall (10):

 

 

The new crucial information is in fields Parameter_Code and Station_ID.  The Parameter code is the type of radar image being used, either 1-Hour Total code=N1P, Base Reflectivity code=N0R or Composite Reflectivity code=NCR.  NCR seems to be updated more consistently during storm events than N0R but I may be wrong.  1-Hour Total seems preferable anyway.  And N0R is spelled with a zero.  I'm beginning to like the N0R more and more.  I have only been at this for a week.

 

The N1P 1-Hour Total radar images ARE NO LONGER treated differently than N0R and NCR rates. If the GetAccess HDB database has accumulated more than 1 hour of unit values, then the interval rainfall amount is computed as RadarNow-(RadarLag1-UnitValueLag60&Interval); else it is simply RadarNow/(60/interval).  I may be on a fool's errand trying to sample an irregular radar time series at a regular interval with lag.  We shall see.  Note: GetRealtime.exe 1.3.9 updated 12/18/09 now treats the N1P rainfall 1-Hour Total just like the N0R and NCR in/hr. It seems to make more sense, but tends to lag the rainfall over the hour.
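My reading of the quoted N1P formula, sketched below: the difference of two overlapping 1-hour totals misses the unit value that dropped out of the trailing window, so it is added back. The interpretation of the '&' term and all names here are my assumptions, not GetRealtime's source:

```python
# Sketch: interval rainfall from 1-hour-total (N1P) images. The 1-hour
# total at time t minus the total at t-interval understates the new
# interval's rain by the unit value that dropped out of the trailing
# 60-minute window, so that dropped value is added back in.
def n1p_interval(total_now, total_lag1, dropped_unit_value=None,
                 interval_min=5):
    if dropped_unit_value is None:
        # Less than an hour of unit values accumulated yet: spread the
        # hour total evenly over the interval.
        return total_now / (60 / interval_min)
    return total_now - (total_lag1 - dropped_unit_value)

print(round(n1p_interval(0.60, None), 4))         # startup: 0.60 / 12
print(round(n1p_interval(0.60, 0.55, 0.02), 4))   # 0.60 - (0.55 - 0.02)
```

This shows why sampling an irregular radar series with lag is delicate: the bookkeeping depends on knowing exactly which unit value fell out of the window.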

 

Although Storm Total NTP images would avoid potential error in lag of the 1-Hour Total, their resolution seems to be too grossly graduated at the lower values to be much use for farmers and fire hazards in the West and for runoff from small basins, but I may be wrong again. I will add Storm Total soon for those living in swamps and other backwaters so check back often.

 

Update 8/27/2016: The Ridge2 5-minute product PTA storm total has been added to GetRealtime. The Rtable setup for parameter_code is 'PTA-Ridge2'. The DataType_ID is 9 for total rainfall. It is interesting to compare with N0Q hourly totals to see how it does, but lots of early experience with the precip product N1P was a complete dud. PTA uses the new QPE algo so it bears watching. The boundary and point files are the same as N0Q, so copy and rename the -10 to -9 part like this:

NexradBoundaryBGM-9301Q.txt

NEXRADPOINTBGM-9301Q.TXT

 

I have updated GetRealtime to let users put their own dBZ-to-rainfall rates in the setup SHIFT cell like this:

 

NEXRAD-AMX; 10915; Rainfall; Ft Lauderdale N0r; 0;  0.00, 0.00,.02,.04,.09,.21,1,2,4

 (don't forget that 0 goes in the BASE cell and the first dbz=10 to change)

 

Note that not all the upper rates are needed if not revised. I have updated GetNexrad.exe (and GetRealtime.exe) to allow users to change the rainfall rates in the same manner by including the optional file 'Dbz2Rainrate.txt' that has just one line that looks like this:

  0.00, 0.00,.02,.04,.09,.21,1,2,4,7,10,11,12,13,14
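Parsing a Dbz2Rainrate.txt style line into a dBZ-to-rate lookup might look like this sketch; the assumption (from the note above that the first nonzero slot is dbz=10) is that the entries step by 5 dBZ starting at 0:

```python
# Sketch: turn the one-line Dbz2Rainrate.txt list into a dBZ -> in/hr
# lookup, assuming entries step by 5 dBZ starting at 0 dBZ (so the
# third entry is dBZ 10, matching the note above).
LINE = "0.00, 0.00,.02,.04,.09,.21,1,2,4,7,10,11,12,13,14"

rates = [float(v) for v in LINE.split(",")]
dbz_to_rate = {5 * i: r for i, r in enumerate(rates)}

print(dbz_to_rate[10], dbz_to_rate[30], dbz_to_rate[70])
```

Only the low end usually needs revising, which is why trailing entries can be left off if unchanged.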

 

With the new Ridge 2 higher resolution products, look at the conversions in the file 'Level2RGB.txt' for selecting or changing rainfall conversion rates.

  

More help examples are available at http://getmyrealtime.com/RainfallComparisons.aspx and http://getmyrealtime.com/SierraSnowfallComparisons.aspx.

  

 

 

The Station_ID in this example is NEXRAD-ESX which tells Getrealtime to use the ESX site radar for imagery acquisition.

 

You might notice that the new datatype_id that must be used for Runoff computation is 30, and the datatype_id for rainfall is still the old 10.

 

Here is an example of the data required in the Getrealtime setup file GetRealtime_setup.txt for both runoff (30) and rainfall (10):

 

 

Notice first that the Rainfall for a station is retrieved before the runoff can be computed, and so it is placed before the runoff in the setup file.  Again, the Station_ID in our example is NEXRAD-ESX.

 

Or... version 2.0.1 can now use Station_ID= COMPUTE-Unit like this:

COMPUTE-Unit; 30755; Runoff; Prairie Creek, Dallas, Tx; 0; 2,2.5,0.15,0.2,9.03; P1

 

The HDB database table RSITE would have the Parameter_Code= 10753 and Station_Id= COMPUTE.

 

If your site of interest is too near the radar site and causes false background readings then you can put a lower bound on the radar rainfall values being recorded by entering a shift1 and formula like this:

 

 

Images available for the past 4 hours will be retrieved and processed (24 images in clear mode, 48 in rain mode). The easiest batch mode method is to check the checkbox ‘Scheduled Batch’ and GetRealtime will go to sleep between retrievals.  If the batch interval is set to 5 minutes or more, then up to all 4 hours of past radars will be retrieved on the first retrieval, then just the latest images needed.  Radar need not be retrieved when there is no rainfall. Runoff computations will fill in zero rainfall for missing periods of rainfall in the 5-minute record.

 

 

Does This Radar Stuff Really Work???

 

 Let's compare with an actual point rainfall gage. (This better pan out well is all I know)

 

Example of how to create a Boundary File and a Point file for a 1-point Wunderground weather station comparison. As above, create a boundary file for a very small area around the weather station location. Only the column of X,Y pairs matters. From there, knowing the center coordinates, hand edit the XY boundary similar to this (any size is ok):

 

Edit boundary for a better square Area1 X,Y Coordinates (pix)

322, 188

322, 190

324, 190

324, 188

322, 188

Xmin 322, Ymin 188, Xmax 324, Ymax 190

Centroid= 323, 189

 

Now you can use Notepad to create this simple Point file:

 

323,189,323,189

1

 

That's it, just 2 lines. The reason for creating a larger area boundary file is so you can see it on the computer screen and can click on the color within it. The Point file does all the work.
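Writing that small square boundary and 2-line point file by script instead of Notepad could look like this sketch, which just reproduces the hand-edited example above:

```python
# Sketch: build a small closed square boundary and the 2-line point
# file around a known centroid pixel, like the hand-edited example.
def square_files(cx, cy, half=1):
    boundary = [(cx - half, cy - half), (cx - half, cy + half),
                (cx + half, cy + half), (cx + half, cy - half),
                (cx - half, cy - half)]      # 5 points, closed square
    point_file = f"{cx},{cy},{cx},{cy}\n1"   # centroid twice, then count 1
    return boundary, point_file

boundary, point_file = square_files(323, 189)
print(boundary[0], boundary[2])
print(point_file)
```

The square's size is cosmetic (it only has to be clickable on screen); the point file carries the actual sampling location.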

 

The utility program LatLongPixels has been added to more accurately determine the point file value needed given the point's Lat/Long values. Wunderground weather stations all have their Lat/Longs on their station location screen on the right side of their current daily history pages (the pages with the 'comma delimited download' option). You may download this utility here.

 

I am using these 2 file examples for the Miami AMX radar to compare with the Wunderground weather station KFLCORAL5 west of Fort Lauderdale, Florida. Why this particular station? Because it was somewhat near a highway intersection for locating on the radar image and appears to be well maintained. How would I know? Because I can read maps, and also wind speed is a dead giveaway. The least amount of calm is golden. But most of all, you want some rain to be recorded when it is raining, duh. And the last thing, but minor, is time step. I really like 10-minute and 5-minute time steps that appear regular, but it's not critical.  Wunderground stations can be jewels or really bad losers so check them out. I will try to post some data on this Nexrad vs Wunderground example soon... if it works out favorably, otherwise I'm burying it. ;-)

 

For the results of these rainfall comparisons go here.

For an example of how to automate GetRealtime Nexrad radar-runoff and the results of these rainfall-runoff comparisons go here.

 

 

GetRealtime 1.5 updated 1/2/2010 will route and combine runoff and flow hydrograph values in the GetAccess database. To do this start with GetAccess.exe, select DB Tables, Rsite, and add a new line like this:

 

 

The new parameters are Parameter_code and Station_id.

 

Rsite table setup in GetAccessHDB.mdb Access database:

DSID=1xxx

DataType_ID=1

Datatype_name=flow,....

Parameter_code=FlamingoRoutingFile.txt <<< your routing control file name

Station_id=ROUTE

 

Now set up GetRealtime.exe's GetRealtime_setup.txt file like this:

 

Station_ID; DatatypeSiteID; Parameter; Site_Name;

ROUTE; 1214; Flow; Routed Flamingo Wash at Nellis Blvd Flows

 

Some routings with long travel times may need more than 1 day of warmup before writing the values to the HDB. For longer warmup days add the days in base1 setup field like this 7 days:

 

ROUTE; 1214; Flow; Routed Flamingo Wash at Nellis Blvd Flows; 7

 

Place the routing below any of the retrievals or runoff computations in the setup that will be routed.

 

For input time-steps other than 5-minute, Route-xUnit can be used, i.e. Route-10Unit for 10-minute Canadian radar runoff, or Route-xHour or Route-xDay for hourly and daily runoff.

 

Now all we need is to create the routing control file. The control file consists of these commands:

 

GET (gets flow values from GetAccess HDB database)

COMBINE (adds the current working hydrograph to the last combine hydrograph)

SUBTRACT  0 or 1 (subtracts the current two hydrographs, a 1 value reverses order)

ADD (adds a constant value or hourly to the current hydrograph, negative value for loss, see release)

RELEASE (replaces outflow values less than release in modpul outflow rating or ADD)

ROUTE (routes the current hydrograph using Tatum, Modpul, Muskingum, or Muskingum-Cunge)

DIVERSION  (diverts from the current hydrograph by using a diversion rating and by providing a DSID you could later GET this diversion, route, and combine.)

IF (set modpul rating based on dsid or date each day or skip-n lines at start)

PUT (change output dsid)

WRITE (debug routing output file RouteSteps.txt)

END (end and write current hydrograph to HDB database)

 

As an example text file named FlamingoRoutingFile.txt:

 

Your single Title Line Here is a must.

GET 4 30752 Upper sub Prairie Cr (30752 is the DatatypeSite_ID for Sub Prairie Cr)

GET 4 30753 Lower sub Prairie Cr (4 indicates unit values, 3=hour, 2=day, GetRealtime_setup overrides)

COMBINE

ROUTE Tatum 2.5   travel time in hours

ADD -2.5   reduce all the current hydrograph's values by 2.5 cfs

GET 4 1754 Trib sub dallas river

ROUTE Tatum 2.5   ok to route trib before combine but remember to combine

COMBINE

ROUTE Muskingum 2.0 0.25 3   lag in hours=2 and X=.25 in who knows, cascades=3.

ROUTE Muskingum-Cunge MCungeBeaverCr.RAT

GET 4 1758 Little sub trib

ROUTE Tatum 2.5   ok to route trib before combine but remember to combine

COMBINE

DIVERSION 1888 CulvertDiversionAt56thSt.RAT  dsid=1888, use dsid=0 to not save diversion

GET 4 1757 Left fork sub trib

COMBINE

RELEASE 12000   schedule 12000 cfs constant release from Lake Mead modpul rating

ROUTE Modpul 3777 LakeMead.Rat  you can add optional DSID for writing elevations and startup flow will use this elevation so always use for reservoirs.

RELEASE -1404 -1 get release from rhour dsid hourly

ROUTE Modpul Steephill.Rat 3 Steephill.Rat  3  a 2 or more  for cascading Modpul or 0.01 to 1.99 for storage adjust

IF 0 rday PARAM  SteephillOCT.Rat month(D)>9  (no comments allowed)

ROUTE Modpul Steephill.Rat -1  a -1 will use the optimal N cascades from Modpul 4th column

ROUTE Modpul 3777 Steephill.Rat -2 0.75  a -2 will include daily Seldon KS evap*0.75

ROUTE Modpul 3777 Steephill.Rat -3 0.75 a -3 will include daily Seldon KS evap*0.75 AND start on actual starting day, not at beginning of runoff. Use this if you have actual elevation history.

RELEASE -1401 -1 RDAY get release dsid hourly and yesterday mean for projected

ADD -1   remove release value times -1, use +1 to add

PUT -1102 change output dsid, may be needed for overwriting USGS gage

END

(Any notes after the END command will be ignored)

 

Note that after the initial two GET's, thereafter every GET must eventually be followed by a COMBINE. To make your routing easier to follow, try to combine as soon as possible. For instance, you cannot be routing two streams at the same time, so use COMBINE first.  The current or working hydrograph to which ROUTE, ADD, and DIVERSION are applied is determined by the last COMBINE or last GET, whichever came last.
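The bookkeeping above can be pictured as a little stack machine. This is purely illustrative Python, not GetRealtime's actual code: GET pushes a hydrograph, COMBINE sums the top two, and ADD (like ROUTE and DIVERSION) works on whatever is on top.

```python
# Hypothetical sketch of a routing-file interpreter's "working hydrograph".
# Only GET, COMBINE, and ADD are shown; ROUTE/DIVERSION would likewise
# transform stack[-1] in place.
def run_commands(commands, hydrographs):
    stack = []
    for cmd in commands:
        op, *args = cmd.split()
        if op == "GET":
            # args[1] is the DatatypeSite_ID; look up that flow series
            stack.append(list(hydrographs[args[1]]))
        elif op == "COMBINE":
            b, a = stack.pop(), stack.pop()
            stack.append([x + y for x, y in zip(a, b)])
        elif op == "ADD":
            c = float(args[0])
            stack[-1] = [q + c for q in stack[-1]]
    return stack[-1]  # the working hydrograph at END

# made-up flows for two subbasins
flows = {"30752": [10, 20, 30], "30753": [5, 5, 5]}
out = run_commands(["GET 4 30752", "GET 4 30753", "COMBINE", "ADD -2.5"], flows)
# out -> [12.5, 22.5, 32.5]
```

Note that an ADD before a COMBINE would only touch the last GET's hydrograph, which is exactly why the text says to combine as soon as possible.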

 

Also note that while GetRealtime is running in Batch Mode, unlike the GetRealtime_setup.txt file values, any changes made to the routing files will take effect on the next run. So using the Routing Wizard you could change Releases, Adds, and such in the routing files without restarting GetRealtime.

 

Update 6/21/2016: If you have a reservoir in your basin, like a USBR Pacific Northwest dam with near real-time lake levels and river and canal release info, then you can use a MODPUL routing without resorting to shelling a reservoir model like Hec-RESSIM. This allows forecasting the next 7 days of river/spillway releases with the NWS forecast for runoff with just a simple Modpul.  Also note a reservoir rule curve is just another Modpul rating and can be used per date with an IF check.

 

EXAMPLE Modpul Routing File:

 

Route Subs 8,9,10 runoff thru Emigrant Dam

GET 4 -1108 Subs 8-9-10 inflow

RELEASE -25401 -1 RDAY get each hour but yesterday mean for projected

ROUTE Modpul 3401 EmigrantDam.RAT overwrite Hydromet elevs in 3401

RELEASE 25401 -1 use Hydromet canal hourly and last hourly for projected

ADD -1 use release *-1 to remove canal for river release

END

 

The hourly 'RELEASE' can be used by both 'MODPUL' and 'ADD'. You could also hand-enter hourly values in the rhour table to schedule RELEASE values manually. When you write '3401' Modpul elevations, GetRealtime knows to get the starting elevation from the database.  Note that small reservoirs without release info can use MODPUL just as well without RELEASE, given a well-thought-out Modpul rating, and that is how it is normally used.

 

EXAMPLE Lagged Release to use last week's hourly or daily values for projections:

 

Subs 8+9+10 as combined inflow into Emigrant Dam

WRITE

GET 4 -30109 Sub 9

RELEASE 1403 -1 RHOUR-7 Green Springs Powerplant bottom Sub 9

ADD +1 add

RELEASE 1405 -1 RDAY-7 West Side Canal

ADD -1 remove

GET 4 -30108 Sub 8

COMBINE

GET 4 -30110 Sub 10

COMBINE

END

  

The 'IF' can be used with 'MODPUL' to set the rating file name each day like this:

 

Route Subs 8,9,10 thru Emigrant Dam

GET 4 -1108 Subs 8-9-10

RELEASE -25401 -1 use Hydromet hourly value for today and projections

IF 0 rday PARAM EmigrantDamJAN.RAT month(D)=1

IF 0 rday PARAM EmigrantDamFEB.RAT month(D)=2

IF 0 rday PARAM EmigrantDamOCT.RAT month(D)>9 and month(D)<=12

ROUTE Modpul 3401 EmigrantDam.RAT use default Rat if not triggered above

IF -30103 rday SKIP-1 flow P1>10

ADD 5

END

  

All routings start a day early for warm-up before writing the requested start-date values, so be aware of that.

  

Update 7/7/2014: Muskingum routings can use cascades by just adding the number of cascades as shown above.  The wedge storage X factor can vary from 0.5 for no storage down to negative values. Start with X=0.2, and if cascading try some negative X values.  Cascades have the effect of non-linear Muskingum coefficients, so they say, and are much easier to calibrate.  My free ModPulXsec.exe on my More Stuff page can compute the Muskingum-Cunge K & X and number of cascades using the SCS TR-20 methods for constant parameters based on peak flow.  The Muskingum-Cunge rating file produced by my ModPulXsec.exe shows the format for using a constant-parameter Muskingum-Cunge routing based on the peak flow lookup.

 

GetRealtime 2.4.0 Modpul routing can now perform a cascading subdivision of your rating by dividing your rating storage values by the number of sub-divisions and then loop the routings. This has the effect of increasing peak outflow as N increases. To perform a Modpul sub-division, just enter a 2 or greater value after the rating filename as shown above. For more background on cascading Modpul and other routings read Heatherman's pdf info here.  (Another tip of the hat to David in Alabama.)  Also a Modpul cascading subdivision value entered between 0.01 to 1.99 will be used not as a subdivision but as a modpul storage adjustment factor. Example, if 0.6 is entered then the modpul rating storage values will be multiplied by 0.6. This makes calibrating routings much easier and may be just what you needed all along instead of cascading for a larger peak. I wonder if they ever thought of that?
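A sketch of how that trailing number could be interpreted, per my reading of the two cases above (illustrative Python, not GetRealtime's actual code):

```python
def modpul_storage_transform(storages, n):
    """Hypothetical reading of the value after the Modpul rating filename:
    n >= 2      -> cascading subdivision: divide each rating storage by n
                   and loop the routing n times
    0.01..1.99  -> storage adjustment: multiply each rating storage by n,
                   route once (easier calibration knob)
    Returns (adjusted storages, number of routing passes)."""
    if n >= 2:
        return [s / n for s in storages], int(n)
    return [s * n for s in storages], 1

# subdivision: 'ROUTE Modpul Steephill.Rat 4' style
subs, passes = modpul_storage_transform([100, 200], 4)    # ([25, 50], 4)
# storage adjust: 'ROUTE Modpul Steephill.Rat 0.6' style
adj, passes2 = modpul_storage_transform([100, 200], 0.6)  # ([60, 120], 1)
```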

 

To calculate the number of cascading reservoirs to match a full unsteady flow solution, the pdf author says:

  N=(W*L/K)*Z/Q =number of cascading reservoirs

where:

W=cross-section top width, ft

L=entire reach length,ft

K=travel time as dVol / dQ, seconds

Z=fall over the reach, ft

Q=flow, cfs

Use Hec-Ras or my free ModPulXsec.exe to compute these factors at a typical cross-section.

Ideally you would select a flow at 2/3 the peak inflow hydrograph, but I guess a Modpul-Heatherman could make this adjustment on the fly for each Modpul file line: elev, flow, storage, N.
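As a sanity check of the formula, here is the same computation in Python (the sample channel numbers are made up for illustration):

```python
def optimal_cascades(W, L, K, Z, Q):
    """N = (W*L/K) * Z/Q, Heatherman's estimate of the number of cascading
    reservoirs needed to match a full unsteady-flow solution.
    W: cross-section top width (ft), L: reach length (ft),
    K: travel time as dVol/dQ (seconds), Z: fall over the reach (ft),
    Q: flow (cfs)."""
    return (W * L / K) * Z / Q

# e.g. a 100 ft wide, 10,000 ft reach, K=3600 s, 20 ft of fall, at 2000 cfs:
n = optimal_cascades(W=100, L=10000, K=3600, Z=20, Q=2000)
# n -> about 2.8, so round to 3 cascades
```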

 

Update Nov 18, 2013--I have added the optimal N number of cascading reservoirs to my free ModPulXsec.exe (see More Stuff page). Now if you add a -1 for the number of cascading reservoirs, Modpul will take 2/3 of the inflow peak and look up the optimal number N of reservoirs:

 

ROUTE Modpul Steephill.Rat  -1  a -1 will use the optimal N cascades from Modpul 4th column

ROUTE Modpul Steephill.Rat  -1 0.5  with a 0.5 peak lookup adjustment

If the 0.5 is included, then instead of the 2/3-peak-flow lookup a 0.5-peak-flow lookup will be used for the optimized N.

 

Muskingum K=3.0, X=0.3, cascades=12:

Muskingum with cascades can be nearly as good as Modpul, but you will still need to figure out the number of optimized cascades. I just used Modpul's optimized number from its average cross-section computations. The advantage of Modpul, then, is that you do not need to know the K travel time or X storage factor, but if you have an outflow hydrograph these are easily fitted. Or even if not, my free ModPulXsec.exe computations can give the constant-parameter Muskingum-Cunge estimates. Better still, estimate a cross-section and use Hec-Ras unsteady-flow routing with interpolated x-secs to see where your Modpul or Muskingum should be heading, and if that won't work then automate Hec-Ras unsteady flow to do your routings for these difficult reaches.
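GetRealtime's Muskingum code isn't shown here, so this is a sketch under two assumptions: the standard Muskingum coefficient form, and that the total lag K is split evenly across the cascades (my guess at the convention behind 'ROUTE Muskingum 2.0 0.25 3'):

```python
def muskingum(inflow, K, X, dt):
    """Classic Muskingum routing: O2 = C0*I2 + C1*I1 + C2*O1.
    For positive coefficients dt should fall between 2*K*X and 2*K*(1-X)."""
    D = 2 * K * (1 - X) + dt
    c0 = (dt - 2 * K * X) / D
    c1 = (dt + 2 * K * X) / D
    c2 = (2 * K * (1 - X) - dt) / D
    out = [inflow[0]]
    for i in range(1, len(inflow)):
        out.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * out[-1])
    return out

def muskingum_cascade(inflow, K, X, dt, n):
    """Route through n identical sub-reaches in series, each carrying K/n of
    the total lag; cascading mimics non-linear Muskingum coefficients."""
    q = inflow
    for _ in range(n):
        q = muskingum(q, K / n, X, dt)
    return q
```

A steady inflow passes through unchanged (the coefficients sum to 1), while a pulse attenuates more with each added cascade, which is the behavior the calibration notes above rely on.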

 

Below are comparisons with Muskingum-Cunge calculated K, X, and cascades.

I have found HEC-Ras unsteady cannot route a mountainous 55 ft/mile slope channel but both ModPul and Muskingum seem to do fine.

 

  

The 'RELEASE' command will replace the next Modpul's outflow rating values above 0 (a 0 value is needed) with the Release value wherever they are less than the release. This will produce a constant release until storage is above this value or empty.  The Release command allows an easier change in schedule than redoing the whole Modpul rating.  The 'Release' command can also precede an 'ADD' command as shown above.  The Routing Wizard allows easy access to the routing files for easier changes like this.
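In Python terms, my reading of what RELEASE does to the next Modpul's outflow column (illustrative only, not GetRealtime's actual code):

```python
def apply_release(outflows, release):
    """Hypothetical sketch: every rating outflow above 0 but below the
    scheduled release is raised to the release value, which yields a constant
    release until storage supports a larger outflow (or the pool is empty)."""
    return [release if 0 < q < release else q for q in outflows]

# 'RELEASE 12000' against a made-up outflow column:
apply_release([0, 50, 500, 5000, 20000], 12000)
# -> [0, 12000, 12000, 12000, 20000]
```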

 

If your basin has more than one main channel, then you will need to write a separate control file for it and save each channel in the HDB database before combining with the last control file. Remember to try using my ChannelStorage.exe (trapezoidal) or ModPulXsec.exe (x-section) available here to create the Modified Puls channel rating or velocities for Tatum and Muskingum peak travel times.

 

ModPuls Rating example:

 

Diversion Rating example of 2 columns for Qin and Qout where Qin is flow in the channel at the diversion point and Qout is the diversion:

The diversion record can be saved with a DSID and then returned using a MODPUL rating or some math function based on other parameters. I'm not sure how you would do this, so you might think about it.

 

Update 1/18/2014--To change a 15-minute USGS flow time step to 5-minute steps add ', 5-min' on to the USGS station number on the GetRealtime_setup.txt:

 

094196781, 5-min; 1211; Flow; Flamingo Wash at Nellis Blvd

 

This allows adding a forecast to the USGS gage, which the USGS gage can overwrite as the observed data becomes available. Why?... because now you can route and combine gaged flows with other sites' 5-minute time steps.

 

Update 12/18/2013: Runoff computations done with datatype_id=30, when run by the number of past days, now check for a recession runoff on or after the start of computations; if one is found, the starting day is moved to the day the runoff started plus 1 day. This ensures you are not cutting off a previous recession. The start and end of runoff are now stored in the table 'rupdate'. The earliest starting day for the run is then used for all subsequent routings, including the HEC-Ras and HEC-Hms shell routings. Runoff computed by historical dates does not store anything in the table 'rupdate' and does not self-adjust the starting date.

  

 

GetRealtime 1.5.1, updated 3/18/2010, now supports NEXRAD Radar images for NVL (vertically integrated liquid), NET (echo tops), N0S (storm relative velocity), and N0V (base velocity), as well as the N0R, NCR, and NTP rainfall conversions for the scientifically minded. Although a screen-area-averaged NVL or NET makes little sense, the maximum value on the radar screen can be used as an early warning for hail storms and other exciting things. To determine the maximum value on the Nexrad radar image, a new Datatype_ID of 32 has been added to indicate a radar image maximum is wanted. See the list of Datatype_ID's above. Remember valid SDID integers are -32,768 to 32,767, so your Site_ID is limited to 1 to 767 instead of 1 to 999.

 

Use GetAccess to set the HDB rsite table as follows:

Now add the site to GetRealtime_setup.txt using either Notepad or GetRealtime.exe itself like this:

 

Now we need a Boundary file to determine what points will be checked on the radar image. The boundary needs to cover the 143 mile radius around the radar site and so using Notepad the Boundary file will be a rectangle like this named NexradBoundaryAMX32001.txt:

 

Miami AMX Full Radar Area

10,15

10,534

589,534

589,15

10,15

Xmin 10, Ymin 15, Xmax 589, Ymax 534

Centroid= 299,274

  

Note that Ymin and Ymax can be shrunk at higher latitudes, and you can trim some fat off the rectangle by adding 0.292 and 0.708 of the height and width for an octagon like this:

 

Miami Full Radar Octagonal Area

10,167

10,383

179,534

420,534

589,383

589,167

420,15

179,15

10,167

Xmin 10, Ymin 15, Xmax 589, Ymax 534

Centroid= 299,274
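The 0.292/0.708 corner trim can be generated rather than hand-typed. This sketch (my own helper, not part of GetNexrad) reproduces the Miami octagon vertices above to within a pixel of rounding:

```python
def octagon(xmin, ymin, xmax, ymax):
    """Clip each corner of the bounding rectangle at 0.292 and 0.708 of the
    width and height (the fractions given in the text) to get an octagonal
    boundary. Returns a closed ring of (x, y) pixel points."""
    w, h = xmax - xmin, ymax - ymin
    x1, x2 = round(xmin + 0.292 * w), round(xmin + 0.708 * w)
    y1, y2 = round(ymin + 0.292 * h), round(ymin + 0.708 * h)
    # closed ring in the same order as the Miami example
    return [(xmin, y1), (xmin, y2), (x1, ymax), (x2, ymax),
            (xmax, y2), (xmax, y1), (x2, ymin), (x1, ymin), (xmin, y1)]

# Miami AMX: octagon(10, 15, 589, 534)
```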

  

Lastly, all that is needed is the Point file, which is created using GetNexrad.exe. Fire up GetNexrad.exe, select 'Create Point File', and select the Boundary file name located in the GetRealtime folder above like this:

Select 'Create File' and answer 'NO' when prompted 'Do you want to create a single point file?'.

  

The 580 x 450 boundary will contain 261,000 points in the point file. Luckily the pixel size for the radar NET and NVL data points is about 5 pixels by 5 pixels for the 2.5 x 2.5 mile coverage so only every 4th pixel needs to be checked reducing the actual work to 16,312 pixels... child's play. On the other hand if you want the maximum for a NCR image all 261,000 pixels will be checked... ouch!

  

The maximum N0S and N0V velocities are checked every other pixel, or about 1 mile apart. The storm relative velocity maximum difference is calculated as the difference between non-zero values of opposite signs (-inbound and +outbound velocities) and can exceed the scale maximums of 50 knots and 99 knots. Nearly all N0S images' plain maximum velocity is near the maximum value of 50 knots and so is of little use, hence the difference calculation. Green next to red means you're dead... or it's birds and bats and business as usual... boy, those guys at headquarters keep us hoppin'.
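A sketch of the difference calculation (a hypothetical helper, assuming velocities in knots with negative meaning inbound and positive meaning outbound):

```python
def storm_relative_max_diff(velocities):
    """Maximum inbound/outbound spread: the difference between the most
    positive (outbound) and most negative (inbound) non-zero velocities,
    which can exceed the scale cap of either sign alone."""
    inbound = [v for v in velocities if v < 0]
    outbound = [v for v in velocities if v > 0]
    if not inbound or not outbound:
        return 0  # no opposite-sign pair, nothing rotating
    return max(outbound) - min(inbound)

# -45 kt inbound next to +38 kt outbound -> 83 kt spread
storm_relative_max_diff([-45, 10, 38, 0])  # -> 83
```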

 

We are all set to automate GetRealtime in batch mode as discussed above. You may wish to save radar gif images with maximum values above a certain value by checking the Save Radar box and be alerted by a beep by checking the Beep box. I normally run GetRealtime in batch mode like this:

  

The 'Delete Temporary Files' box, when checked, will delete the temporary internet files and history that Internet Explorer saves in its temporary files location, BUT NOT any gif image files you wanted saved in the normal download location. Normally it is best to check this box.

 

If a Gif file download times out, GetRealtime will try to download the gif again before moving on.

 

  

Below is an Echo Tops NET radar image as shown by GetNexrad and the octagonal area maximum value 15 is shown in the window caption as well as the point I clicked on:

 

Here is an N0S max velocity diff example:

 

 

Radar gif images may be saved to file when values exceed the value entered in the 'Base' field as shown below, which will override the value entered on the GetRealtime main screen when 'Save Radar' is checked. If the base value is below 0.01, it will be used to remove rainfall values that are caused by ground clutter as described above. Ground clutter is removed on the NCQ and A2M radar images for you.

 

If you would like to save all the 32 datatype radar images at a site based on the first of a series being saved then enter '0' as 'base1' for the subsequent like this:

 

Note: NET and NVL have been discontinued.... "As part of troubleshooting some RIDGE delivery problems, we discontinued these two products as part of troubleshooting. The products will restart when RIDGE2, version2 is deployed. This is not likely until after the start of 2012." ~NWS ROC~ ...but are currently available on the Ridge2 Testbed below.

 

 

RIDGE 2 TESTBED

 

GetRealtime.exe has been updated to read the new 1000x1000 pixel PNG files available on the Ridge 2 Testbed. The new higher resolution DBZ product N0Q has 0.5 dbz resolution compared to the old 5 dbz resolution and should be your base reflectivity product of choice.

 

New GetNexrad 3.6.0--Iowa State University Mesonet historical images can now be directly loaded for a loop of up to the past 24 hours for Ridge2.  Historical radar images beginning Jan 1, 2012 can be downloaded for 24 hours, 1 day at a time.  Images download at about 6 hours per minute.

  

Update 12/9/2014: I added a 'Max radar hours' to the GetRealtime.exe batch menu. If you haven't run the radar in a few days it will get up to the past X hours of radar. So if you run your radar once a day and want ALL THE RADAR since yesterday, set the 'Max radar hours' to 48. It will get all the radars since the last time it ran yesterday. I wouldn't recommend getting all the radars but you could. And you don't have to actually run in Batch mode. Just check the Batch box, set the max hours, then UNcheck the Batch Box and run. And apparently Iowa now allows at least 48 hours per download. I just tried it. So don't think you have to run GetRealtime exactly at the same time each day. Even if Iowa goes back to a 24 hour limit, then run it twice a day. But I wouldn't. This will come in handy if you know it started raining some time last night. You could use History but History doesn't set the last radar retrieved time.

  

  

Your old boundary files can be upgraded to the new Ridge2 world files using GetNexrad.exe; it is as simple as choosing the short-range or long-range N0Q product.  Although as of 8/8/2012 GetNexrad no longer needs to convert boundaries between Ridge 1, Ridge 2, and KMZ, GetRealtime must use Ridge-version-specific boundaries. All GIF images are Ridge 1 and all PNG images are Ridge 2. The conversion here is one way only, from Ridge 1 to Ridge 2.

 

With the new N0Q 0.5 dbz resolution, the new file "Level2RGB.txt" allows on the fly choice of rain rate type conversion such as Convective 0, CoolWest 1, CoolEast 2, Tropical 3 and Semi-Tropical 4. To tell GetRealtime which rain rate conversion type to use simply put a 0, 1, 2, 3, or 4 in the GetRealtime setup file Base1 setup cell or leave blank for Convective.

 

In the GetAccess Database the new Parameter Code is the product code with "-Ridge2" attached like N0Q-Ridge2.

 

Remember Ridge2 is a Testbed and may change so if problems occur check back for updates to GetRealtime.exe.

 

For much more info on the Ridge2 products, see the GetNexrad Help web page.

 

 

Shifting Retrieved Times to Your Time Zone

  

Update 10/19/2011 allows GetRealtime retrievals to be adjusted for any time value or time zone. The GetAccess 'rsite' table now has a field 'time_shift' in which you can enter the hour shift. To adjust a Wunderground station in Florida (ET) to your home in California (PT), you could enter -3 for the 3-hour time zone difference in table rsite field 11 ('time_shift'). This makes comparing Nexrad radar stored in your time zone easier.  If you add this field 11 to your database table 'rsite', it is a double-precision floating point.

  

One caveat to this concerns computations of Solar and ET when you are using COMPUTE instead of the Station_ID. You then need to use your standard time meridian, which is 15 degrees per hour. Central to my Pacific time would change solar noon from -90 to -120 for the 2-hour difference, so you have to fix your GetRealtime_setup if you use the COMPUTE method.
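The meridian arithmetic is just 15 degrees of longitude per hour of UTC offset:

```python
def standard_meridian(utc_offset_hours):
    """Standard time meridian in degrees longitude for a UTC offset.
    Earth turns 360 degrees in 24 hours = 15 degrees per hour."""
    return 15 * utc_offset_hours

# Central Standard Time (UTC-6) -> -90
# Pacific Standard Time (UTC-8) -> -120
```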

 

Daylight Saving Time changes are ignored. Spring forward will leave an empty hour between 1 am and 2 am, and fall back will overwrite the 1 am to 2 am unit values.  Hey, you voted for these knuckleheads.

 

  

Snowpack and Snow Melt

 

Update Dec 2, 2020: I have added downloading modeled snow data from NOAA's National Operational Hydrologic Remote Sensing Center's Interactive Snow Information (see below):

 https://www.nohrsc.noaa.gov/interactive/html/map.html

 

I've also started a comparison of GetRealtime's snowmelt with adjusted radar against the NOAA Snowcast modeled data here: 

http://getmyrealtime.com/SnowmeltComparisons.aspx

 

(Update 12/3/2011)

GetRealtime.exe 2.2.1 has been updated to compute a continuous year-round simulation of point snowpack accumulation and melt. The melt (output as a rainfall parameter) can be input to GetRealtime's rainfall-runoff for a real-time continuous simulation throughout the year.  And yes, the snowmelt calculation will give the correct precip even in July... in Death Valley.

 

GetRealtime has adapted the methods of the physically based point snow surface hourly model ESCIMO (Energy balance Snow Cover Integrated MOdel) (Strasser et al., 2002) for hourly, 5-minute, or daily time steps for simulation of the energy balance, the water equivalent and melt rates of a snow cover. ESCIMO uses a 1-D, one-layer process model which assumes the snow cover to be a single and homogeneous pack, and which solves the energy and mass balance equations for the snow surface by applying simple parameterizations of the relevant processes... like I would know.... it's just science.

 

Update 9/20/2012--The Snowmelt coefficients can now be checked and entered by using the 'Wizards' button on the Select Station List. Just click on your station, if present, and then click 'Wizards'. You will see a complete breakout of all your coefficients for editing.

  

Example GetAccess 'rsite' table for snowpack:

 

The station_id SNOWMELT parameter_codes order must be strictly followed, but the site_id part of these datatype_site_id's is up to you. I would assume you would want to use a Wunderground gage for the first five inputs but compute the output for another site_id, meaning the last four datatype_id's 10, 23, 25, and 31 can belong to your real point of interest. This example computes the snowpack for the Wundergage site but with Nexrad Radar precip, so it has a different input precip site_id.

 

In the station_id SNOWMELT, the output precip datatype_id code is 10 for rainfall (which I named rain/melt here), and it has the parameter_codes:

 

17411,18411,28411,29411,-29411,10400,-23411,-25411,-31411

17411 = Temperature (17)

18411 = Humidity (18)

28411 = Wind Speed (28)

29411 = Solar Radiation (29)

-29411 = Longwave Radiation (-29)

10400 = Precip Input (10)  (the output Rain/Melt is this whole rsite record)

-23411 = Output Snow Water Content (-23)

-25411 = Output Snow Age (-25)

-31411 = Output Albedo (-31)

 

Albedo output has been added because the starting value really cannot be calculated from just Snow Age, because of its variable recession-coefficient history. And as it turns out, it seems pretty important, so it needs looking after.

  

 Example GetRealtime_setup.txt: 

Station_ID = Snowmelt-hour or Snowmelt-unit (yes, 5-minute snowmelt!)

 

The shift1 cell is used if you would like to adjust the 6 input parameters above; the 7th adjustment is for the Albedo decay rate (<1 to slow down ripening, but this decay factor * (-0.12 or -0.05) only applies if snow age is greater than 7 days). You may leave shift1 blank or just fill in up through your parameter of interest. For example, to add 3.5 degrees F to the Wunderground input only, for a different elevation of interest, only the first +3.5 is needed. I'm really not sure how one would adjust, say, solar radiation for north or south exposure, but you can change the multiplicative factor 1 here following the strict parameter order above, like +3.5, 1, 1, 0.8, 1, 2.1, 1, 35.6

 

The ending 35.6 is the rain/snow temperature phase in degrees F, which corresponds to 2 degrees Kelvin above freezing.  When kicking off a season with warm soil you may want to lower this to 32 F or even lower to get runoff.   Air temperature determines whether the precip is snow or rain and is also used in the heat transfer calculations, so it is probably a good candidate for first adjustments... then probably longwave radiation. The albedo can be adjusted after 7 days for swiftness of the spring melt on larger snowpacks, or to slow down long winter dry spells.  When there is snow on the ground, runoff losses are reduced 15% to simulate frozen soil.

  

On 2nd thought, if you change any of the first 3 (Temp, Humidity, Wind) for use at your real site of interest, then you will need to recompute Solar and Longwave Radiation, which can be done, but I need to make a Temperature change as painless as possible. Based on my regression formula at this elevation, a 3.5 F Temperature change would cause a 3.4% change in Longwave Radiation over the year, or about 1% for each degree F.  This probably means that a 1000-ft elevation lapse rate of -3.5 F should be nothing to worry about for radiation. I have already found the Longwave Radiation factor needs to be less than 0.5 in the Sierras to keep the current snowpack from completely melting.  (Not to worry, see graph below.)

 

Update:  I suggest Shift1 as:   0, 1, 1.15, 1, 1, 1, 1, 34

Based on NWS Snowcast info and modeled runoff at several sites around the US. The 1.15 windspeed adjust is an assumed 12' height gage corrected to 30'. The melt calculations use 1/10 this speed. Wind is important because of the large sensible heat flux it figures into.

 

Changing the input Precip is OK, i.e., 2.0 for 2.0 times the N0Q radar precip at your site or drainage area of interest.  And if you add a '0' after the rain/snow 35.6 factor, the computation steps will be printed to the file 'SNOWMELT.txt', which you can open with Excel to read.  After the novelty wears off, remember to delete the 0.

 

Input Example GetRealtime_setup.txt:

KTRK; 17411; Temperature; Truckee, Ca Airport

KTRK; 18411; Humidity; Truckee, Ca Airport

KTRK; 28411; Wind Speed; Truckee, Ca Airport

KTRK; 29411; Solar Radiation; Truckee, Ca Airport; 0; 39.320,-120.140,-120,5900; P3 (*computed*)

KTRK; -29411; Long Radiation; Truckee, Ca Airport; 0; -4.55 (*computed -4.55F temp adj optional*)

KTRK; 10411; Precip; Truckee, Ca Airport

(Output)

SNOWMELT-hour; -10411,-23411; Precip,swc; Truckee, Ca Airport; 0; +0,1,1,1,1,1,1,33;

(note snow age datatype_id 25 and albedo 31 are also input/outputs, as is snow water content 23 for start-up.  Update 12/21/2012: If you run out of datatype_id 10 for precip/melt, you can now use datatype_id 11 interchangeably with 10.)

 

When using snowmelt for GetRealtime runoff computation, the presence of snow will be flagged (source 112 instead of 12) and runoff computation will reduce melt infiltration by 30% (0.7 Snowmelt Loss Factor) for frozen soil. Also, ET will be set to zero. For your spring melt runoff computation you could also MANUALLY set your 'Groundwater Loss Factor' to zero and its 'Groundwater Recovery Adj' to 3 or so. Remember to reset these later to summer conditions.

 

One thing to note is that unlike I would have thought, it is not sunshine that melts shiny new snow so much as the longwave infrared temperature of the clouds and humidity and what ever else is up there... those heat  fluxes... beyond the horizon of perception.  Ok, that is a lot of scientific propaganda, everyone knows snow melts in the day. ;-)

 

Some tips on longwave radiation: The temperature lapse rate is 3.5 F per 1000 feet. You could, for example, compute each 1000-ft snowmelt band's longwave radiation with a temperature adjust as shown above. Or you can compute the longwave radiation at the weather gage and use a snowmelt longwave radiation adjustment factor of 3% per 3.5 F or 1000 feet. The 3% comes from GetRealtime's regression on the ESCIMO longwave radiation data using temperature, humidity, and wind speed, and it has a standard error of 24 watts/m^2. You could use a setup to compute your own longwave radiation like this paper's: "Parameterization of incoming longwave radiation in high-mountain environments": L = 5.67E-8*(T-21)^4 + 0.84*H - 57, where T = Kelvin and H = % relative humidity. On my ESCIMO radiation gage winter record this has a standard error of 31 watts/m^2, and I had to change the constant -57 to -20. I then get similar results with either method. The NWS 7-day forecast now includes 'cloud-amount' data_type 26, which should be added to the parameter codes (see above computations).
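The quoted parameterization is easy to test against your own gage record. Here it is in Python, with the constant left adjustable since the text notes changing -57 to -20 fit one gage record better:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def longwave_in(T_kelvin, rh_percent, const=-57):
    """Incoming longwave radiation (W/m^2) per the quoted formula:
    L = 5.67e-8*(T-21)^4 + 0.84*H + const
    where T is air temperature in Kelvin and H is % relative humidity."""
    return SIGMA * (T_kelvin - 21) ** 4 + 0.84 * rh_percent + const

# at freezing (273.16 K) and 70% RH this gives roughly 230 W/m^2,
# a plausible cloudy-winter value
```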

  

If you have used GetNexrad to enter multiple 6 or 24 hour QPF forecasts then yesterday through today's rtable weather values will be used as needed for snowmelt.

 

Update 12-23-2013: Better than using GetNexrad for snowmelt, GetRealtime will now retrieve the NWS forecast for precip, temp, humidity, windspeed and cloud cover for the next 7 days for a lat,long. Because these values are not being retrieved with the Wundergage site code you will have to change the GetAccess table 'rsite' to use DSID's for computations of solar, long, and ET. Just replace the Wunder Codes with the DSID's in the proper order. Also, like runoff computations, snowmelt will then run up to the 7 additional future days when run by number of days. This is really slick, automated forecast of snowpack and melt for 7 days! Here is a GetRealtime_Setup.txt example (The -2 is my time shift from Central to Pacific):

If you are wondering why the redundant time shift in the setup file: you are right that temp, humidity, and windspeed already have the time shift in table 'rsite', but note my precip is a computed gage-adjusted radar, and computed values will not (or better not) have a time shift in table 'rsite'.  Also see the 'Shifting Time Zones' caveat just above. The Solar and ET setup now needs to use your standard time meridian.

  

Using Unit Values:

SNOWMELT-unit

 

When using Nexrad Radar 5-minute values only the period for actual precipitation is needed. You do not need to keep the radar record going all the time. When the Nexrad 5-minute values are present the weather gage input Temp, Humidity, Wind, Solar, and Longwave are used at 5 minute time steps and output will be 5 minute time steps. When Nexrad 5-minute values are missing, zero precip is assumed and the output unit value time step will be the erratic time steps of the Wunderground gage inputs. Add a zero to the 8th shift1 cell factors as described above to see the full computations output for how this works.

 

Using Day Values:

SNOWMELT-day

 

Dailies seem to work also. You may want to investigate any factor-adjustment changes that would be needed, if any.

  

Using Elevation Bands:

In mountainous basins you can use Google Earth to see what fraction of your subbasin is in elevation zones and compute each zone separately and sum like this (notice ONLY the elevation band AREA is changed in the runoff coefficients):

 

******; *******; *******; ********Elev Bands*************

SNOWMELT-unit; 11492,-23492; Melt, Swc; 4000FT Burro Cr Sub 2 Snow Melt & Pack, AZ; 0; 3.5, 1, 0.75, 1, 0.9, 1, 1, 34

SNOWMELT-unit; 11592,-23592; Melt, Swc; 5000FT Burro Cr Sub 2 Snow Melt & Pack, AZ; 0; 0, 1, 0.75, 1, 0.9, 1, 1, 34

SNOWMELT-unit; 11692,-23692; Melt, Swc; 6000FT Burro Cr Sub 2 Snow Melt & Pack, AZ; 0; -3.5, 1, 0.75, 1, 0.9, 1, 1, 34

******; *******; *******; *********Runoff Bands************

COMPUTE-unit; -30492; Runoff; 4000FT Burro Creek Sub2 Esx; 0; 3.18, 50, 0.1, 0.01, 39.672, 1, 0.1, 0.3, 3, 0.3, 6, 85, 50, 0.99, 0.7, 5, 0.7, 0.1, 0.05, 1.00, 0, 0.1, 1, 0.7, 0, LagV; P1; 0; Triangle Unit Graph

COMPUTE-unit; -30592; Runoff; 5000FT Burro Creek Sub2 Esx; 0; 3.18, 50, 0.1, 0.01, 98.154, 1, 0.1, 0.3, 3, 0.3, 6, 85, 50, 0.99, 0.7, 5, 0.7, 0.1, 0.05, 1.00, 0, 0.1, 1, 0.7, 0, LagV; P1; 0; Triangle Unit Graph

COMPUTE-unit; -30692; Runoff; 6000FT Burro Creek Sub2 Esx; 0; 3.18, 50, 0.1, 0.01, 33.174, 1, 0.1, 0.3, 3, 0.3, 6, 85, 50, 0.99, 0.7, 5, 0.7, 0.1, 0.05, 1.00, 0, 0.1, 1, 0.7, 0, LagV; P1; 0; Triangle Unit Graph

COMPUTE-unit; -30362; Runoff; Burro Creek Sub2 Esx; 0; 0; P1+P2+P3

 

Parameter values and constants used (updated 12-3-2011):

Parameter/constant  Symbol  Value  Unit

Soil heat flux B 2.0 W m^-2

Minimum albedo amin 0.50

Maximum albedo (amin+aadd) 0.90

Recession factor (T > 273.16 K) k -0.12  (32.0 F)

Recession factor (T < 273.16 K) k -0.05

Threshold snowfall for albedo reset 0.5 mm running sum for a day (0.02")

Threshold temperature for precipitation phase detection Tw 275.16 K (35.6 F)

Emissivity of snow 0.99

Specific heat of snow (at 0 C) css 2.10×10^3 J kg^-1 K^-1

Specific heat of water (at 5 C) csw 4.20×10^3 J kg^-1 K^-1

Melting heat of ice ci 3.337×10^5 J kg^-1

Sublimation/resublimation heat of snow (at -5 C) ls 2.8355×10^6 J kg^-1

Stefan-Boltzmann constant 5.67×10^-8 W m^-2 K^-4

and... just so you know I know, 1 W/m^2 = 2.065 Langleys/day... geez I hope.
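From the table, the albedo aging can be sketched as an exponential recession from the maximum 0.90 toward the minimum 0.50 (my reading of the ESCIMO-style scheme; the snowfall reset and the shift1 decay adjustment after 7 days are not modeled here):

```python
import math

A_MIN, A_ADD = 0.50, 0.40  # min albedo and (max - min) from the table above

def albedo(days_since_snowfall, air_temp_k):
    """Hedged sketch of albedo aging: decay from 0.90 toward 0.50 with
    recession factor k = -0.12 above freezing (273.16 K) or -0.05 below,
    reset to 0.90 by >= 0.5 mm of fresh snow in a day (not shown)."""
    k = -0.12 if air_temp_k > 273.16 else -0.05
    return A_MIN + A_ADD * math.exp(k * days_since_snowfall)

# fresh snow: albedo(0, 270) -> 0.90
# after a week of above-freezing days: albedo(7, 275) -> about 0.67
```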

 

Also see my ongoing snowfall study at:

Nexrad Snowfall Comparisons in western central Sierras, CA

 

  

NWS Snow Model SWE and Melt, Depth, Temp, Snowfall & Rainfall

 

You can download HOURLY SWE and Melt forecast data from NOAA's National Operational Hydrologic Remote Sensing Center national (and southern Canada) snow model. You determine a gage point (MADIS, CoCoRaHS, lat/longs, etc.) and that's all you need. You can use this interactive map to find available Snow Model station IDs. Turn the Stations and Cities layers and their Labels off/on to find your way around:

 https://www.nohrsc.noaa.gov/interactive/html/map.html

 

Then you can enter your Station SHEF ID in the upper right box here to verify data is available (no snow cover will look like no data):

 https://www.nohrsc.noaa.gov/interactive/html/graph.html?units=0&region=us&station=CAN-ON-510

 

I'm not sure how one would use the Melt info in a runoff model. Does it include current rainfall or not? I'm going on the assumption that if there are SWE values then rainfall is ignored; if not, use your supplied rainfall... but I'm guessing. Date_times are adjusted to your PC's time zone.

 

Your GetRealtime_setup.txt would look like this:

SNOWCAST-NWS,CAN-ON-510; 11450, 23450; Forecast Melt, SWE; Ontario, CA

 

 Where 'CAN-ON-510' is the 'Station SHEF ID' for data model point.

 

For table 'Rsite' you can use any of your other rain/melt and SWE DSID's or you can add new ones. There is nothing used from table 'Rsite' other than the DSID's and unit_names for Melt=11450 and SWE=23450.

 

Table 'Rsite' setup:

361; 11450; Snow Point Ontario, CA; 11; melt; inches; inches; xxxxxxx; xxxxxxx; 3; ON; 0

 361; 23450; Snow Point Ontario, CA; 11; SWE; inches; inches; xxxxxxx; xxxxxxx; 3; ON; 0

 

Where the xxxxxxx's are placeholders for fields that might be used by other computations; don't leave them blank.

 

Let's say we have an Rtable adjusted radar basin rainfall with NWS precip forecast values for a location for use in runoff modeling:

361; -11444; Sub 1 Ontario Basin; 11; adj radar rainfall; inches; inches; -31200,-10444; COMPUTE; 3; ON; 0

 

Let's now combine the adjusted radar rain with the NWS snow melt. The GetRealtime_setup.txt:

COMPUTE-Hour; 11453; Rain/Melt; Sub 1 Ontario Basin; P1=0; 0; P3; P1>0; 0; P2

 

and table 'Rsite':

444; 11453; Sub 1 Ontario Basin; 11; rain/melt; inches; inches; 23450,11450,-11444; COMPUTE; 3; ON; 0

For the calculations there are 2 sets of 'base', 'shift', 'formula'. The first set is used if DSID 23450 (SWE) is 0; otherwise the second set is used when 23450 > 0.
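In other words, the two (base, shift, formula) sets act as an if/else on the SWE value. A minimal sketch of that logic (the function name and plain floats are mine, not GetRealtime's):

```python
def rain_melt(swe, melt, rain):
    """Mimic the two-formula COMPUTE line above:
    P1 = SWE (DSID 23450), P2 = melt (11450), P3 = adjusted radar rain (-11444).
    No snowpack (P1 = 0): the rain value passes through (formula P3).
    Snowpack present (P1 > 0): the NWS melt value is used instead (formula P2)."""
    if swe == 0:
        return rain   # base 'P1=0' -> formula P3
    return melt       # base 'P1>0' -> formula P2

print(rain_melt(0.0, 0.0, 0.25))   # 0.25 (no snow: use rainfall)
print(rain_melt(3.1, 0.10, 0.25))  # 0.1 (snowpack present: use melt)
```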

Better yet, if no heated rain gage for radar adjustments, you could use 'rainfall' from the NWS Snowcast of Rainfall and Snowfall, see below.

 

Another option MIGHT be to assume the NWS SWE value is more correct than the SWE values GetRealtime calculates from your gage-adjusted radar precip. So one might first run the setup for the Melt and SWE of the NWS forecast on top of the same GetRealtime snowmelt DSID values, then lower in the GetRealtime setup run the GetRealtime snowmelt computations and overwrite the NWS forecast that way, using the NWS starting SWE. This method would automatically produce Melt that actually included rainfall percolating into and melting the snowpack. But I'm still guessing as to which might be better. Radar precip is notoriously off in winter for snowpack accumulations because of radar overshoot. But I'm wondering if the NWS snow model is using the same A2M multi-radar multi-sensor precip product that I'm using, in which case I might see little difference in snowpack accumulation but NWS could be better at melt? Luckily I live in the desert.

 

For NWS modeled Rainfall and Snowfall the GetRealtime_setup.txt line would look like this:

 

SNOWCAST-NWS,KS-KM-4; 10450,-10450; Forecast Rain, Snow; Ontario, KS

 

where the first DSID 10450 is the Rainfall and the other is the Snowfall.

If you would like to sum the two values on retrieval use a setup like this:

 

SNOWCAST-NWS,KS-KM-4; 10450,-10450; Forecast Rain, Snow; Ontario, KS; 0; 0; P1+P2

 

Only the first DSID 10450 is used to store the combined values, but the 2nd is needed as a flag that you want both for the sum. Neither Rainfall nor Snowfall alone will be stored in the database. The Rtable parameter_code should have 'Rainfall,Snowfall' entered; it will automatically be set using a right click on the selected GetRealtime_setup.txt station line.

 

Just like rainfall and snowfall, the Snowcast can provide air temperature and snow surface temperature like:

SNOWCAST-NWS,7091B_MADIS; 17450,-17450; Air Temp, Surface Temp; 7091B_MADIS Oneida, NY

 

Also, for snow DEPTH and DENSITY use data_types 21 for Depth and 13 for Density:

 

 SNOWCAST-NWS,OH-GR-28; 21450,13450; Depth, % Density; Beaver Creek, OH

 

 

 

Multiple Model Runs with Multiple Traces

 

GetRealtime 3.0--If you would like to save multiple scenarios and variations on any of the computations, you can, simply by attaching 'Run-1-1' or 'Run-1-2' to the GetRealtime_setup.txt file's 'Station_ID' like 'RUN-1-1-COMPUTE-unit'.

 

If you would like to use these 'Run' data as input for further 'Run's then use 'MRUN-1-1-COMPUTE-unit'.

 

What the Run tells GetRealtime is that the results will be written to the GetAccess database 'm' tables instead of the usual 'r' tables, 'M' being for modeled data and 'R' for real data. The new 'm' tables have fields for a Run_ID number and a Trace number. Having both a Run ID and a Trace ID may be overkill, but both are available if needed. If you use the same 'run_ID' and 'trace' value, all the values will be deleted and replaced when run. So maybe having multiple 'run_ID's isn't so far fetched after all.

 

In addition to your normal rtables output, if you would like to save what is being computed or forecast, then instead of 'RUN-1-1-COMPUTE-unit' you could use just 'RUN-Compute-unit'. The mtables will also be populated with the same data using 'run_ID' = julian day and the 'trace_id' incremented when run. This could be handy for keeping multiple NWS-Forecasts during storm events to see how the forecasts panned out. GetGraphs can add these Runs as traces for display. To save a run only a few times a day in batch, use something like 'Run-1,7,13,19-FORECAST-NWS...'. This will save to the Mtables 4 times a day at hours 1, 7, 13 and 19. The comma separating the hours is important, so you have to save at least two times a day. The forecast will still be updated as usual in Rtables.

 

Now to use these 'Run' data as input for further runs use 'MRun-1-1-' instead of Run-1-1- to both read and write from the mtables.  No change to the 'rsite' table is needed.  Mrun for Compute, Snowmelt, and Route will use the mtables with these limitations:

 

Limitations on Mrun input:

Compute Runoff:

    uses default ET or rtable for ET

Route:

    uses the same routing control file

Snowmelt:

    only uses precip from mtable; other parameters come from rtable

    startup swc, snow age, and albedo (if any) come from rtable

 

To view your 'm' table values in GetAccess, use the check box 'M_Tables' in the upper right of screen. Selecting your data uses the same steps as 'r' data beginning with Run selection, then Trace. I am sure refinements to all of this will be needed so check back often.  In fact, you may have just stepped into a Monte Carlo or an Index Sequential. The tables are rigged and ready.

  

 

Try the Runoff Wizard's 'What If's' for your site to automate how all the parameters affect runoff. Select 'Save to M_Tables' so you can send them to Excel for comparisons as all traces or by dsid's. Don't be afraid to try it all you like because I have automated Run_ID deletions.

  

  

Update 1/12/2013 Stochastic Hydrology--In addition to the % increment method, GetRealtime now has 3 Monte Carlo methods for varying any of the runoff coefficients and the rainfall record for calibration or for stochastic runoff:

 

1) Random--generates a random value between the Max and Min. Any value is as likely as the next.

 

2) Random Normal--generates a random number 0-1, converts it to a standard normal deviate, computes the skewed deviate K, and uses the mean and std deviation for V=Mean + StdDev * K.

 

3) Exact Normal--assures no outliers will be produced. It computes the plotting position P as P=(rank-.4)/(N+.2) for each trace value 1 to N, computes the standard normal deviate for P, computes the skewed deviate K, and uses the mean and std deviation for V=Mean + StdDev * K. The 'Reverse' checkbox allows reversing the ranking order so you can make a few traces at the top or bottom of your Prob curve and then close GetRealtime without running all 100,000 traces.
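The 'Exact Normal' recipe can be sketched like this (assuming zero skew, so the skewed deviate K reduces to the standard normal deviate itself; `NormalDist` is Python's standard normal, and the function name is mine):

```python
from statistics import NormalDist

def exact_normal_traces(mean, std_dev, n, reverse=False):
    """Sketch of the 'Exact Normal' option with zero skew:
    plotting position P = (rank - 0.4) / (N + 0.2), then V = mean + std_dev * K."""
    ranks = list(range(1, n + 1))
    if reverse:
        ranks.reverse()                # the 'Reverse' checkbox
    traces = []
    for rank in ranks:
        p = (rank - 0.4) / (n + 0.2)
        k = NormalDist().inv_cdf(p)    # standard normal deviate for P
        traces.append(mean + std_dev * k)
    return traces

vals = exact_normal_traces(1.0, 0.3, 9)
print(round(vals[4], 3))  # middle rank has P = 0.5, K = 0, so this is the mean: 1.0
```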

 

Log Normal uses the mean, std dev, and skew for log values, but you enter the mean and std dev as unLogged values.

 

The 'Max' and 'Min' values are not limits to methods 2 and 3. GetRealtime limits the generated values to zero or greater, and to an upper limit where needed (Curve Numbers).

 

For stochastic runoff you can vary the single rainfall record using any of these Monte Carlo methods on the P1 'Rainfall Adjust Factor'. You can use multiple runs however you dream up to use them. For soil starting conditions you could make say 10 runs of 1 or 30 days with Run-1-1, Run-1-2... out to 10 with a varied (monte?) runoff coefficient like initial CN. Now you have 10 traces with starting conditions in table mday that you can select from by setting the 'Starting Conditions Date' and its Run ID and Trace ID. If Run ID is blank then the starting conditions will be for the date in the usual rday table.

 

To create one or multiple rainfall temporal distributions you have to create your own rainfall record, just like your current rainfall record above, and apply the Monte Carlo. Where you get this record, how you normalize it, and how many you need sounds like work to me. Lots of ways I guess. If anyone can put all this to good use I am certainly open to adding anything more that is needed.

 

For a trial storm you could try using GetAccess to load some runit radar 5-minute values for any DSID for 1 day, select Graph, Update database with Excel, and you will have an updatable runit list of dates and values. At the top of the Excel sheet you can change the 4-10xxx value to 4-10yyy where yyy is your new DSID, or change the dates and use the same DSID. Now we need some rainfall record, so do an Excel 'File', 'Open', load from your GetNexrad folder the text file GetNexrad_6hour_Dist.txt, copy one of these 6-hour normalized storms of 1" back to the Edit workbook, and you can update these values back to the GetAccess table runit. Now all you need is the mean, std dev, and skew for monte carloing this normalized storm. You could take the NOAA Atlas 2-yr 6-hr and 100-yr 6-hr and compute the mean and std deviation with zero skew, but to add in the 10-yr and figure the skew is beyond me. You could try my free GetRegression's Probability option on my More Stuff page and trial and error the stats for skew.

 

For Las Vegas I found by GetRegression trial and error:

NOAA Atlas 6-hr storm: 2-yr=0.79" 10-yr=1.30" 100-yr=2.12" ... I found that the log10 mean is the 2-year value, the std dev in log10 is 0.165, and the skew is 0.38, just to show you can trial and error it. So for the GetRealtime Monte Carlo the mean and std dev are the UNlogged values 0.79 and 1.462 with 0.38 skew and 'Log Normal Distribution' CHECKED.
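The relation between the logged and UNlogged stats is easy to check: the unlogged std dev is just 10 raised to the log10 std dev, and the log10 mean is the log of the 2-yr (median) value:

```python
import math

log10_std = 0.165
unlogged_std = 10 ** log10_std      # back-transform of the log10 std dev
print(round(unlogged_std, 3))       # 1.462

log10_mean = math.log10(0.79)       # log10 of the 2-yr value
print(round(log10_mean, 3))         # -0.102
```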

 

If you would like to familiarize yourself with these normal deviates (oxymoron) and their divination you can download this Monte Carlo Excel sheet and look at the simple VBA code for what is going on.

 

What is really needed is a simple stochastic rainfall pattern generator to avoid all the leg work of coming up with real rainfall distributions. I searched the online stochastic literature, but with my limited math skills I could not find much of anything useful. I did find a very clear article by the NWS on their Probabilistic Quantitative Precipitation Forecast that stated that rainfall frequencies could almost always be fit with a special case of the Gamma distribution. This made me recall that the Gamma distribution could also be used to fit the triangular unit graph... that made me think of my new found Linear Reservoir hydrograph that looks a lot like the power function of the Gamma distribution. So putting 2 and 2 together with a random number between ±0.5 gives, VOILA!!!, The ACME Linear Reservoir Stochastic Rainfall Generator, guaranteed to satisfy your every rainfall problem with just one easy parameter:

  

      Rain2 = Rain1 * e^(random*F)   where random is uniform between -0.5 and +0.5, F is a 5-minute shaping factor, and the series is normalized to 1".

  

 

I was dumbfounded that it actually works, meaning it sure looks like rain. One parameter between 0 and 3 should come close to any storm type of any duration with a 5-minute time step. For California 24-72 hour stratiform storms try an F=0.3. For frontal thunderstorms east of the Rockies of 6-12 hours try an F=1.0. For desert Southwest sporadic monsoon thunderstorms of 3 hours try an F=2.0.
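A sketch of the generator (the function name, seeding, and normalization details are my guesses at an implementation; the recurrence itself is the one above):

```python
import math
import random

def acme_rainfall(f, steps, total=1.0, seed=None):
    """One-parameter stochastic rainfall pattern:
    Rain2 = Rain1 * e^(random * F), random uniform in [-0.5, +0.5],
    then the whole series is normalized to sum to `total` inches."""
    rng = random.Random(seed)
    series = [1.0]
    for _ in range(steps - 1):
        series.append(series[-1] * math.exp((rng.random() - 0.5) * f))
    scale = total / sum(series)
    return [r * scale for r in series]

# A 6-hour storm at a 5-minute step is 72 values; F = 1.0 suits
# frontal thunderstorms per the guidance above.
storm = acme_rainfall(1.0, 72, total=1.0, seed=42)
print(round(sum(storm), 6))  # 1.0
```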

  

Now that we have a simple but usable stochastic rainfall pattern generator any of the 'What Ifs' can be combined with the 'Stochastic Rainfall' option. I would imagine though that one of the Monte Carlo options with the 'Rainfall Adjust Factor' chosen is how it would normally be used. The stochastic rainfall values will be saved to the M-tables if you choose that option. And again, if you do not need the unit values saved then uncheck the 'Write Unit Values' to speed things up.

 

A couple of uses come to mind, one is where you hold the 100-year 6-hour rainfall P1 adjust constant and let the pattern do its thing, and another where you know the 72-hr max annual rainfall totals mean, stddev, and skew and let the Monte Carlo generate 72-hr totals for each run and let the pattern again do its thing. Longer periods like 90 days may take some more thinking, but you can give it a try.  This is starting to get out of hand but I guess I should add randomizing the number of storms in a 90-day period but I am running out of room for checkboxes.

 

Update 10/31/2013--GetRealtime 3.2.1--If you would like an *Alert* printed and a beep sounded on values exceeding a given value, now you can. To do this you need to modify the 'GetRealtime_setup.txt' file's field 'Datatype_name' like:

KALHUEY99;10612;Rainfall>0.5"-hour; Hueytown, AL

Then whenever the hourly value exceeds 0.5 the alert is printed to 'GetRealtime_Alert.txt'. If you want to check a running sum like for 24 hours then use this:

KALHUEY99;10612;Rainfall>0.5inch-hour-24sum; Hueytown, AL

You can use this with any datatype and you can use tables hour, day, daymax, and unit in the Datatype_name. The values are checked as they are being written so it's only for the period retrieved. You can maximize the GetRealtime window and use the 'View Alerts' button to view the 'GetRealtime_Alert.txt'. You can delete this file as needed.  If you are clever, you can automate reading this file and doing something about it. Remember you can shell GetNexrad.exe and GetGraphs.exe and they can write to the web.
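The threshold test can be sketched like this (hypothetical helper; GetRealtime's actual scan works on table rows as they are written):

```python
def check_alert(values, threshold, sum_hours=None):
    """Sketch of the 'Rainfall>0.5inch-hour-24sum' style check.
    With sum_hours set, a sliding running sum over that many hourly values
    is tested against the threshold; otherwise each hourly value is tested
    directly. Returns the first offending (index, value), else None."""
    for i, v in enumerate(values):
        if sum_hours:
            v = sum(values[max(0, i - sum_hours + 1): i + 1])
        if v > threshold:
            return i, round(v, 3)
    return None

hourly = [0.0, 0.2, 0.3, 0.1, 0.0]
print(check_alert(hourly, 0.5))      # None (no single hour exceeds 0.5)
print(check_alert(hourly, 0.5, 24))  # (3, 0.6) (24-hr running sum exceeds 0.5)
```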

 

To have the alert message sent from your email provider as email or cell phone text add the following email address info to GetRealtime_Setup.txt:

Alert Address= MailTo, MailFrom, ServerPassWord, SmtpServer, SmtpPort (25 or 465 for SSL) , optional info

Example: Alert Address= joe@yiho.com, me@lu.com, mypassword, smtpout.myserver.net, 25, http://www.getmyrealtime.com

 

Here's a list of SMTP mail servers to find your email provider.

 

The MailTo value using SMS cell phone text message could be for AT&T...

MailTo: 1234567890@txt.att.net

 

Here's a list of cell phone SMS MailTo addresses.

and add -email on the end of the 'Datatype_name' like: Rainfall>0.5"-hour-email

 

For multiple MailTo's, separate each email address with a single space.

 

To add a flag like Flood Level with different levels 1,2,3... set your datatype_name cell like:

COMPUTE; -11340; Rainfall>1.1-unit-12sum-email; Hot Creek, CA

ROUTE; -1340; Flow>-daymax-email-50cfs Minor,80cfs Moderate,140cfs Major; Hot Creek, CA

LOOKUP; -2331; Stage>-unit-email-0.85ft Minor,2.2ft Moderate,5ft Major; Hot Creek, CA

LOOKUP; -2333; Stage>-unit-email-0.85Minor,2.2Moderate,5Major; Hot Creek, CA

You should stick to alert levels 'Action', 'Minor', 'Moderate', and 'Major' but you can use whatever you wish. And be careful how you use the '-' field separator.

  

All database tables will find the first occurrence of each level. A special case is using DAYMAX with a single value (not levels). In this case the maximum for the period being written will be returned as the alert value.

  

For email CC's and BCC's add these to the GetRealtime_setup.txt:

Alert Address CC= space separated list

Alert Address BCC= space separated list

And to ignore these CC's and BCC's use '-emailTO' instead of '-email'.

And if you would only like to include CC and BCC for a level 2 or higher use -emailTO2 or -emailTO3 etc.

 

When running in batch mode, the alert emails will be sent twice, then turned off for 12 hours, then back on for twice again, and so forth. You don't want to get 100 emails for the same darn thing, do ya? Alerts begin checking values 1 hour ago, or if a Sum then the sum period plus 1 hour ago. If you would like to start checking alerts earlier (or later) than 1 hour ago, then use the GetRealtime_Setup.txt line where -2 is 2 hours earlier than NOW:
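That on/off cycle amounts to a small throttle state machine; a sketch (class name and details are mine, not GetRealtime's):

```python
from datetime import datetime, timedelta

class AlertThrottle:
    """Sketch of the batch-mode behavior above: let two alert emails out,
    then go quiet for 12 hours, then allow two more, and so forth."""
    def __init__(self, quiet_hours=12):
        self.quiet_hours = quiet_hours
        self.sent = 0
        self.quiet_until = None

    def allow(self, now):
        if self.quiet_until is not None and now < self.quiet_until:
            return False                      # still in the 12-hour off period
        self.quiet_until = None
        self.sent += 1
        if self.sent % 2 == 0:                # every 2nd send starts the off period
            self.quiet_until = now + timedelta(hours=self.quiet_hours)
        return True

throttle = AlertThrottle()
t0 = datetime(2024, 4, 8, 0)
print([throttle.allow(t0 + timedelta(hours=h)) for h in (0, 1, 2, 13)])
# [True, True, False, True]
```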

 

Alert Start Time=-2

 

Note: GetRealtime is using the Microsoft CDO library (cdosys.dll) that ships on most Windows PC's. If you get an error I would recheck that your MailTo and MailFrom values look like email addresses and that you got the right port 25 (non SSL) or 465 (SSL) value as supplied by your email provider.

 

To display the Alerts on GetGraphs add 'Alert' to the GetGraphs setup line like:

 

 1 ;-2329 , 2609, Alert ;Stage;Big Cr at 24th St;......

  

BUT...Alerts cannot be displayed on the very first Getgraphs setup page so choose the 2nd or other graph page for alerts.

 

The alert value and time issued will be plotted as triangles for dsid -2329. If the alert value issued does not change, then no triangle will be added until the issued value changes or is a day old. The alerts are read from ...\GetRealtime\GetRealtime_Alert.txt.

 

The GetGraphs chart above shows what happens with a 3-Hour Nowcast when a squall line forms and you're not paying attention (needs Maddox 30R75 winds aloft adjustment).

 

  

Update 11/16/2013--GetRealtime 3.2.2--has been updated to write its MS-Access database values to a Hec-DSS data file in real-time. So you can use my GetRealtime to achieve all your flood forecasting system needs, or you can now use GetRealtime to just provide Hec-HMS or Hec-Ras your GetRealtime radar rainfall or runoff and let them take it from there.

 

Here is an example GetRealtime_setup.txt to write the DSS data files (No 'rtable' change needed):

 

COMPUTE-Put-unit; 30619; Runoff; Village Creek Upper Sub Bham, AL; -1; VillageCrSubs.dss

COMPUTE-Put-unit; 30618; Runoff; Village Creek Lower Sub Bham ,AL; 0; VillageCrSubs.dss

 

The -1 means delete the DSS file first and the 0 means just add to it. You will need to reinstall the FULL GetRealtime 3.2.2 download over yours to get the needed heclib.dll support files. No need to uninstall, just reinstall.

 

If you want to write 15 minute flows to DSS instead of 5 then use 'Put-15unit' like this:

COMPUTE-Put-15unit; 30619; Runoff; Village Creek Upper Sub Bham, AL; -1; VillageCrSubs.dss; P1+10.24

 

 

Now we have a DSS file for Hec-Ras or Hec-Hms.

 

Update 11/23/2013--Running Hec-Ras Unsteady Flow projects from GetRealtime. To run your Hec-Ras project from GetRealtime you will first have to write a Hec-DSS file of your inflow hydrograph as discussed above. Running Hec-Ras requires version 6.x and GetRealtime.exe > 4.4.2.  Download GetRealtimeRas41.exe to use Ras 4.1.  Download GetRealtimeRas5.exe to use Ras 5.x.  Ras 6.1 installed on my Windows 10, but on my Windows 7 it would install and then not start up without error.  In Win 7 I renamed sqlite3.dll in C:\Windows\SysWOW64 and then Ras 6.1 installs and starts up without error.  I don't know what program sqlite3.dll belongs to.  When changing Ras versions, set Initial Conditions to an inflow value to update restart files, Run the Geometry process, and save all.  The current GetRealtime.exe referenced Ras version is 6.4.1.

 

HEC-RAS tip for beginners: For routing multiple subbasins create one long reach with at a minimum one x-section at the top and another at the bottom. Subbasin unsteady hydrograph inflow Boundary Conditions can be one at the top of the reach and all others at chosen interpolated x-secs as **Side Inflow** boundaries. Only took a week watching Youtubes without ever finding a good example before figuring this out myself. DO NOT try the obvious way of multiple side reaches with Joints (that way lies madness).

 

For this example our Hec-Ras project file will be created in this folder and name:

 

C:\MyData\HecRasLowerBham3\LowerBham.prj

 

 ...have GetRealtime 'Put' a dss file of inflows like this:

GetRealtime_Setup.txt line:

COMPUTE-Put-unit; 30619; Runoff; Village Creek Upper Sub, AL; 0;C:\MyData\HecRasLowerBham3\VillageCrSubs.dss

  

Now that we have a DSS boundary inflow file in our project folder we can start Hec-Ras and create a project and save it where we put the inflow DSS file. As a quick example for a Hec-Ras unsteady flow project start up Ras:

 

1) File> New project and locate the folder above and save project.

2) Edit> Geometric data> River Reach ... and draw a line and double click to end and enter the River name and Reach name.

3) Cross Section> Options> Add a new Cross Section ... and give it station id and add the xs data. Then hit Apply Data.

4) Repeat step 3 for a downstream xs by Edit> Options> Copy current cross section> Add new Cross Section and Options> Adjust Elevations.

5) Exit and from the Geometric Data menu File> Save geometric data... and close window.

6) Now we need an unsteady flow plan BUT... before that we need to set the run TIME WINDOW or else you cannot select the plan boundary inflow hydrograph. To set the time window: Run> Unsteady Flow Analysis> and on the Simulation Time Window set the start and end times, and be sure to click on the year label and set it to something reasonable. Once you set the time window date and times, close the menu.

7) NOW!!! Edit> Unsteady flow data> click on the upstream Boundary Condition cell, click Flow Hydrograph, Select DSS file and Path, click on the little Open File ICON, and select the dss file VillageCrSubs.dss from our work above and you should be able to Select highlighted DSS pathname(s), and plot them. If you can plot them then they are in the correct time window we set in step 6. Click OK... and then OK.

8) Back at the Unsteady Flow Data menu, click on the downstream xs Boundary Condition cell, Normal Depth, and enter the channel slope. And save by File> Save Unsteady Flow Data... and close window.

9) From main menu, Run> Unsteady Flow Analysis> enter Short ID, click all 3 Processes to Run, set all 3 Computation Settings to 5-minutes... and I think we are set to go... now hit Compute!!! Hot Dam!!! it ran.

 10) Whew... now to watch a water level movie, click View> Cross-Sections> and click the Animate Button. Cool eh! I'm sure the experts are chuckling on the steps above but it is how I got a Hec-Ras project to work so if you know better, just skip all the craziness above and create a project.

 

 

Now to run Ras.exe from within GetRealtime.

 

GetRealtime_Setup.txt line:

SHELL-C:\Program Files\HEC\HEC-RAS\5.0.3\ras.exe; 0; Flow; Village Creek HEC RAS Lower Sub; -1;C:\MyData\HecRasLowerBham3\LowerBham.prj

 

The -1 means watch it run, 0 is hide it. When GetRealtime is run it will open the .prj file, find the current plan (p01), open the .p01 file and set the Simulation Date time window to the current retrieval period, and away it goes. The Ras.exe is not directly shelled but is referenced by its HEC River Analysis System... which is Ras.exe but makes all its objects available to call. I used the GetRealtime 'SHELL' command in case other programs can be shelled using the same setups in the future.

 

But if you enter 'ShowRas' in the 'formula1' cell setup like:

 

SHELL-C:\Program Files\HEC\HEC-RAS\5.0.3\ras.exe; 0; Flow; Village Creek HEC RAS Lower Sub; -1; C:\MyData\HecRasLowerBham3\LowerBham.prj; ShowRas

 

Then the Hec-Ras program WILL be shelled so you can admire your project after GetRealtime ends.

You can also delete unwanted files before the Ras run like this formula1:

  showras, delete.dss, delete.bco01

Delete.dss automatically also deletes the .dsc catalog file. I found that deleting the DSS output file makes things simpler in the beginning.

 

If you want to shorten your Ras run time to just around the peak then add 'PeakN' to the formula1 list like:   showras, delete.dss, delete.bco01, Peak12

For this Peak12 example the Ras run will start 12 hours before the max peak of all runoffs and flows and 3*12 or 36 hours after the max peak for the run. This will speed things up if all you want are peaks for Ras Mapper.
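The PeakN window arithmetic, sketched (hypothetical helper; GetRealtime scans all runoffs and flows for the max):

```python
from datetime import datetime, timedelta

def peak_window(times, flows, n_hours):
    """Sketch of the 'PeakN' option: find the time of the maximum flow,
    start the RAS run n_hours before it and end 3*n_hours after it."""
    peak_time = times[flows.index(max(flows))]
    start = peak_time - timedelta(hours=n_hours)
    end = peak_time + timedelta(hours=3 * n_hours)
    return start, end

times = [datetime(2024, 4, 8, h) for h in range(24)]
flows = [10] * 10 + [500] + [10] * 13          # peak at hour 10
start, end = peak_window(times, flows, 12)
print(start, end)  # 2024-04-07 22:00:00 2024-04-09 22:00:00
```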

 

Now to get the Ras DSS file results back into the GetAccess database, add...

COMPUTE-Get; 1621; Flow; Village Creek Upper Sub Routed, AL

 

The GetAccess table 'rsite' for this Get is:

621; 1621; Village Creek Routed Subs, AL; 1; flow; cfs; cubic feet per second; GetUnitHecRasLowerBHAMrouted.txt; COMPUTE-GET; 6; NC; 0;

 

As for the GetUnitHecRasLowerBHAMrouted.txt filename: this file in your GetRealtime folder will read:

 

Hec-Ras Routed Bham Upper Sub to Gage

Put Table: runit

Get Database_connection:C:\MyData\HecRasLowerBham3\LowerBham.dss

Get SQL: /VILLAGE CR AIRPORTTOGAGE/10/FLOW/DAY1/5MIN/SHORT2/

END

 

The Get SQL is the DSS 'path' name that you can copy from HEC_DSSvue.exe. The /DAY1/ part does not matter; it gets replaced with blank because it is not needed.  Note: Do Not Include 'HMS' In Your Title.
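A sketch of that date-block replacement on a DSS pathname (parts are /A/B/C/D/E/F/, where D is the date part; the function is mine):

```python
def blank_d_part(pathname):
    """Replace the D (date) part of a DSS pathname /A/B/C/D/E/F/ with blank,
    mimicking what happens to the /DAY1/ placeholder above."""
    parts = pathname.split("/")   # ['', A, B, C, D, E, F, '']
    parts[4] = ""                 # the D part
    return "/".join(parts)

print(blank_d_part("//SUB1/FLOW/07JAN2014/5MIN/RUN:RUN 1/"))
# //SUB1/FLOW//5MIN/RUN:RUN 1/
```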

 

If your Ras project takes a while to run and you don't want to start at the beginning of the runoff period, then you can create a HotStart file.

1) To use a HotStart (aka Restart File) with date and time under the Ras Menu 'Run':

 Run, Unsteady Flow Analysis, Options, Output Options, Restart File Options:

Check Both (or at least the first) Write Initial Condition file...

Set Hours from beginning of simulation: [ 1 ]

Hours between writes: [ 1 ]

Now GetRealtime will set the run start date time to match a .rst hotstart file with datetime <= your start date.

2) For the first time w/o a .rst file, set your Initial Conditions option to 'Enter Initial flow distribution'.

3) Run your project.

4) Reset your Unsteady Initial Conditions to 'Use a Restart File' and select the rst file.

5) Save your project.

In GetRealtime_setup.txt add 'hotstart' to the delete list in 'formula1' cell like: delete.dss, delete.bco01, hotstart

Now whenever you shell Ras, it will start on the newest .rst hotstart file with datetime <= your start date. Hotstart .rst files older than 7 days will be deleted at the end of each run if you add 'delete.rst' to the formula1 cell. To run starting from the beginning of the runoff period set by GetRealtime's runoff comps, just remove 'hotstart' from the formula1 cell setup... but be sure your last hotstart written is at base flow or who knows what will happen!?!?
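The restart-file pick can be sketched as (hypothetical helper; presumably the latest qualifying file wins):

```python
from datetime import datetime

def pick_hotstart(rst_times, start_date):
    """Among the restart-file datetimes, pick the latest one that is
    <= the requested run start date; None if no restart file qualifies."""
    candidates = [t for t in rst_times if t <= start_date]
    return max(candidates) if candidates else None

rst = [datetime(2024, 4, 7, 6), datetime(2024, 4, 7, 18), datetime(2024, 4, 8, 6)]
print(pick_hotstart(rst, datetime(2024, 4, 8, 0)))  # 2024-04-07 18:00:00
```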

  

Additional HotStart options to add to the deletes are:

,Peak n,

This will get the maximum runoff for the run period and start RAS n hours before the peak and end RAS 3*n hours after the peak.

,Now n1-n2,

This starts RAS n1 hours before the current run time and ends RAS n2 hours after the current run time. Example: ,Now 1-3 This starts 1 hour earlier and ends 3 hours later.

  

If you need a higher low flow to avoid instability problems, then use a P1+50 where needed on the upstream dss PUT and then a P1-50 on all the GET's.

 

And to skip running RAS when not needed, see the Station_ID 'IF'. This allows skipping the next N lines of the GetRealtime_setup.txt if peak flow does not warrant unsteady routings or stage mappings.

 

If you would like to capture the image of your RAS Mapper window (Ras 5 only) at the end of the Ras run then add the option ,RasMapperPic to save for display by GetGraphs. This will capture the image to png like this:

RasMapper 5

 

Update 1/8/2014--Running HEC-Hms runoff projects from GetRealtime. To run your HEC-Hms project from GetRealtime with gage or adjusted radar we first have to write a Hec-DSS file of your rainfall hyetographs as discussed above. For this example our HEC-Hms project file will be created in this folder and file name:

C:\MyData\VillageCr\VillageCr.hms

 

First have GetRealtime 'Put' a dss file of rainfalls with base -1 overwrite like this GetRealtime_Setup.txt line (-1 on first sub-basin only):

 

COMPUTE-Put-unit; 11616; PRECIP-INC; Village Creek; -1; C:\MyData\VillageCr\MyVillageCrRainfall.dss

 

Now you can create your VillageCr.hms project and get it running. Set the 'Time-Series Data', 'Precipitation Gages', 'Gage 1', 'Time Window', 'End Date' to a future year so you won't have to deal with it till then again.

 

Once your project is running using the DSS rainfall you can now run GetRealtime to automate HEC-Hms, so close HEC-Hms.

 

Important Dec 5, 2018: HEC-HMS.cmd version 4.3 does not seem to run the script (Failed to register GDAL/OGR for use (truncated path?)). I contacted the COE for help and their system ran fine. Maybe the next HMS update will fix my problem.  4.3 runs on my Win10 but not my Win7.  Tested HEC-HMS 4.5 shelling in Win7 and it works fine. I think HMS 4.3 was using way too much environment variable space for my Win7 to handle.

 

Update Feb 16, 2022:  HMS 4.9 was running fine on Win7 for a few weeks and I did something and now it won't.  Win10 never did shell 4.9.  Back to HMS 4.8 until the next version.  Sometimes version 4.9 will shell ok after rebooting. You might try uninstalling all older HEC-HMS versions?

 

For Windows 7 with HMS version 4.9 and greater copy HMS version 4.8 file HEC-HMS.cmd to your new HMS install folder. Apparently the newer CMD file is for Windows 10 only.

 

GetRealtime_Setup.txt line:

SHELL-C:\Program Files\HEC\HEC-HMS\4.1\hec-hms.cmd; 0; Flow; Village Cr HEC HMS Example; -1; C:\MyData\VillageCr\VillageCr.hms, Control_1.control, run 1, 5-Minute; delete.dss, delete.log

 

 

The first 0 is a DSID that is not used. The base1 -1 displays the HMS computation window and run log; a 0 turns it off. The shift1 is the HMS project path name, control file, Run ID, and time step. The formula1 entries are optional deletes. The 'run_1.log' file is always deleted before running so you can quickly see if things went wrong. You can use any valid run id. The time step is always 5-minute for radar, but you could use steps like 10-Minute, 15-Minute, 1-Hour, 6-Hour.

 

Shelling Hec-HMS in batch mode is not as straightforward as you may think. HMS is not an ActiveX component as HEC-Ras is, so to shell it I had to use the Windows API to get a process handle for completion notice of the script HMS runs (GetRealtime_hms.script). The script file is generated automatically if it does not exist.  Warning! If you copy your HMS project to another folder, DELETE GetRealtime_hms.script before running the GetRealtime shell or your project path will not change.

 

After GetRealtime completes the shelling of HMS then the results of the run will be in VillageCr.dss. To retrieve the results back to GetAccess HDB, follow the steps in the HEC-Ras example above.

 

Your rtable setup file will look like this:

 

Hec-HMS Village Cr

Put Table: runit

Get Database_connection: C:\MyData\VillageCr\VillageCr.dss

Get SQL: //SUB1/FLOW/07JAN2014/5MIN/RUN:RUN 1/

END

 

And again the /date/ is just a placeholder that gets replaced. The letters 'HMS' have to be in the title line.

 

HMS will cut off your recession if you don't include enough starting days. To automate the start day to avoid this, include at least one GetRealtime runoff subbasin setup and you can adjust this sub's lag time and area to provide sufficient recession time.

 

I think my GetRealtime continuous CN loss, unit graphs, optimized Modpul, snowmelt, stochastic rainfall, Monte Carlo methods, AND DATABASE are better than HMS so I hope you give them a try. Perhaps you may prefer the HMS routings to mine so you could use HMS to do the routings of GetRealtime generated runoffs. Lots of possibilities exist now that HEC-Ras and HEC-Hms can be automated.  Alert and HEC-ResSim users may see how automating everything in real-time with a 7-day 5-minute forecast of runoff would be a pretty slick deal. But the real key to success here is good adjusted radar rainfall so all this work is useless if you can't get that done and done right and you're the only man in a position to do it.

 

 

Update 2/3/2015--Running HEC-ResSim reservoir projects from GetRealtime. To run your HEC-ResSim watersheds you will want to put the needed inflows to your DSS file, shell ResSim, and then get the results back into GetAccess from the simulation DSS file. So just like HMS and RAS, here is a GetRealtime_Setup.txt example:

 

*******; ********; *********; ***HEC-RESSIM********

COMPUTE-Put-hour; 30801; Flow; Browns Creek; 0; C:\MyData\base\Testing_of_functions\rss\functiontest.dss

 

COMPUTE-Put-hour; 30801; Flow; Browns Creek; 0; C:\MyData\base\Testing_of_functions\rss\TestOfResSim\simulation.dss; lookback=1

 

SHELL-C:\Program Files\HEC\HEC-ResSim\3.1\HEC-ResSim.exe; 0; Flow; Browns Cr; -1; C:\MyData\base\Testing_of_functions\rss\TestOfResSim.simperiod; lookback=1

 

COMPUTE-Get; 1912; Flow; Browns Cr HEC Ressim Example

*******; ********; *********; ***********

 

The GET file could read as:

 

Hec-Ressim DS CP2 flow

Put Table: rhour

Get Database_connection: C:\MyData\base\Testing_of_functions\rss\TestOfResSim\simulation.dss

Get SQL: //DS END OF MAIN STEM/FLOW/01FEB2015/1HOUR/FUNC_TEST-0/

END

 

As of yet I cannot automate the HEC-ResSim.exe executable, only shell it and let the user select his project and run it. When the project is closed, GetRealtime will then get the results. The first PUT above to 'functiontest.dss' is probably optional after you get your project built and running. I hope to figure out a ResSim Jython batch file. If you know Jython and can create your *.PY file, then the GetRealtime shell could be like:

 

SHELL-C:\Program Files\HEC\HEC-ResSim\3.1\HEC-ResSim.exe "C:\MyData\base\BatchRunControl.py" "C:\MyData\base\Testing_of_functions"

 

Here is a COE Jython example with documentation, only it is INTERACTIVE and not what I need, but it's somewhere to start:  BatchRunControl.zip

 

 

I think Hec-RTS is the Army's attempt at doing what GetRealtime is way better at.  ;-) ...after reading this...

 HEC-Rts Draft

 

Just use GetRealtime... or shell Ras and HMS if you must. I don't think calibrating your losses on the come, worrying about dead gages, using 4-hour-old USGS flows, and knowing your radar grid is an hour or two outdated after it starts raining and the dam is about to break is such a good idea. If the radar loop bearing down on you and the NWS QPF say run for the hills, I would take them at their word. You can always blame the NWS later like state governors do.  But if you're running Grand Coulee, I guess you can afford such luxuries.  Actually, the soil adjustment to match the USGS current base flow idea works pretty neat and has been added to GetRealtime.

 

If you don't believe me, then you could use GetRealtime as an overall check on conditions, a middle-ground second opinion, or a complete redundant backup system. With your data in a real database, your in-house users can actually make use of it.

 

 

Automate EPA-SWMM: Once you build your SWMM model and get it running (or see the Nashville SWMM example) you can then edit your GetRealtime_setup.txt to PUT, SHELL, and GET like:

 

COMPUTE-Put-unit; 11801;  intensity; Lumped Nashville Basin; -1; C:\GetNashville\NashvilleSWMM\TS1.dat

 

SHELL-SWMM; 0; Flow; Nashville Subs; -1; C:\GetNashville\NashvilleSWMM\MyTutorial.INP

 

COMPUTE-Get-SWMM; 1801; Runoff; Nashville SWMM subcatch #1; 0; C:\GetNashville\NashvilleSWMM\MyTutorial.out; Type=Catchment, Item=0, Variable=4

 

COMPUTE-Get-SWMM; 1806; Flow; Nashville SWMM Outflow; 0; C:\GetNashville\NashvilleSWMM\MyTutorial.out; Type=Node, Item=4, Variable=4

 

Copy C:\Program Files\EPA SWMM 5.1\swmm5.dll and vcomp100.dll to your PC's System32 folder (SysWOW64 if 64-bit), or if you prefer, to a folder on your path.

 

SWMM PUT setup:

To create a SWMM data source file for rain, the datatype_site_id is 11801 with GetAccess as the data source.  The datatype_name is either intensity or cumulative. I have never heard of a volume of rainfall (gallons?), so the put file will have unaltered GetAccess incremental rainfall if volume; go with intensity to be safe. Be sure your SWMM project's rain gage property Time Interval is set to 0:05 for 5-minute radar or 0:10 for Canada 10-minute radar.  The base1 of -1 means overwrite the SWMM data file (actually overwrite is the only option), the shift1 is the SWMM dat file name, and the file name itself (TS1) is used as the SWMM Station ID.
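To picture what lands in TS1.dat, here is a minimal Python sketch of writing a SWMM user-prepared rain file, one line per interval of "StationID Year Month Day Hour Minute Value" (that line layout is from the SWMM 5 manual as I read it; verify against your SWMM version, and the helper name and values are just illustration):

```python
from datetime import datetime, timedelta

def write_swmm_rain(path, station, start, step_minutes, values):
    """Write a SWMM user-prepared rain file: StaID Y M D H Min Value."""
    with open(path, "w") as f:
        t = start
        for v in values:
            f.write(f"{station} {t.year} {t.month} {t.day} {t.hour} {t.minute} {v}\n")
            t += timedelta(minutes=step_minutes)

# Station ID matches the file name (TS1), as noted above; 5-minute steps.
write_swmm_rain("TS1.dat", "TS1",
                datetime(2024, 4, 8, 12, 0), 5, [0.00, 0.02, 0.05])
```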

 

SWMM SHELL setup:

SHELL-SWMM will run SWMM from 'swmm5.dll', which you should copy (along with vcomp100.dll) from C:\Program Files\EPA SWMM 5.1 as noted above. The rest of the fields are not used except shift1, which is the SWMM input file name (*.INP). GetRealtime will edit the *.INP file's Start and End times to the period being run.

 

SWMM GET coefficients:

The datatype_site_id is 1801 for the GetAccess database, the base1 is ignored, shift1 is the SWMM output file (*.OUT), and formula1 holds the SWMM coefficients for the get.  You should probably check the unit values in GetAccess against the SWMM GUI object report table values to be sure you got the right TYPE, ITEM, and VARIABLE ID numbers. I'm a rookie here, but it seems if you click on a map object, then Report, Table, By Object, the VARIABLES display in order with ID 0 as the first VARIABLE, and the TYPE seems to be shown under Object Category. Apparently you have to inspect the INP file for the ITEM count. You may also notice that the GET-SWMM does not need a Get File, and the parameter_code in GetAccess table rsite is ignored.

 

TYPE = type of object whose value is being sought: subcatchment, node, link, system

 

ITEM = identifies the item of a given TYPE being reported on, by consecutive numbers starting from 0, assigned in the same order as the items appear in the SWMM 5 input file. As an example, suppose that a system contains 5 junction nodes with ID names J1, J2, J3, J4, and J5 listed in this same order in the project's input file, but output results are only requested for junctions J2 and J4. Then an itemIndex of 0 would retrieve results for J2 while an itemIndex of 1 would do the same for J4. Any other values for itemIndex would be invalid. For System items the itemIndex argument is ignored.

 

VARIABLE = depends on the TYPE being sought; the variables are listed in the Output File as follows:

 

· Number of subcatchment variables (currently 6 + number of pollutants).

· Code number of each subcatchment variable:

0 for rainfall (in/hr or mm/hr),

1 for snow depth (in or mm),

2 for evaporation + infiltration losses (in/hr or mm/hr),

3 for runoff rate (flow units),

4 for groundwater outflow rate (flow units),

5 for groundwater water table elevation (ft or m),

6 for runoff concentration of first pollutant,

 ...

5 + N for runoff concentration of N-th pollutant.

 

· Number of node variables (currently 6 + number of pollutants)

· Code number of each node variable:

0 for depth of water above invert (ft or m),

1 for hydraulic head (ft or m),

2 for volume of stored + ponded water (ft3 or m3),

3 for lateral inflow (flow units),

4 for total inflow (lateral + upstream) (flow units),

5 for flow lost to flooding (flow units),

6 for concentration of first pollutant,

...

5 + N for concentration of N-th pollutant.

 

· Number of link variables (currently 5 + number of pollutants)

· Code number of each link variable:

0 for flow rate (flow units),

1 for flow depth (ft or m),

2 for flow velocity (ft/s or m/s),

3 for Froude number,

4 for capacity (fraction of conduit filled),

5 for concentration of first pollutant,

...

4 + N for concentration of N-th pollutant.

 

· Number of system-wide variables (currently 14)

· Code number of each system-wide variable:

0 for air temperature (deg. F or deg. C),

1 for rainfall (in/hr or mm/hr),

2 for snow depth (in or mm),

3 for evaporation + infiltration loss rate (in/hr or mm/hr),

4 for runoff flow (flow units),

5 for dry weather inflow (flow units),

6 for groundwater inflow (flow units),

7 for RDII inflow (flow units),

8 for user supplied direct inflow (flow units),

9 for total lateral inflow (sum of variables 4 to 8) (flow units),

10 for flow lost to flooding (flow units),

11 for flow leaving through outfalls (flow units),

12 for volume of stored water (ft3 or m3),

13 for evaporation rate (in/day or mm/day)
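The TYPE/VARIABLE code tables above can be captured as a quick lookup for sanity-checking a GET-SWMM line (a hypothetical helper, not part of GetRealtime; only a subset of the codes is shown):

```python
# Variable code tables copied from the SWMM output file documentation above.
SWMM_VARS = {
    "subcatchment": {0: "rainfall", 1: "snow depth",
                     2: "evap + infiltration losses", 3: "runoff rate",
                     4: "groundwater outflow rate",
                     5: "groundwater table elevation"},
    "node": {0: "depth above invert", 1: "hydraulic head",
             2: "stored + ponded volume", 3: "lateral inflow",
             4: "total inflow", 5: "flooding loss"},
    "link": {0: "flow rate", 1: "flow depth", 2: "flow velocity",
             3: "Froude number", 4: "capacity fraction"},
    "system": {1: "rainfall", 4: "runoff flow",
               11: "flow leaving through outfalls", 12: "stored volume"},
}

# e.g. a GET line with 'Type=Node, Item=4, Variable=4' is asking for:
print(SWMM_VARS["node"][4])
```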

 

  

The Army and I think alike when it comes to thumbnails on the radar screen (GetNexrad):

Or just a quick look at all your subs and adjusting rain gages on the radar:

And the pages of rainfall, frequencies, and runoff graphs and web info in GetGraphs:

 

Other Shell Examples:

The first time you run these Shells you will need to run GetRealtime.exe as System Administrator and give permissions when prompted.  Update Jan 27, 2021: These shells (not Hec or Swmm) will time out after 3 minutes. You can change the 3 in GetRealtime_setup.txt (save it first if missing).

  

For Windows 10 you may need to change the security setting 'Allow App Through Firewall'.

 

The GetRealtime_setup.txt lines can use the GetNexrad and GetGraphs command lines.  Only the first field (Station_id) and the last field (base1) have any meaning, so don't overthink this:

  

SHELL-C:\GetRealtime\GetNexrad\GetNexrad.exe -2; 0; Pic; Nashville Radar Window;-1

 (Saves to file and you could then add the png file to your GetGraphs setup list.)

The -2 means capture the window; a -1 would mean capture just the radar image.  To FTP to your GetNexrad ftp setup, include your password: GetNexrad.exe -2 mypassword; ... this loads the default setup, saves the window to a png file, and FTPs it:

 

SHELL-C:\GetRealtime\GetNexrad\GetNexrad.exe -2 MyPassword; 0; Some Pic; Nashville sub 1;-1

  

The ending base1 of -1 displays GetNexrad; 0 hides it, but only when saving the radar image, not the whole window, which will always display.

 

Update 1/5/2021: GetNexrad can now use multiple setup files, so you can do radar selections and surface obs, save them to GetNexradFilesxxxxxx.txt, and then reload GetNexrad from a command line with the new setup filename.   A simple example would be to include a radar image of the whole USA, NWS flood sites, and QPF rings, and it updates automatically.

 

New command lines could be:  

GetNexrad.exe -2 GetNexradFiles33.txt mypassword     (ftp)

or

GetNexrad.exe -2 GetNexradFiles33.txt    (no ftp, just create png for GetGraphs)  

old way:

GetNexrad.exe -2 mypassword    (use old default GetNexradFiles.txt and ftp)  

 

  

Likewise, GetGraphs can be shelled similarly and png's FTP'd:

SHELL-C:\GetRealtime\GetGraphs\GetGraphs.exe 1 MyPassword; 0; graphs; Nashville GetGraphs

  (where 1 is the pause in seconds between GetGraphs pages; there are no window, image, or display-hide options.)

 

On GetGraphs select 'Auto Paging', check 'Write page image on update', then fill out and 'Save Web Info'. This creates the file 'GetGraphs_autoweb.txt'. Then you're all set up, and your web password if needed is supplied by the GetRealtime SHELL command.

  

And if you don't want to FTP a particular page use FTP_OFF on the GetGraphs_setup.txt page line.

  

  

To shell and wait for BAT or SCRIPT files:

SHELL-CopyMyFiles.bat; 0; Copy my Pic; Pic of Tallapoosa basin; 0

 (SHELL dos commands like COPY can be put in a BAT file.)

 

SHELL FTP EXAMPLE: To Shell FTP and wait for it using GetRealtime.exe, put the following in a script text file like C:\mydata\MyFTPer.txt :

---------

open ftp.getmyrealtime.com

user myrealtime mypassword

cd MiscImgs

binary

put C:\myImgData\Graphimage.png

bye

-------

The GetRealtime_setup.txt line would be:

SHELL-ftp -n -s:C:\mydata\MyFTPer.txt; 0; MyInfo; Send me to somewhere; 0

 

Note there is no space between s: and C:. Likewise, NEVER use spaces in file names or you are just asking for trouble in life.
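If you would rather generate the FTP command script from code so the SHELL-ftp line always has a fresh file to read, here is a hedged Python sketch that builds the same script shown above (make_ftp_script is my own helper; host, user, and password are the placeholders from the example, not real credentials):

```python
def make_ftp_script(host, user, password, folder, local_file):
    """Build the text of a Windows ftp -s: command script."""
    return "\n".join([
        f"open {host}",
        f"user {user} {password}",
        f"cd {folder}",
        "binary",                     # binary mode so png images survive
        f"put {local_file}",
        "bye",
    ]) + "\n"

script = make_ftp_script("ftp.getmyrealtime.com", "myrealtime",
                         "mypassword", "MiscImgs",
                         r"C:\myImgData\Graphimage.png")
print(script)
```

Write the returned string to C:\mydata\MyFTPer.txt and the SHELL-ftp line above will pick it up.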

  

To automate retrievals from the GetAccess database from a batch file or directly from GetRealtime.exe, you can use my 3 programs GetFromAccess.exe, GetFromAccessNOW.exe, and GetFromAccessSQL.exe. As the names imply, the first needs your dates to retrieve (or days relative to now), while the last uses any SQL statement. GetFromAccess.exe can even create HEC-DSS files. The advantage of the SQL method is the use of SQL functions like NOW(), SUM(), and MAX(): give me the past 6 hours... give me the sum of the past 6 hours, give me the max of the future 48 hours. These are intended for automating retrievals for FTP'ing to your website.  Run the exe's with no command line to display help examples.

  

For help with GetFromAccessSQL batch look at these files in the zip download:

Help_GetFromAccessSQL.txt

SQL_Now_Examples.txt

MySQLGetAccessTXT_EXAMPLE.bat reads as:

GetFromAccessSQL -11361 -30361; SELECT * FROM rhour WHERE datatype_site_id=DSID AND date_time BETWEEN (Now()-0.25) AND Now() ORDER BY date_time; MyTextOutTest.txt

 

To create an Excel file try .xls instead of .txt:

GetFromAccess rhour, -11321 -11323 -11326, 2016-11-01 00:00, 2016-11-02 00:00, myTest.xls
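Note that Access SQL does date arithmetic in days, so Now()-0.25 in the example above means the past 6 hours. A small hypothetical Python helper makes the conversion explicit (past_hours_sql is just my illustration of building that style of statement):

```python
def past_hours_sql(table, hours):
    """Build an Access-style SQL statement for the past N hours.

    Now() arithmetic is in days, so 6 hours = 6/24 = 0.25 days.
    """
    days = hours / 24.0
    return (f"SELECT * FROM {table} WHERE datatype_site_id=DSID "
            f"AND date_time BETWEEN (Now()-{days}) AND Now() "
            f"ORDER BY date_time")

print(past_hours_sql("rhour", 6))
```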

   

As an example of putting a data file on the web you could use this setup line in GetRealtime:

SHELL-C:\GetGraphs2\GetFromAccessNOW.exe rdaymax, 10347 -10347 17340 -23363 11363 1332 -1360, -2, 5, BurroCrNOW.txt, mypassword; 0; To Web; Put text file on web; 0

Note that I had the exe in my GetGraphs folder where my GetGraphs_autoweb.txt file has the needed web info.

 

To create a small BABY.mdb database file and FTP it you can use PutFromAccessNOW. It will read your large database and create a small version of values only for that run. This could be handy to FTP a small MDB file to someone else running GetGraphs with just the needed current values for graphing.

 

 

NWS Forecasts:

 

The NWS 7-day forecast dsid's and table rsite parameter_codes:

10 or 11 qpf

16 dewpoint

17 temperature

18 humidity

19 wind direction

28 windspeed

26 cloud-amount

 

NWS snow model data is above for 48 hour Melt and SWE, Depth and Density, Rainfall and Snowfall, Air Temp and Surface Temp

 

For NWS forecasts of stage and flow, add the NWS gage ID like this:

 

FORECAST-NWS,wsbt2; 1409; Forecast Flow; USGS Flows below Reservoirs

 

If your datatype_id from table rsite is 1 then flows are retrieved, otherwise stage.  I would expect one already has the USGS flow in table rsite, as only the forecast values are saved.

 

NWS flood flow forecast info:  http://water.weather.gov/ahps/

  

GetRealtime can add the NWS hourly QPF's directly and here is the setup. The -2 is a time shift. I chose adjusted radar for the QPF so there is no time shift on that so I had to add it here. You could also put the QPF in the basin radar DSID and the gage adjustments will not change it.  For flow and stage the date_times are adjusted to your PC's time zone from Zulu for you.

 

Very important update 7/24/2017: The NWS QPF forecast FORECAST-NWS must come before the radar Nexrad-ESX with Nowcasts (and HRR-ESX) in the setup because of the addition of the HRRR 18-hour forecast. See MRMS-HRR below for proper setup sequence.

 

FORECAST-NWS; 11616; Rainfall; Birmingham 3G Adj BASIN Forecast; -2; 33.52, -86.82, 3

  

(Always put temp, humidity, wind, cloud-amount AFTER rainfall forecast so distribution will be read.)

 

Note the Lat,Long. I would use the same Lat,Long for a general area for each subbasin.  The NWS QPF forecast grid width is about 4 miles and updated hourly at hh:45. GetRealtime checks to see if the Lat,Long has changed before re-retrieving the QPF. Place all your QPF's for the same Lat/Long together so the QPF's only get retrieved once. And like GetNexrad which distributes each 6-hour QPF, GetRealtime also distributes as 0=average, 1=Denver 2-hr design storm, 2=Dallas 4-hour, 3=Vegas 5-hour, 4=California 6-hour storm, and 5=etc. You may need to copy the file GetNexrad_6hour_Dist.txt to your GetRealtime folder.  In the example above, 3 means Vegas distribution. You may note that the hourly QPF is the same value for 6 hours, hence 6-hour QPF. GetRealtime detects each new 6-hour starting time and restarts the distribution. The nearest 6-hours may be a little screwy as the distribution assumes it will go for 6 hours when in fact it may reset after 1 hour as a new 6-hour QPF begins.

 

I can see I have a lot to learn about forecasting. The problem is the NWS QPF is for 6 hours. The distributions above work great for periods more than 6 hours out, neglecting peak timing. For the next 1 to possibly 6 hours, those distributions can zero out the forecast just as it should be peaking. So the first 1 to possibly 6 hours of a forecast will be distributed by heavily weighting the first hour as below, reduced each hour, with the distribution not used.  For Vegas the first hour of the next 6 would be 3X, then the 2nd hour 2X, then 1X, then zero.  Distribution 5 and higher will use average for up to the next 6 hours.

 

Storm_Type   FirstHour   NextHourReduction

1-Denver      5      4

2-Dallas      4      2

3-Vegas      3      1

4-California      2      0.4

5-E & Beyond  1     0
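If you want to see the arithmetic, here is a Python sketch of that front-loading table (my reading of the scheme described above, not the program's actual code). Note that every weight set sums to 6, so the total 6-hour QPF depth is preserved:

```python
def front_load_weights(first_hour, reduction):
    """Hourly multipliers for the next 6 hours: start at first_hour,
    drop by reduction each hour, floored at zero."""
    w, weights = first_hour, []
    for _ in range(6):
        weights.append(max(w, 0.0))
        w -= reduction
    return weights

for name, fh, red in [("Denver", 5, 4), ("Dallas", 4, 2),
                      ("Vegas", 3, 1), ("California", 2, 0.4)]:
    print(name, front_load_weights(fh, red))
```

Multiply each hour's average QPF (depth/6) by its weight to get the front-loaded hourly values.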

 

This means when starting the forecast you could be doubling your forecast for periods less than the full 6 hours, so during events you may want to switch to the Average storm type and/or edit the first 6 hours with the Forecast Wizard to account for history, which is very easily done. I'm trying to help. Once the end of the current 6-hour forecast changes, the distributions kick back in. So the next 6 hours is critical and you should learn more than I know about this!  As luck would have it, you can use Run-FORECAST-NWS (instead of FORECAST-NWS) to save each forecast in the M-Tables, from which you can learn something.

 

The above 2-hour distribution was taken from Denver, CO's Drainage Manual (noaa atlas) and adjusted to 1 inch.  The others are actual storms.  Also, the 5-minute values are capped at 6 in/hr and a beep will sound if check 'Beep On'.

 

What I've found so far: The NWS forecasts are for 6-hour periods beginning 0:00, 6:00, 12:00, and 18:00 local time. Sometimes they break them up into 3-hour periods at storm start or end. You definitely want to use a storm distribution beyond the first 6 hours, and you probably want to use it for at least the first 1 or 2 hours with history.

Make note of the peak runoff and keep that peak in mind as you go through the next 6 hours. Don't get faked out and think you are way overshooting, because these NWS forecasts are damn good! Look at the regional radar mosaic 2-hour loop and point rainfalls upwind with GetNexrad. But in the end the original 6-hour forecast with storm distribution will probably give the closest peak runoff, so err on the high side until the closing of the 0, 6, 12, and 18 times above.

Use the Forecast Wizard forecast edit to keep the original 6-hour depth until you know the radar is changing to the high side, or reduce the last 3 or 4 hours if needed to keep things in line with the radar mosaic. You could use the 0-Average storm distribution and manually edit it so you know where you are at each hour, but don't use it as a real forecast. If you use a 0-Average distribution, then look at my front-loading methods above and adjust for history.

When you screw up, just get the forecast again and start over. History will show how much of the forecast is left to expect based on the 0, 6, 12, 18 periods. And use a Run-Forecast on the first sub and Run-Compute or Run-Route to save all your messes for posterity.

 

http://www.hpc.ncep.noaa.gov/html/fam2.shtml#qpf

 

From the Forecast Wizard Setup you can edit the next 6 hours of forecast manually. Here is a list of ways to change forecasts:

0) 1, 2, 3 hour automated Nowcasts using winds aloft (GetRealtime).

1) Storm distribution of NWS.

2) Time shift.

3) 5-minute Forecast value adjust as P1*x.

4) 1+ hour storm track forecast (GetNexrad).

5) 6 hour QPF of any amount starting any time (GetNexrad).

6) A wundergage up wind with a time shift???

7) Edit next 6 hours with Forecast Wizard.

8) Use the Forecast Wizard Edit 6 hours to copy the first forecast 6 hours to all other forecasts.

9) Forecast Overwrite checkbox on Scheduled Batch menu overwrites all edits with NWS Forecast.

 

And remember to use Run-FORECAST-NWS to save forecasts to the 'M' tables for post event analysis of where you went wrong and use GetGraph's menu 'Add Trace' to compare.  To view the previous RUN on GetGraphs startup automatically, add ',RUN' to your GetGraphs_setup.txt line for that DSID:

 

 4 ;-11301 ,-10301 , Run ;Rainfall; Sub1 Adj & UnAdj & Last Forecast; 0 ; .....

 

Note that when running 2 or 3 separate GetRealtime.exe's in batch mode, stagger their start times by 3 or 4 minutes to ensure they don't collide updating the GetAccess HDB database.

 

You can run Runoff, Routings, HEC-Hms, and HEC-Ras for the QPF's additional future days. Besides precip, FORECAST-NWS can also be used for SNOWMELT comps based on the datatype_id for temp, humidity, wind speed (and some other things).  Always list the precip QPF first if using a re-distribution from above or it has to retrieve the QPF twice.

 

  

MRMS Multi Radar Multi Sensor products from NCEP:  A2M  and  HRRR.

Both A2M and HRRR products are processed by Iowa Mesonet into ConUS images and archived. Downloaded images use Iowa's map-cutting web service for smaller areas of interest (ConUS, 10 regional, each radar site).

 

A2M is a 1-km 2-minute rainfall rate with 255 values. The setup Station_ID is NEXRAD-ESX (for ESX radar). The table rsite parameter_code is A2M-Ridge2 with MRMS-ESX station_id. Rain rates will be converted and saved as 5-minute rainfall increments on a 1-km grid for the subbasin boundary (same as Nexrad-NCQ boundaries). Level3RGB2RainRateA2m.txt is the rainfall rate color conversion table.

 

HRRR is a 1-km 15-minute 18-hour FORECAST of reflectivity with 255 values (same colors as N0Q) but archived by Iowa as a 2-km grid. The setup site_name is HRR-ESX (for ESX radar). Rain rate reflectivity will be downloaded and saved as 5-minute rainfall for the subbasin boundary (same as Nexrad-NCQ boundaries).  HRRR forecasts can also be downloaded using the SpotWx data source as values instead of images and is much faster.

 

Both A2M and HRRR are processed just like my Nexrad-NCQ images using the same subbasin and gage boundary and point files, so if you have created NCQ boundaries then nothing more is needed. Use product code NCQ Ridge2 with LatLongPixels.exe and LatLongPixelsFromFile.exe when creating boundaries. See GetNexradHelp for NCQ boundaries.  Both radars are adjusted to your PC's time zone, unlike the NWS forecast.

 

You do not need to use A2M in order to use HRRR forecasts. In fact, A2M is probably only wanted for locations over 100 (70 winter) miles from the radar, in mountainous terrain, or if you have no rain gages. HRRR forecasts can be compared with my Nowcasts and NWS 6-hour forecasts. I'm just shortcutting on this help write-up as I am implementing both in code.

 

GetRealtime_setup.txt example lines:

A2M-ESX; -10361; Radar Rain; Burro Cr Sub 1 Rainfall QPE

 

HRR-ESX; -11361; Forecast Rain; Burro Cr Sub 1 Forecast Reflectivity; 0; 6

Where ending 6 can be 1 to 18 hours of the available HRRR forecast period.

 

Table 'Rsite' A2M line:

361 -10361 Burro Creek Sub 1, AZ 10 rainfall in inches A2M-Ridge2 MRMS-ESX 3 AZ

 

Table 'Rsite' HRRR line uses SAME LINE AS ADJUSTED RADAR DSID:

  361 -11361 Burro Creek Sub 1 ESX Radar, AZ 11 rainfall in inches 31347,-10361 COMPUTE 3 AZ

 

I haven't prioritized the HRRR forecast source code 24 to overwrite or not overwrite the Nowcast 14 or FORECAST-NWS 18 source codes, so you will have to put the HRR lines in the correct setup order: HRRR AFTER FORECAST-NWS. Source 14 overwriting takes place during G/R ratio adjustment of raw radar to adjusted radar rainfall and is last.

  

To clear all the Nowcasts and HRRR forecasts, use the 'Overwrite Forecasts' on the 'Scheduled Batch' menu and run the FORECAST-NWS.  Now you can rerun to add the Nowcasts and HRRR forecasts and use the Wizards Forecast editor if needed.

 

So I would order my 3 forecast sources like:

FORECAST-NWS; -11361; Forecast Rain; Burro Cr Sub 1; 0; 34.81,-113.17,5

A2M-ESX; -10361; Radar Rain; Burro Cr Sub 1 QPE;0; Nowcast 3, adjust type 3

HRR-ESX; -11361; Forecast Rain; Burro Cr Sub 1 Forecast Reflectivity; 0; 6

Averages and Ratios...

COMPUTE; -11361; Rainfall; Burro Cr Sub 1 Adjusted ESX Radar, AZ; 0; -1, 1.0, 0.3, 0.0; P1*P2

  

GOOD ORDER:

FORECAST-NWS; -11101; Forecast Rain; Sub 1; 0; 34.8044,-85.0242,3

FORECAST-NWS; -11102; Forecast Rain; Sub 2; 0; 34.8044,-85.0242,3

 

NEXRAD-HTX; -10101; Radar Rain; Sub 1; 0; Nowcast 3, adjust type 3

NEXRAD-HTX; -10102; Radar Rain; Sub 2; 0; Nowcast 3, adjust type 3

 

HRR-HTX; -11101; Forecast Rain; Sub 1; 0; 12

HRR-HTX; -11102; Forecast Rain; Sub 2; 0; 12

 

COMPUTE;  -11101; Rainfall; Sub 1 Adjusted Radar, GA; 0; -1, 1.0, 0.3, 0.0; P1*P2

COMPUTE;  -11102; Rainfall; Sub 2 Adjusted Radar, GA; 0; -1, 1.0, 0.3, 0.0; P1*P2

 

  

 

LOOKUP (Update 2/11/2015): The LOOKUP command will read a period of record from the HDB database and interpolate values from a rating table. A tab delimited rating table for FLOW and Gage-Height might look like this:

 

To convert a period of unit flows from the HDB a GetRealtime.exe setup line could look like this:

Lookup-unit-log-log; 2805; G.H.; Nashville Routed Subs

 

The HDB rtable line could look like this:

 

LOOKUP-Unit-Log-Log will convert the rating values to logs before interpolating the log value of the lookup.

LOOKUP-Unit-Linear-Log is lookup linear flows but with log G.H.

LOOKUP-Unit-Log-Linear is lookup log flows but with linear G.H.

LOOKUP-Unit is linear flows and linear G.H.
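Here is a Python sketch of what those four variants boil down to (an illustration, not GetRealtime's code): optionally take logs of either axis, interpolate linearly, and convert back. The two-row rating table in the example is made up:

```python
import math

def rating_lookup(x, xs, ys, log_x=False, log_y=False):
    """Interpolate y at x from an ascending rating table (xs, ys).

    log_x/log_y mirror the Log-Log, Linear-Log, Log-Linear, and
    plain-linear LOOKUP variants.
    """
    tx = [math.log10(v) if log_x else v for v in xs]
    ty = [math.log10(v) if log_y else v for v in ys]
    xv = math.log10(x) if log_x else x
    for i in range(len(tx) - 1):
        if tx[i] <= xv <= tx[i + 1]:
            frac = (xv - tx[i]) / (tx[i + 1] - tx[i])
            yv = ty[i] + frac * (ty[i + 1] - ty[i])
            return 10 ** yv if log_y else yv
    raise ValueError("x outside rating table")

# Linear-Linear: 150 cfs lands halfway between the rows -> G.H. 5.0
print(rating_lookup(150, [100, 200], [2.0, 8.0]))
# Log-Log: geometric interpolation gives 4.5 instead
print(rating_lookup(150, [100, 200], [2.0, 8.0], log_x=True, log_y=True))
```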

 

You can use my USGSrating2lookup.exe program to convert USGS Gh Shift Q ratings to lookup Q - GH files saved from here:

 http://waterdata.usgs.gov/nwisweb/data/ratings/exsa/USGS.09424447.exsa.rdb

 

  

Update 07/06/2015: Station_ID= IF

You can use the Station_id 'IF' in GetRealtime_setup.txt to select PARAMeters (routing filename) or to SKIP (or NOSKIP) the next N stations on your setup, like this:

 

 IF; -1331; Flow Check; Sub 11 Flow Check For RAS Runs; rdaymax; SKIP-34; P1<1000

 

For this setup the next 34 stations on the setup list will be skipped if the maximum value in table rdaymax for the run period is less than 1000. The formula 'P1<1000' can be anything that evaluates to True or False. If table 'rdaymax' is used then SQL retrieval for 'P1' is automatically:  SELECT MAX(value) AS maxv FROM...   This would be handy to skip a HEC-Ras unsteady flow and mapping if the flow was not high enough to warrant it.

 

If table rdaymax is not used, then instead of the max for the full run period, only the first value for the period is used from that rtable. This could be helpful for running different runoff setups like unit graph type or interception for very dry base flow conditions:

IF; -1300; Flow Check; Routed Base Flow for dry setups; rhour; SKIP-4; P1<80

 

But in this case you will need another 'IF' in case the first 'IF' is false:

IF; -1300; Flow Check; Routed Base Flow; rhour; SKIP-4; P1<80

COMPUTE-unit; -30301; Runoff; Oneida Cr Sub1, NY...

COMPUTE-unit; -30302; Runoff; Oneida Cr Sub2, NY...

************; ************; ************; *****Base Flow<80******

IF; -1300; Flow Check; Routed Base Flow; rhour; SKIP-2; P1>=80

COMPUTE-unit; -30301; Runoff; Oneida Cr Sub1, NY...

COMPUTE-unit; -30302; Runoff; Oneida Cr Sub2, NY...

 

To reverse the logic, so the 2 stations are skipped unless the test is true, use NOSKIP-2.  Sometimes it's much easier to write a true logic than a false logic. For instance, ALWAYS skip unless the day is Monday, the hour is 7, and the minute is less than 5:

IF; 0; Date Check; Check Date Time is True; rday; NOSKIP-1; (Weekday(D)=2) and (Hour(D)=7) and (Minute(D)<5)
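The IF/SKIP-N walk over the setup list can be pictured with a toy Python sketch (my reconstruction of the behavior described above, not GetRealtime code; it only handles the 'P1&lt;limit' form of the formula):

```python
def apply_if_skips(lines, p1_values):
    """lines: setup lines; p1_values: the value fetched for each IF line."""
    out, i = [], 0
    while i < len(lines):
        line = lines[i]
        if line.startswith("IF;"):
            fields = [f.strip() for f in line.split(";")]
            n = int(fields[5].split("-")[1])          # SKIP-N field
            p1 = p1_values.pop(0)
            limit = float(fields[6].replace("P1<", ""))  # crude 'P1<x' check
            if p1 < limit:
                i += n                                 # skip next N stations
        else:
            out.append(line)
        i += 1
    return out

setup = ["IF; -1331; Flow Check; Skip RAS; rdaymax; SKIP-2; P1<1000",
         "SHELL-ras...", "COMPUTE-Get...", "COMPUTE; keep me"]
print(apply_if_skips(setup, [500]))    # low flow: the 2 RAS lines skipped
```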

 

For very dry basin conditions try raising the SCS initial abstraction from 0.2 to 0.4 with your 'IF'. Raising interception will reduce groundwater, so that might not be a good choice, while raising IA will increase groundwater.

 

To select a routing filename from the rtable Parameter_Code string (f1.txt,f2.txt,f3.txt), use the PARAM code like this for filenames other than the default 1:

 

IF; 0; Date Check; Check Month For Route Filename; rday; PARAM-2; month(D)=3

IF; 0; Date Check; Check Month For Route Filename; rday; PARAM-3; month(D)=4

IF; 0; Date Check; Check Month For Route Filename; rday; PARAM-4; month(D)>9

ROUTE-unit; -1110; Total Release; BBQ Dam Modpul Total Outflow

 

Note no dsid is needed if only checking the date D.  D is the current date and time.

 

To check if an Alert message was issued use 'alert' as the dsid:

IF; Alert; Flow Check; Was Alert Sent; 0; NOSKIP-5; 0

 

 

Update 11/07/2015: Station_ID= SET-INTERVAL

You can use the Station_id 'SET' in GetRealtime_setup.txt to set and reset the Batch minutes between retrievals like this:

 

SET-INTERVAL; -11611; Rainfall; Village Cr Big Sub Adj & Forecast; -60,runit; 15; P1>.001

 

For this setup the Batch minutes between retrievals will be set to 15 minutes if the maximum unit rainfall in rtable runit for dsid -11611 over the past 60 minutes is greater than 0.001 inches, otherwise it will be reset to the original Batch interval.

 

This is handy to only do run retrievals once an hour; then if there is rain for the current run, the next run is set to be in 15 minutes instead of 60. You will want to place the SET line below the current retrieval for that dsid so the current data will be available for checking.

 

The example here uses -60 look back, but you could do a +60 look ahead in table runit, or +1 in rhour. It only checks the maximum value in the rtable for the period from Now.

 

If you would like to go to even a more frequent batch run as the rainfall increases, then add another batch time and function test to the SET line like this:

 

SET-INTERVAL; -11611; Rainfall; Village Cr Big Sub Adj & Forecast; -60,runit; 15; P1>.001; 0; 5; P1>.2

 

The date of the next run will be written to table rupdate field 'value' for sdid = -1. If you do a shell below the SET-INTERVAL, like SHELL-GetFromAccessNow, the next run date will be available.
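The two-threshold interval choice above can be sketched in a few lines of Python (next_interval is a hypothetical helper showing the logic, not the program's code; thresholds match the example line):

```python
def next_interval(max_rain, default=60, tests=((0.2, 5), (0.001, 15))):
    """Pick the next batch interval in minutes from the max rainfall
    found in the lookback period.  Tests are checked highest threshold
    first: heavy rain -> 5 min, any rain -> 15 min, else the default."""
    for threshold, minutes in tests:
        if max_rain > threshold:
            return minutes
    return default

print(next_interval(0.0))    # dry: stay at the original 60-minute batch
print(next_interval(0.05))   # some rain: 15
print(next_interval(0.3))    # heavy rain: 5
```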

 

 

Rain Gage Wind Speed Adjustment (update 6/10/2014):

 

Dr. David C. Curtis, DC Consulting, Folsom, California, referencing available scientific literature concluded that a reasonable under catch estimate for unshielded ALERT tipping bucket rain gauges is one-percent per mile-per-hour of wind speed occurring at the height of the rain collector orifice for wind speeds of 30 mph or less. For higher wind velocities, the effects are more uncertain.

 

More info on rain gage siting and adjustments:

 

 http://tulliemet-perthshire.org.uk/main/images/InconsistentRainGageRecords.pdf

 

GetRealtime can adjust the 5-minute rainfall record by the formula P1*(1+P2/100), where P1 is the 5-minute rainfall and P2 is the 5-minute wind speed. For a Wundergage it can be adjusted on download with a GetRealtime_setup.txt line like the one below.

 

It has been pointed out to me that P1/(1-P2/100) should be used:

     With P1*(1+P2/100)

     Precip = 0.85 * (1 + 15/100) = 0.98

 

     With P1/(1-P2/100)

     Precip = 0.85 / (1-15/100) = 1.00

 

KALBIRMI45;-9619; Rainfall Adj; Nucor Steel Wundergage; 0; 0; P1/(1-P2/100)

 

and the GetAccess table rsite for dsid -9619 as:

datatype_id= 10

parameter_code = dailyrainin,WindSpeedMPH

 

The rainfall need not be stored before adjustment, but it could also be done that way with 'Compute-unit'.
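The divide form of the under-catch correction works out like this in Python (a sketch only; adjust_undercatch is my own name for it):

```python
def adjust_undercatch(rain, wind_mph):
    """Divide form P1/(1 - P2/100): 1% under catch per mph of wind,
    reasonable for unshielded tipping buckets below about 30 mph."""
    return rain / (1 - wind_mph / 100.0)

# 0.85" of gage catch in a 15 mph wind works back to about 1.00":
print(round(adjust_undercatch(0.85, 15), 2))   # 1.0
```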

 

 

 

Additional menu options available when window maximized:

 

 

Additional menu items when 'Scheduled Batch' is checked:

Uncheck 'Scheduled Batch' after setting options to run normally.

 

 

Metric Units (maximize window and set checkbox and save setup):

1--Radar rain saved as millimeters.

2--NWS-Forecast rain millimeters, Temp C, Dewpoint C, Wind KMH

3--Runoff input rain and ETo (if supplied) as mm's; saved as m^3/s.

4--Modpul routing rating table as m, m^3/S and m^3.

5--Snowmelt inputs C, KMH, watts/m^2, mm; output mm's.

6--Radiation and ET inputs C, KMH; output watts/m^2, mm.

 

What I found needs changed from English to Metric:

1--Wunderground metric is set with Temp C on the Wunderground website using Internet Explorer (not Chrome).

2--Gage/Radar ratio hail cap raised from 0.5" to 12.7mm.

3--Runoff coefficients: NO CHANGES!!! Base flow, area, and losses must stay in cfs, sq mi, and inches.

4--Modified Puls rating columns are M, CMS, M^3 (ft*0.3048, cfs*0.028317, af*1233.48)

5--Convert GetGraph's rainfall frequency tables, GetGraphs_Frequency.txt, to millimeters and use log Y-axis scaling.
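The conversion factors listed above can be collected into one place. This is an illustrative sketch under my own names, not GetRealtime's internals; the constants are the ones given in item 4 and the hail-cap note.

```python
# Sketch of the English-to-metric conversions listed above
# (constant and function names are mine, not GetRealtime's).

FT_TO_M = 0.3048        # stage: feet -> meters
CFS_TO_CMS = 0.028317   # flow: cfs -> cubic meters per second
AF_TO_M3 = 1233.48      # storage: acre-feet -> cubic meters
IN_TO_MM = 25.4         # rainfall: inches -> millimeters

def rating_row_to_metric(stage_ft, flow_cfs, storage_af):
    """Convert one Modified Puls rating-table row to m, m^3/s, m^3."""
    return (stage_ft * FT_TO_M, flow_cfs * CFS_TO_CMS, storage_af * AF_TO_M3)

# The gage/radar ratio hail cap: 0.5 inch becomes 12.7 mm
print(0.5 * IN_TO_MM)  # 12.7
```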

 

I have not done much testing of the metric units, so be careful and let me know if you spot a problem... better yet, learn to speak English, comrade!

 

 

GetRealtime.exe Configurations and Setups:

Update 1/25/2014--If you have made it this far, then it is time to get organized. Here is a 15-subbasin setup example using Excel that puts some order to all this. Another tip of the hat to David in Alabama.  GetRealtime and GetGraphs setup files can now be pasted from Excel in tab-delimited text format.

 

Optional ways to run GetRealtime and arrange the setup file for mixing retrievals:

 

1) To SKIP a station, include an '*' in the first cell (Station Code) of its setup line.

 

2) Move your unused stations below the END line (e.g., snow stations out of season).

 

3) Put all the runoff computations at the bottom and use a selection on the upper stations.

 

4) What I do, create a separate installation for:

   a) Wunder gages, run daily for 1 day.

   b) Runoff, run for as long as and whenever needed.

   c) Radar, auto continuously every 1/2 hour (I am comparing old N1P) or more often as needed.

 

5) With one installation, run auto continuous radar with a selection, then daily or as wanted start up GetRealtime.exe again and select just the dailies. (GetAccess won't let you update the same table, like runit, at the same time, so check the auto radar's last run time by hovering your mouse over it.)
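Putting options 1 and 2 together, a setup file might be laid out like the hypothetical fragment below. Only the KALBIRMI45 line is from this page; the other station codes, dsids, and field contents are made up for illustration.

```
*KXYZMESO1;-9001; Rainfall; Skipped station (note the leading '*')
KALBIRMI45;-9619; Rainfall Adj; Nucor Steel Wundergage; 0; 0; P1/(1-P2/100)
(runoff computation lines grouped at the bottom, for use with a selection)
END
SNOWSITE1;-9002; Snow; Out-of-season station parked below END
```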

 

The files are WHERE?!?!? 

GetRealtime.exe needs the point file (and boundary file) in its folder. I set up my GetNexrad.exe to save its radar images in the GetRealtime folder that does all my radar retrievals. That way GetNexrad and GetRealtime share the same boundary and point files.

 

If you don't want to go to the trouble of building an Excel workbook to construct your GetRealtime_setup.txt file and table Rsite, then build your GetRealtime_setup.txt file first. Then you can select each line and add most of the needed info to table Rsite. Also, some Wizards will add all the needed info to table Rsite based on the setup line.

 

I set up a separate GetRealtime.exe and folder for Wundergages and computations. Update: not sure why anymore.

 

I don't do anything with C:\Program Files\. No files are registered anywhere but the System.

 

You can install them over and over in C:\Program Files\ and then copy what is new to all your working folders. If you want to stay in C:\Program Files\ then just have GetNexrad save in the GetRealtime folder where all the boundary and point files will be.

 

Maybe you can think of a better way for this like:

C:\GetRealtime\GetRealtime (w/boundary, point files)

C:\GetRealtime\GetNexrad (save images in GetRealtime)

C:\GetRealtime\GetGraphs (with GetFromAccessNOW.exe FTP automation)

C:\GetRealtime\GetAccess (the MDB file can be anywhere if shared but usually here)

 

To have GetNexrad use the GetRealtime folder, select any image from the GetRealtime folder, even if it is just a legend GIF for now. Then 'Save Setup' and GetNexrad will use the GetRealtime folder from then on.

And radar need not be retrieved when there is no rainfall. Runoff computations will fill in zero rainfall for missing periods of rainfall in the 5-minute record.

 


 

  

That is all....

And again, it's important that the MS Access HDB database be routinely 'Compacted and Repaired' to speed up downloads. The size of the MS Access database doesn't seem to matter as much as how often data is being updated. I retrieve about 100 stations every half hour with 5-minute unit values (Nexrad), and I find the download time will double in about 2 to 3 days, so I 'Compact and Repair' using the GetAccess.exe 'Connection' button every day or two... so be smart like me. Look at your 'GetRealtime_errlog.txt' file to see how your download times vary over time. Also see the 'Defragment or Compact First?' topic near the top of the GetAccess section. GetAccess 2.1.3 will now check for fragmentation of the MDB database file, display it on the window caption, and defragment it if wanted. After further review I think all this must be an MS Access urban legend: I have not seen any change in download times. But it is still a good idea to do it anyway, and be smart like me so you will have a backup of your database. And get a $10 USB memory stick!

 

My MS Access database is 200 MB with about one year's worth of data. Wikipedia says MS Access 2003 can handle 2 GB, or about 10 more years. We shall see. MS Access 2010 still uses the 2003 file format. One option to reduce the database size is to save off the database file every year, then delete all the unit values and keep going, so historical unit values would be available in separate files if ever really wanted. (As if Windows will never die... remember DOS?)

 

And my term 'HDB' stands for something I acquired from a bygone era: Hydrologic Data Base. I should use the MS Access file name MDB... which stands for... MY Hydrologic Data Base... or something.

 

Good luck … and with any luck…

Rainy days are here again!!!

END

 

Some additional page links about ET and Nexrad Radar help and comparisons:

List of How To Videos on Youtube
Help Page for GetNexrad.exe
Nexrad Rainfall to Tipping Bucket Comparison
ET and Radar Rainfall along the Lower Colorado River, AZ-CA
Nexrad vs Tipping Bucket and  Rainfall-Runoff Comparison Blue Ridge Mtns, NC
Nexrad Rainfall-Runoff Comparison Las Vegas Valley, NV
Nexrad Rainfall-Runoff Comparison San Joaquin Valley, CA
Nexrad Rainfall-Runoff Comparisons in northwestern Arizona
Nexrad Snowfall Comparisons in western central Sierras, CA and Fargo, ND

 



More Free Downloads

 

Comments/Questions

Email: carson@getmyrealtime.com

 

 