
AN ANALYSIS OF TERRAIN ROUGHNESS: GENERATING A GIS APPLICATION FOR PRESCRIBED BURNING

by Matthew A. Crawford, B.S.

A Thesis In Range Science

Submitted to the Graduate Faculty of Texas Tech University in Partial Fulfillment of the Requirements for the Degree of MASTER OF SCIENCE

Approved

DR. ERNEST B. FISH Co-Chairperson of Thesis Committee

DR. CARLTON M. BRITTON Co-Chairperson of Thesis Committee

DR. KEVIN R. MULLIGAN

DR. FRED HARTMEISTER Dean of the Graduate School

May 2008

Copyright 2008, Matthew Crawford

Texas Tech University, Matthew Crawford, May 2008

ACKNOWLEDGEMENTS


I would like to thank Dr. Ernest B. Fish, Dr. Carlton M. Britton, and Dr. Kevin R. Mulligan for serving as my committee members. Throughout this project you have provided constant support, assistance, and, perhaps most appreciated, encouragement. I would like to extend a special thank you to Dr. Ernest Fish for giving me the opportunity to take on this project and complete the requirements for my Master of Science. Dr. Fish, as my committee chairman and employer, you have been an inspiration to me. Not only were you essential as my guide during this project, you have served as a mentor and a friend. I am very grateful for all of the advice and patience you have given me over these few years. Thank you very much.

This project could not have been completed without the great assistance, advice, jokes, and encouragement that I had from the faculty and staff in the Department of Natural Resources Management. I would also like to thank the students (graduate and undergraduate), Tyler Hawkins, Kiran Masapari, Matthew Butler, Mindy Rice, Susan Rupp, Michael Howard, and Matthew Tredennick for their assistance in brainstorming, collecting data, and computer programming.

Most of all I would like to thank my wife, Stephany, and two beautiful children, Jessica and Jacob. They are the most important people in my life. My family has kept me going over the years; when I was discouraged and wanted to give up at times, they assured me that I would get finished one day. They have always been a comforting, loving, and encouraging presence in my life.


TABLE OF CONTENTS

ACKNOWLEDGEMENTS
ABSTRACT
LIST OF FIGURES
I. INTRODUCTION
   A. BACKGROUND
   B. PURPOSE
   C. SCOPE
II. LITERATURE REVIEW
   A. TERRAIN ROUGHNESS
   B. TRAFFICABILITY
   C. LEAST-COST PATH
   D. SUMMARY
III. METHODS
   A. STUDY AREA
   B. OBJECTIVES
   C. ANALYSIS
IV. RESULTS
   A. INTRODUCTION
   B. SCALE RESULTS
   C. STUDY RESULTS
   D. SUMMARY


V. CONCLUSION
   A. INTRODUCTION
   B. DISCUSSION
   C. SUMMARY
LITERATURE CITED
APPENDICES
   A: PROCEDURE
   B: TRI.AML
   C: TRI.VB


ABSTRACT

Prescribed burning is a technique used to rejuvenate pastures by enhancing wildlife habitat, controlling brush, and removing old growth. The technique has become a science and has been practiced for decades to model naturally occurring fire regimes. Planning a prescribed burn is a detailed and careful process that requires a great deal of time and preparation. This study presents a procedure that will greatly reduce the amount of time and money spent in planning a burn.

Fire lines for prescribed burning in Texas are typically located along pasture fences. In rough, hilly terrain this results in fire lines traversing steep slopes and deep canyons, which greatly increases expense and poses hazardous conditions for personnel. By combining the power of technology with the knowledge of a burn expert, an innovative approach to fire line location may be developed, using existing programs to build a model that predicts the smoothest and most suitable path for fire lines.

In order to accomplish this task, several steps were taken. First, an application was found that provides the desired algorithm to calculate a roughness surface from a digital elevation model. The roughness surface is then classified by a newly suggested classification index. An application has been developed to use the roughness surface to obtain isoline locations for the burn area. The Rocker Ranch, located on the edge of the Llano Estacado escarpment in Borden County, Texas, provided an excellent area for testing the model. The fire lines are based on the isolines, which represent the smoothest route within the designated area. This application will optimize the fire line planning process for prescribed burning by saving time and money.

LIST OF FIGURES

3.1 Map of the location of the Rocker Ranch, in Borden County, TX
3.2 Photograph of the study site, showing vegetation and topography
3.3 A USGS topographic map of the study area
4.1 Map of Terrain Ruggedness Index surface with 100-meter cell size categorized by Riley et al. (1999)
4.2 Map of Terrain Ruggedness Index surface with 1-kilometer cell size categorized by Riley et al. (1999)
4.3 Map of Terrain Ruggedness Index surface with 10-kilometer cell size categorized by Riley et al. (1999)
4.4 Map of Borden County digital elevation model
4.5 Terrain Ruggedness Index analysis run in Arc/INFO Workstation
4.6 Map of Borden County Terrain Ruggedness Index surface categorized and symbolized according to Table 4.1
4.7 Map of Rocker Ranch study area Terrain Ruggedness Index surface categorized and symbolized according to Table 4.1
4.8 Map of isolines created in ArcGIS with the Terrain Ruggedness Index surface for the Rocker Ranch study area
4.9 Map of isolines selection based on the first class of index values, 0.00 ft to 3.00 ft, on the Rocker Ranch study area
4.10 Map of preferred isolines based on a cost-effective decision for the Rocker Ranch study area


4.11 Map of preferred isolines, ranch boundary, and NAIP image exported for field use
4.12 Map of final fire line location modified after field trip
4.13 Map of final fire lines exported to GPS receiver for navigational purposes


CHAPTER I
INTRODUCTION

Background

There is an abundance of models for terrain analysis, yet many of these are designed to do a specific task, such as calculating terrain roughness, determining trafficability through a particular terrain, or finding the least-cost route from one point to another. This study will attempt to combine various aspects of these models into a single geographic information system (GIS) application. This application will use digital models to calculate terrain roughness as a factor for use in predicting several possible least-cost fire line paths to assist in decision-making for prescribed burn planning.

Prescribed burning has been a common practice for many years. Locating fire lines has traditionally been an easy task, as they were simply placed along pasture fences. Installing black lines to contain the main head fire along those fences, however, has always been a difficult, time-consuming, and dangerous undertaking. The objective is to use a combination of existing technologies to plan and locate a least-cost, smooth fire line path around a burn unit.

Prescribed burning is a science in itself. There are several variables that contribute to a final decision about a fire line. These include line widths, wind direction and speed, the type of fuel being burned, and time of year. Determining where a fire line is placed and its width has always been a manual procedure, which can take a considerable amount of time. In order to expedite this practice, this least-cost model will automate this part of the planning procedure, ultimately saving time and money.

There are many types of firebreaks used in prescribed burning. Mineral lines and black lines are among the most common. Roads, water bodies, and fire retardants are also used for controlling fires. A mineral line is a strip of land bulldozed so that the mineral soil is exposed and any fuel is removed. It is usually only 8 to 10 feet wide, about the width of a bulldozer's blade. Black lines are buffer zones burned into the wind, where fuels have been burned off along the inside of the mineral line. These black lines vary in width from 100 to 700 feet, depending on the conditions of the prescribed burn.

Terrain analysis refers to the symbolization and representation of the land surface in an area or region. It is an umbrella term used to categorize several different forms of terrain interpretation, including terrain roughness, slope analysis, and aspect analysis. Most often, terrain analysis deals with the interpretation and/or manipulation of a digital elevation model (DEM). A DEM is a digital grid surface representation of elevation data, usually at 30-meter resolution. Each grid cell contains a value which is compiled into a database and can be built into a 3-D surface. There are several uses for a DEM, the most common of which are to calculate slope and aspect of an area. Slope is usually expressed as the degree of angle at which a surface inclines or declines. Aspect is a measure of the direction in which the sloped area of the land surface faces.

Terrain roughness has long been a useful topic of study. Most of the models for this type of analysis operate at a micro-terrain level, a scale which is too detailed for this study. There are three principles on which this study is based: terrain roughness, terrain trafficability, and least-cost path. These three principles are found in almost every terrain analysis study. In this study, they are all intertwined and should build upon each other in order to find the best route across any area. This new procedure will generate a digital model of an area and draw the best possible fire lines around the perimeter of that area. This process should become a useful and simple tool for collecting information and quickly determining the correct management scheme.

Trafficability is related to the other two types of analysis because it calculates the mobility of a vehicle through the terrain. It incorporates terrain roughness and a least-cost analysis to calculate transportation routes. This is one of the main focuses of the military in terrain analysis. They use models to determine their ability to move through an area, specifically for tactical maneuvers. Trafficability has relevance to this study because in order to successfully create fire lines, a bulldozer or tractor must be able to navigate across the terrain in a region.

There are many applications for finding a least-cost path. There are even some designed to calculate terrain indices like the ones proposed in this research. One of these calculates least-cost corridors between habitats (Walker and Craighead 1997). This study will evaluate the corridor model as well as several other models for similarities with the proposed model. The procedure developed here is based on principles drawn from several of these applications.
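The slope and aspect measures described earlier can be sketched as a short pass over a DEM grid. This is a minimal illustration in pure Python, not the algorithm used by any particular GIS package: it uses simple central differences, whereas packages such as ArcGIS use Horn's eight-neighbor method, and the function name and grid layout are assumptions for the example.

```python
import math

def slope_aspect(dem, cellsize):
    """Slope (degrees) and aspect (compass bearing of the downhill
    direction, degrees clockwise from north) for each interior cell of
    a DEM stored as a list of rows, with row 0 on the northern edge.
    Uses simple central differences; edge cells are left as None."""
    rows, cols = len(dem), len(dem[0])
    slope = [[None] * cols for _ in range(rows)]
    aspect = [[None] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            # Elevation change per unit distance toward east and north.
            dz_dx = (dem[r][c + 1] - dem[r][c - 1]) / (2 * cellsize)
            dz_dy = (dem[r - 1][c] - dem[r + 1][c]) / (2 * cellsize)
            slope[r][c] = math.degrees(math.atan(math.hypot(dz_dx, dz_dy)))
            # Downhill vector is the negated gradient; convert to a
            # compass bearing measured clockwise from north.
            aspect[r][c] = math.degrees(math.atan2(-dz_dx, -dz_dy)) % 360
    return slope, aspect
```

On a plane that rises toward the east by one unit per unit of cell size, every interior cell reports a 45-degree slope and a west-facing aspect of 270 degrees.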

Purpose

The purpose of this study is to create a usable and time-conserving means to aid in the planning of fire lines. The model will generate possible fire lines around the perimeter of a proposed site. Generating perimeter lines will allow this analysis tool to be used virtually anywhere. As wind directions vary across the country, the placement of the fire lines is essential for a successful burn.

Scope

This study incorporates previous unpublished work done in Texas Tech University's Department of Natural Resources Management. It also utilizes the principles mentioned above. The procedure will extract elevation values from a DEM and pass them through a series of algorithms, called the Terrain Ruggedness Index (TRI), to provide an array of indices. The end result is a new raster containing the indices, which are used to establish a scale on which to base terrain roughness for the exclusive purposes of prescribed burning. This analysis of terrain roughness will become a new cost surface. The cost surface will be analyzed and processed to provide isolines producing a series of least-cost paths. The paths will determine the most cost-effective locations to create a fire line for prescribed burning.

Each step of this analysis builds on information and principles developed in the previous one. For example, in order to calculate trafficability, the process must include information about terrain roughness. This is also true of least-cost planning, which must include information about trafficability and terrain roughness to be accurately calculated. The process will become an indispensable tool in finding a simple solution for collecting information and quickly developing useful management alternatives.

CHAPTER II
LITERATURE REVIEW

Terrain Roughness

Terrain roughness is a term used to describe how irregular the surface of an area is. It is frequently associated with trafficability, in terms of path planning. This type of analysis has been used for military mapping and rover exploration on Mars. In many cases the process of determining terrain roughness (or smoothness) is very complex. There are several examples that attempt to simplify this type of analysis, but many times the models created are somewhat incomplete. These models often omit an important step or variable that could lead to unexpected or misleading results. They are also created for a specific purpose, and therefore are typically not flexible.

In an attempt to get away from conventional surveying equipment, Roma and Herman (1964) conducted a study to experiment with an alternative method for calculating terrain roughness. This experimental technique to profile micro-terrain was composed of cumbersome equipment and bulky computer devices (though considered very high-tech at that time). The equipment used a terrain profilometer attached to a Jeep, which took readings while the Jeep traversed the terrain. The Jeep could be driven over rugged terrain with up to sixty percent grades and thirty percent side slopes. The profilometer was capable of calculating terrain features to within an inch.

There is an obvious desire for a better system of calculating terrain roughness. Several models are available for processing terrain roughness calculations, most of which are not designed to accurately calculate values that encompass all of the variables needed for such a complex analysis. Hoffman and Krotkov (1989) give criteria for a desired outcome of values, which must differentiate between surface amplitudes (how high), frequencies (how often), and correlation (how diverse). The desired measurement must be an intrinsic property of the surface, constant with respect to rotation or translation. It must also be on a local scale of measure, as opposed to global. Lastly, it must have intuitive or physical meaning.

Hoffman and Krotkov (1989) proposed a technique that is capable of achieving the desired variables in the output of terrain roughness analysis. Their technique incorporates Fourier analysis, which is used in studying geological surface irregularities measured from a fixed point that displayed internal differences of elevation from seven centimeters to three meters. This method measures roughness along specific directions of a surface and includes amplitude, frequency, and autocorrelation terms (Hoffman and Krotkov 1989). Hoffman and Krotkov (1989) state that this method provides the vector estimation of roughness necessary for their model. It does not, however, measure intrinsic surface roughness, because it depends on the direction of measurement and is influenced by the rotation of the surface. Hoffman and Krotkov (1989) therefore extended their work by transforming the surface and localizing it for the purpose of measuring intrinsic values at a regional scale. Using this process with elevation maps shows that the method can identify regions of relative smoothness (Hoffman and Krotkov 1989).

Another study describes the design of an application for calculating drainage density from digital elevation models, called a Spatial Analyzer Model (Moreno et al. 2003). This model was designed to create different cost surface layers, such as a terrain ruggedness layer and a drainage density layer, for overlay analyses (Moreno et al. 2003). The terrain ruggedness layer was generated by applying a series of algorithms to a DEM. The layer represents the amount of elevation difference between adjacent cells of a DEM: the process calculates the difference in values between a center cell and the eight cells immediately surrounding it (Moreno et al. 2003). These calculations give a value which is then compared to a pre-determined scale on a terrain ruggedness index. "The approach that we took significantly decreases the amount of time and effort to quantify selected terrain characteristics" (Moreno et al. 2003).
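The eight-neighbor calculation described by Moreno et al. (2003) can be sketched in a few lines. The formulation below follows the Riley et al. (1999) Terrain Ruggedness Index, the square root of the summed squared elevation differences between a cell and its eight neighbors; the function name and the list-of-rows grid layout are assumptions of this example rather than details from either paper.

```python
import math

def tri_surface(dem):
    """Terrain Ruggedness Index: for each interior cell, the square
    root of the summed squared elevation differences between the cell
    and its eight immediate neighbors. Edge cells are left as None."""
    rows, cols = len(dem), len(dem[0])
    tri = [[None] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            center = dem[r][c]
            tri[r][c] = math.sqrt(sum(
                (dem[r + dr][c + dc] - center) ** 2
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)))
    return tri
```

A perfectly flat grid yields an index of zero, and the index grows with local relief: a cell sitting one unit below all eight neighbors scores the square root of eight.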

Trafficability

Trafficability is a relational measurement between a vehicle and its ability to move through a particular area or region. Most work on this topic has been done by the U.S. military, which is primarily concerned with the movement of troops and vehicles across all types of terrain. The military uses terrain data collected in the field to create overlay transparencies. These transparencies group similar areas together based on three categories of maneuverability: unrestricted, restricted, and severely restricted. The transparencies are then placed on top of topographic maps for better visualization of navigability (Donlon et al. 1999).

Newer studies have attempted to reproduce and evolve this procedure into working digital models. One study describes a qualitative reasoning approach to the trafficability problem and outlines the steps taken to implement and test the system. The authors created several domain theories, or rules, in order to incorporate variables into the algorithmic calculations. They used several variables in this system to calculate trafficability for a test area, including terrain roughness, hydrology, soil stability, slope, and vegetation (Donlon et al. 1999). The domain theories address knowledge of the terrain, the vehicles' capabilities, and how the terrain affects the vehicles. Domain theories were also created to interpret the digital representations of the terrain. The authors state that trafficability is a binary relationship between the terrain and the vehicle, which can be expressed quantitatively or qualitatively. Qualitative expressions of trafficability describe whether a vehicle is capable of effectively traveling through a region and the qualitative effects the terrain will have on that movement (Donlon et al. 1999). The results of their study appear to have been a success: all of the calculations were compared against the U.S. Army's traditional calculations and, according to the authors, proved to be consistent. The purpose of that study was to enhance the calculation speed of trafficability, as opposed to traditional calculations done by analysts, which take hours. The authors suggest that the model be further tested; they note that the terrain used in the test was relatively smooth, so the model would need to be tested with rougher terrain data to become a standardized tool. The model also lacked information about soils, surface configuration, and terrain roughness.

Slocum et al. (2002) developed a Trafficability Analysis Engine in order to help speed up the manual process of the U.S. Army's Modified Combined Obstacles Overlay technique. There are several appealing aspects of this model that take into account variables not considered by other models in this type of analysis. The system degrades gracefully when terrain data is missing, providing a best estimate along with a confidence rating for each guess (Slocum et al. 2002). Another well-thought-out aspect of this model is that it is intended to be user-friendly: the system server calculates all the variables and produces the results for the user, but also allows advanced users to manipulate or alter the input. Slocum et al. (2002) created a Java-based prototype model that takes into account several aspects of geography: location, terrain, topography, and vegetation, to name a few. Each of these is contained in its own module, calculated and weighted individually before being combined into the final analysis. Each cell in the grid is given a value for each module. The system is capable of computing irregularly shaped areas or grids. This prototype cannot be used in the present study because the analysis here cannot be based solely on the length and width of an area.

Another example is a concept for a GIS-based terrain mobility model with optimization of off-road routes (Suvinen 2003). This concept of generating a cost surface is based on machine type, tree coverage, road features, and weather data. Suvinen (2003) explains that terrain mobility first depends on the soil's capacity to resist forces exerted by a rolling wheel. Again, this is a structured object-oriented program with several modules or overlays, which are ultimately added together and employed to determine a cost surface. Suvinen (2003) considered obstacles in this model to be tree coverage, water content of the soil, large rocks, water features, roads, and power cables, all of which are important variables to take into account. He implemented the model so that it can be adapted to changing information on objects.

Newer technology has provided the opportunity to incorporate this type of GIS analysis into several professions that did not previously utilize this information. An example of such a model was developed in Switzerland, called the Trafficability Evaluation System (TES) (Eichrodt 2003). This system provides the necessary support when planning machine employment and optimizing harvesting operation layouts (Eichrodt 2003). It was developed in order to determine the trafficability of forestry equipment through the rough terrain of Swiss forests. Three scenarios were tested to evaluate the model's validity. The first was a comparison between the trafficability of wheeled vehicles and tracked vehicles. The second tested various tire inflation levels and how they affected the trafficability of the wheeled vehicles. The third compared trafficability between two dates, to examine the effect of different soil water content levels. The author concluded that the study demonstrated that the TES provides acceptable accuracy when comparing water content and technical vehicle mobility (Eichrodt 2003).

These examples show some of the progress achieved in mapping trafficability. The models vary in design, but all are intended to find the same basic answer. Most of these models can be adapted for use with other types of analyses.
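The module-based designs described above (Slocum et al. 2002; Suvinen 2003) share one core operation: per-theme cost grids are weighted and summed, cell by cell, into a single surface. The sketch below illustrates only that overlay step; the module grids, weights, and function name are hypothetical and are not taken from either paper.

```python
def weighted_overlay(modules, weights):
    """Combine several cost grids of identical dimensions into one
    surface by a cell-by-cell weighted sum, as in a simple raster
    overlay analysis."""
    rows, cols = len(modules[0]), len(modules[0][0])
    combined = [[0.0] * cols for _ in range(rows)]
    for grid, weight in zip(modules, weights):
        for r in range(rows):
            for c in range(cols):
                combined[r][c] += weight * grid[r][c]
    return combined

# Hypothetical 2 x 2 example: a terrain-roughness module weighted more
# heavily than a vegetation-density module.
roughness = [[1.0, 2.0], [3.0, 4.0]]
vegetation = [[0.0, 1.0], [1.0, 0.0]]
cost = weighted_overlay([roughness, vegetation], [0.7, 0.3])
```

In a fuller model each module would itself be derived from field or raster data, and the weights would be calibrated rather than chosen arbitrarily.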

Least-Cost Path

Least-cost path is an analysis tool used to determine the route, corridor, or path across a surface that is least expensive to travel between two points in a given space. Examples include making a cross-country trip in the shortest time, finding a wildlife corridor between habitats requiring the least expenditure of energy, and finding the smoothest path through a stretch of terrain. Each of these examples uses a different variable for cost, such as travel time, energy expenditure, or terrain smoothness.

The least-cost path procedure for routing linear features was one of the earliest advanced applications of geographic information system technology, and least-cost analysis is one of the oldest spatial problems in history. The manual approach to this analysis, although based on expertise, lacks defendable documentation. Using a GIS for this type of analysis not only enhances calculation of the results but also ensures repeatability by standardizing the procedure (Berry 2000).

One study of least-cost analysis is based on wildlife movements through corridors in the Rocky Mountain region of Montana. The study was initiated to identify potential priority areas for wildlife management to improve the connectivity between protected ecosystems (Walker and Craighead 1997). Concern regarding the fragmentation of habitats and greater urban sprawl sparked an interest in preserving habitats for a few key species in this area. For Walker and Craighead (1997), delineating corridors at a regional scale entailed determining which routes offer the best chance of survival if an animal were to disperse between core protected areas. A priority of this study was to leave the best corridors open to wildlife without forcing the animals to alter their preferred route. The authors list several assumptions about the study involving wildlife and humans. The last assumption states that the least-cost path offers the animal the greatest probability of survival, because the least-cost corridors would be the shortest distance and the safer path to choose (Walker and Craighead 1997).

Three GIS coverages were used in this analysis: habitat quality, amount and type of vegetation coverage, and road density. The first two are rated by habitat preference, vegetation quality, and spatial patterns. The third is rated based on estimated use of the area by the wildlife. The study found several (primary and alternative) routes for each of the three species considered, but these corridors may not be the paths of least resistance. Delineating a 'least average cost' path, however, would be much more computationally demanding, due to the number of possible cell-path combinations in a large region (Walker and Craighead 1997).

Least-cost path analysis consists of three major components: discrete cost, accumulated cost, and steepest path. Each of these components is a step in the procedure. Discrete cost is the most critical because it is a summary map of several calibrated and weighted maps. The first step is the development of a discrete cost surface (or series of surfaces) which indicates the preferences for routing at every location. The second step is the generation of an accumulated cost surface that characterizes the optimal connectivity from a starting location to all other locations based on those preferences. Identification of the path of least resistance from a desired end location is the last step (Berry 2000). These steps ensure a complete, accurate least-cost analysis.

The paths found through this analysis process assist in the decision-making process of projects or studies. Once one or more paths have been found, the process of making a decision becomes much clearer. Mapping out the paths through a GIS allows the user or decision maker to visualize several options within an area quickly.
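The second and third steps described by Berry (2000) can be sketched directly: Dijkstra's algorithm builds the accumulated-cost surface over a discrete cost grid, and a steepest-descent walk back from the end cell recovers the least-cost path. The 4-connected neighborhood and the convention that each move pays the destination cell's cost are simplifying assumptions of this sketch, not details from Berry.

```python
import heapq

def accumulated_cost(cost, start):
    """Accumulated-cost surface from a start cell over a discrete cost
    grid, via Dijkstra's algorithm on 4-connected cells. Each move
    pays the destination cell's cost."""
    rows, cols = len(cost), len(cost[0])
    acc = [[float("inf")] * cols for _ in range(rows)]
    acc[start[0]][start[1]] = 0.0
    heap = [(0.0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if d > acc[r][c]:
            continue  # stale queue entry
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and d + cost[nr][nc] < acc[nr][nc]:
                acc[nr][nc] = d + cost[nr][nc]
                heapq.heappush(heap, (acc[nr][nc], (nr, nc)))
    return acc

def least_cost_path(acc, end):
    """Walk steepest-descent on the accumulated surface from the end
    cell back to the start (the single cell with accumulated cost 0)."""
    rows, cols = len(acc), len(acc[0])
    path = [end]
    r, c = end
    while acc[r][c] > 0:
        r, c = min(((r + dr, c + dc)
                    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
                    if 0 <= r + dr < rows and 0 <= c + dc < cols),
                   key=lambda p: acc[p[0]][p[1]])
        path.append((r, c))
    return path[::-1]
```

On a grid whose center cell is expensive, the returned path routes around that cell, which is exactly the behavior the fire line application needs when steep canyon walls carry high roughness costs.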


Summary

All of the models found in this review have simplified a complex problem. The military has been concerned with trafficability since the dawn of man, and territorial defense has always been an intensely studied topic. Newer studies have attempted to refine and simplify trafficability with digital models. Donlon et al. (1999) were able to digitally enhance the conventional models used by the U.S. Army; their model allowed faster calculation of output. Slocum et al. (2002) also created a model similar to the Donlon model. It, too, was developed to expedite the manual process the U.S. Army generally uses. Both of these examples have shown some progress beyond the traditional overlay process.

Terrain roughness modeling has also been frequently utilized for military purposes. This extremely complex process for determining surface roughness has even been used as a tool for path planning in rover exploration on Mars. Roma and Herman (1964) attempted to move away from traditional surveying equipment, developing a method involving a Jeep and cumbersome computer equipment. Their method works, but is long since outdated. Hoffman and Krotkov (1989) explain that terrain roughness measurements must differentiate between amplitude, frequency, and correlation; their method measures roughness along specific directions on a surface and includes these three factors. Another model, developed by Moreno et al. (2003), was designed to create cost surfaces to be applied to overlay analyses.

Least-cost analysis is a process capable of incorporating overlain cost surfaces to determine the least expensive route given specific input variables. Berry (2000) states that the digital process lacks the expertise of manual procedures, but is defendable and can be documented. Walker and Craighead (1997) conducted a study in Montana that was based on wildlife movements through the Rocky Mountains. Their main priority was to leave the best corridors open to wildlife without forcing animals to choose one corridor over another. The results of this study showed multiple routes to choose from. There are several steps to take in order to ensure a complete and accurate analysis. The first is to create a series of cost surfaces based on the objectives of the study. The second is to create a cumulative surface from the cost surfaces. The third is to identify the path of least resistance from a desired end point (Berry 2000).

These examples have shown some of the progress made in digital terrain analysis. The models vary in design, but are intended to find the same basic answer. Based on this review, it becomes apparent that a program or methodology to find the best placement for fire lines can be developed. Therefore, this study will use information and tools presented in this review to assist in the creation of a procedure for generating the fire lines. This procedure will aid in the planning process and allow the time spent in the field to be greatly reduced.


CHAPTER III
METHODS

Study Area
The main study area for this roughness tool is in Borden County, Texas (Figure 3.1). It is located on the Rocker Ranch, owned by Tim and Carol Wilson. The site covers around 6.5 km² and ranges in elevation from 811 meters to 890 meters. This particular part of the ranch is adjacent to the Llano Estacado Escarpment; in some areas the elevation increases dramatically over a short distance (Figure 3.1). The burn is tentatively scheduled for February of 2009. The prescription will provide the necessary instructions for the burn, including details of the fire effects on flora in the burn area.

After examining the site, several species of interest were identified. One main concern in this area is the amount of saltcedar (Tamarix aphylla). This species has choked a small creek on the ranch and has been chemically treated. The most abundant woody species on the site are honey mesquite (Prosopis glandulosa) and redberry juniper (Juniperus pinchotii). Other vegetation includes annual broomweed (Gutierrezia dracunculoides), little bluestem (Schizachyrium scoparium), blue grama (Bouteloua gracilis), sideoats grama (Bouteloua curtipendula), lotebush (Zizyphus obtusifolia), sand-shinnery oak (Quercus havardii), and various species from the Asteraceae family.

This area of the Rocker Ranch has very mixed terrain; there are flat canyon bottoms that are surrounded by the walls of the Llano Estacado Escarpment (Figure 3.2). These walls are very steep and are to be excluded from the burn. Figure 3.3 is a topographic map of the area. Slopes range from 0 to 122%, varying greatly throughout the area. Soils in this area range from rocky to clay loams. The burn areas all have a Berda loam soil with 1 to 3% slopes.



Figure 3.1: Map of the location of the Rocker Ranch in Borden County, TX.

Figure 3.2: Photograph of the study site, showing vegetation and topography.



Figure 3.3: A USGS topographic map of the study area.

The purpose of the burn is to revitalize the area for quail and cattle. The ranch owners have expressed a desire to protect and possibly increase forage and shelter for bobwhite quail. Hunting these quail is a popular activity in this area and can bring in substantial income. By reducing the amount of saltcedar, mesquite, and juniper, the burn will also revitalize the grasses on this site for cattle grazing.

Objectives

In order to reach the objectives of this study, an examination of the history of terrain analysis was conducted to better understand past and present patterns of this type

of analysis, as well as its various uses. Then a series of investigations with the available applications and programs was conducted. After evaluating possible candidates from the literature review, an application was found in the Moreno et al. (2003) study, using a program written by Riley et al. (1999). Lastly, I attempted to incorporate this application into an ArcGIS interface. Within this general context, there are two specific objectives. The first is to evaluate the classification developed by Riley et al. (1999) and test how well it performs in West Texas. The second is to develop a procedural methodology to map potential fire lines.

Analysis

Comprehensive research has been conducted on the subject of terrain analysis and is contained in the literature review, which gives a brief history of studies involving trafficability, terrain roughness, and least-cost path analysis. These applications have been designed for a wide range of uses, from habitat corridor analysis to the Mars rover mission, and many of the studies combine the three similar types of analysis to achieve the desired outcome. The Riley et al. (1999) application measures the amount of elevation difference between adjacent cells of a DEM. The program was written in Arc Macro Language (AML), a language specific to Arc/INFO. The model was run on the Borden County DEM, and the new grid, referred to throughout the remainder of this thesis as the terrain roughness index (TRI) surface, was clipped to the study area boundary.
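The neighbour-difference calculation just described can be sketched directly in code. This is a minimal numpy re-implementation of the index as this thesis describes it (squared differences to the eight neighbours, averaged, then square-rooted), not the AML program itself; the sample surfaces in the usage are invented.

```python
# Sketch of the ruggedness calculation described in the text: for each
# interior cell, take the elevation differences to its eight neighbours,
# square them, average, and take the square root.
import numpy as np

def tri(dem):
    """Ruggedness index for the interior cells of a DEM array."""
    rows, cols = dem.shape
    core = dem[1:-1, 1:-1]
    sq_sum = np.zeros_like(core, dtype=float)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            # Neighbour grid shifted by (dr, dc), aligned with the core.
            shifted = dem[1 + dr:rows - 1 + dr, 1 + dc:cols - 1 + dc]
            sq_sum += (shifted - core) ** 2
    return np.sqrt(sq_sum / 8.0)
```

A flat surface yields an index of zero everywhere; a uniform ramp yields a constant nonzero value, since every cell sees the same neighbour differences.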


There is a need to develop a procedural application into a form that allows the user to manipulate, analyze, and process data with ease. Considerations addressed in this thesis are the programming language in which to design the application and the computer program that will be the best platform to run it. Software in the majority of the literature reviewed was designed with various macro languages, as is the case with the Riley et al. (1999) application. The GIS platform chosen for this application is ArcGIS 9.x, selected for its user-friendly operation. Previous work on terrain roughness has been done at Texas Tech University (Fish 2004), and this new model employs the algorithms and data models used in that work. The model will allow the transformation of terrain roughness into a cost surface. Digital elevation model data are most often packaged at a standard 30-meter resolution. After the TRI has run, additional steps are necessary to determine the correct set of fire lines for a given area: creating contours of equal roughness for the area, selecting the desired contour, and exporting the file to a GPS unit that can be used for field checks and while driving the bulldozer. The end user can easily apply this procedure to calculate a variation of the least-cost path model. To address the first main objective of the study, a test must be completed to determine whether the Riley et al. (1999) classification, shown in Appendix B, will work for the West Texas region of the United States. The most important aspect of the application to be considered is scale, which is based on the ability of the vehicle to traverse an area. For this study, the scale is 30 meters. Over the years, especially since the

creation of the TRI application, the cell size of a DEM has changed dramatically, from 1 kilometer to 30 or even 10 meters. The scale of the DEM appears to have a direct effect on the classification of the model. To investigate the validity of the TRI classification, the model was tested against the 30-meter DEMs of several counties in Texas as well as a DEM for the whole state. If the recommended classification does not hold true in any of these tests, the model will be run against coarser-resolution DEMs; if the classification indeed fails at 30 meters, a new classification will be formulated for 30-meter resolution DEMs. Once a scale index is determined, the TRI application will be incorporated into a procedure to predict real-world fire line placement. This procedure includes running the program and then generating isolines to model fire lines that can be manipulated and adjusted once in the field. After the final decisions are made on fire line placement, the final path can be programmed into a GPS receiver and used as a navigation device for the bulldozer to cut the lines.
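The last step, moving the chosen line onto a GPS receiver, can be sketched generically. The study transfers files with Trimble's Pathfinder Office; as a neutral stand-in, this sketch writes fire-line vertices to GPX, a track format most receivers read. The function name and coordinates are invented for illustration.

```python
# Write (lat, lon) fire-line vertices as a GPX track string using only
# the standard library. GPX is a generic stand-in here, not the Trimble
# transfer format used in the study.
import xml.etree.ElementTree as ET

def fire_line_to_gpx(points, name="fire_line"):
    """Serialize an ordered list of (lat, lon) vertices to GPX 1.1 text."""
    gpx = ET.Element("gpx", version="1.1", creator="tri-procedure")
    trk = ET.SubElement(gpx, "trk")
    ET.SubElement(trk, "name").text = name
    seg = ET.SubElement(trk, "trkseg")
    for lat, lon in points:
        ET.SubElement(seg, "trkpt", lat=f"{lat:.6f}", lon=f"{lon:.6f}")
    return ET.tostring(gpx, encoding="unicode")

# Two invented vertices roughly in the Borden County area.
gpx_text = fire_line_to_gpx([(32.74, -101.43), (32.75, -101.44)])
```

The resulting file can be loaded onto any receiver that accepts GPX tracks.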


CHAPTER IV

RESULTS

Introduction

The TRI program has become an integral component of this study. It calculates the differences in elevation between each grid cell and its surrounding cells and is able to create a new surface within seconds. Although the script could not be run in an ArcGIS environment, the final goal of the study was achieved.

Scale Results

It became apparent after running the TRI program once that Riley's classification was not appropriate for the study area. After examining the product of the AML script, the TRI surface was separated into seven categories according to the index recommended by Riley et al. (1999). In order to test the values given by the index, this process was repeated for 25 random counties throughout the state of Texas. However, the scale was too large for anything found within the state, so the TRI application was also tested on a DEM for the entire state of Texas. The results confirmed that the suggested classification was not appropriate for the study area. The following three figures (4.1 through 4.3) depict an attempt to match the scale at which Riley et al. (1999) developed the classification index. This test showed that the larger the cell size of the raster surface, the larger the index values became. For instance, with a 100-meter resolution TRI surface the largest index value was close to 120 (Figure 4.1), while with the 10-kilometer resolution TRI surface the largest index value became 958 (Figure 4.3).
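The cell-size effect reported here can be reproduced on a synthetic surface: sampling the same slope at a coarser cell size produces larger neighbour-to-neighbour elevation differences and therefore larger index values. The ramp, grade, and extents below are invented for illustration, and the index is computed as this thesis describes it (root mean of squared neighbour differences).

```python
# Demonstrate that a coarser DEM yields larger ruggedness values for the
# same terrain: a uniform 10% grade sampled at 30-m and 90-m cells.
import numpy as np

def tri(dem):
    """Square root of the mean squared difference to the eight neighbours."""
    rows, cols = dem.shape
    core = dem[1:-1, 1:-1]
    sq = np.zeros_like(core, dtype=float)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if (dr, dc) != (0, 0):
                sq += (dem[1 + dr:rows - 1 + dr, 1 + dc:cols - 1 + dc] - core) ** 2
    return np.sqrt(sq / 8.0)

def ramp(cell_size, extent=900.0, grade=0.10):
    """A uniform 10% grade sampled at the given cell size (metres)."""
    x = np.arange(0.0, extent, cell_size)
    return np.tile(grade * x, (len(x), 1))

# Same hillside, two samplings: the 90-m grid triples the per-cell step.
r30 = tri(ramp(30.0)).max()
r90 = tri(ramp(90.0)).max()
```

On this linear surface the index scales exactly with cell size, which mirrors the pattern in Figures 4.1 through 4.3.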


Figure 4.1: Map of Terrain Ruggedness Index surface with 100-meter cell size categorized by Riley et al. (1999).



Figure 4.2: Map of Terrain Ruggedness Index surface with 1-kilometer cell size categorized by Riley et al. (1999).



Figure 4.3: Map of Terrain Ruggedness Index surface with 10-kilometer cell size categorized by Riley et al. (1999).

Clearly, the DEM cell size significantly impacts the values obtained, and the classification categories must be adjusted accordingly. Since the resolution of the TRI surface for this study is 30 meters, the classification index was modified to fit the scale of this smaller cell size. The TRI application was run again on 25 random counties from the state of Texas. Most of the values for each new surface ranged

between 0 and 50. There were, however, two counties, Borden and Jeff Davis, with maximum values of 110 and 268, respectively. These values mean that in some areas of Borden County, for example, for every 98.5 feet in distance there was an average elevation difference of 110 feet between a given cell and each of the eight surrounding cells. (This equates to approximately a 110-foot vertical change for every 98.5 feet of horizontal distance, a slope of 112% or 48 degrees.) When prescribing a burn, slopes greater than or equal to 30% are automatically eliminated, because the risk of fire escape and the danger to personal safety far outweigh the benefits of burning such an area. The new suggested TRI index for this type of application with a 30-meter grid cell is shown in Table 4.1. The index is broken into seven categories like the index suggested by Riley et al. (1999). For this type of application, the top category is cut at values equal to or greater than 29.5. This number represents a 30% slope and is the absolute upper cutoff point at which a successful burn can safely take place.

Table 4.1: Suggested Terrain Ruggedness Index for prescribed burning applications with 30-meter grid cells.

% Slope            Index Value        Classification
< 3%               0.00 - 3.00        Level
3% - 6.1%          3.01 - 6.00        Nearly Level
6.1% - 10.2%       6.01 - 10.00       Slightly Rough
10.2% - 15.2%      10.01 - 15.00      Intermediately Rough
15.2% - 20.3%      15.01 - 20.00      Moderately Rough
20.3% - 30.0%      20.01 - 29.50      Highly Rough
> 30%              29.50 and above    Out of Range
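The slope arithmetic behind the table can be checked directly. This is a small sketch, assuming (as the text does) that TRI values are expressed in feet of elevation change over one 98.5-ft (30-m) cell spacing; the function name is invented.

```python
# Convert a TRI value (feet of elevation change per 30-m cell spacing)
# to percent slope and slope in degrees, following the reasoning above.
import math

CELL_SPACING_FT = 98.5  # one 30-m cell spacing expressed in feet

def tri_to_slope(tri_value):
    """Return (percent slope, slope in degrees) implied by a TRI value."""
    ratio = tri_value / CELL_SPACING_FT
    return 100.0 * ratio, math.degrees(math.atan(ratio))

pct, deg = tri_to_slope(110.0)   # Borden County maximum from the text
cut_pct, _ = tri_to_slope(29.5)  # top-category cutoff from Table 4.1
```

The Borden County maximum of 110 works out to roughly 112% (48 degrees), and the 29.5 cutoff to roughly 30%, matching the figures quoted in the text.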


Procedure Results

Although no ArcGIS version can be used with the older AML programs, ArcGIS is compatible with several newer languages, such as VB, VBA, C++, and Java. The goal of this study was to get a program to run exclusively in ArcGIS, because the old Arc/INFO platform is obsolete. Attempts to convert the TRI into a newer language were unsuccessful. First, there is a script that can run an AML file from ArcGIS, but it still utilizes Arc/INFO to run the program. Second, attempts were made to convert the AML script into a Python script; Python is a programming language adopted by ESRI to run in ArcGIS. Some material was found that contained Python equivalents to AML keywords, but no keywords in the TRI matched the material. Third, through a long and difficult process, a script was created in Visual Basic that performed virtually the same as the TRI. The VB code, however, could not be made to operate effectively in ArcGIS. Since the TRI script was designed to run only in Arc/INFO, another plan was formulated. The desired goal could still be reached, though it would now require a procedure of several human-guided steps rather than the automated script originally envisioned. The procedure developed for this study consists of several steps. The first is to obtain the countywide DEM for the desired area; in this case the study area is the Rocker Ranch in Borden County (Figure 4.4).



Figure 4.4: Map of Borden County digital elevation model depicting the terrain.

The second step of the procedure is to run the TRI program against the DEM (Figure 4.5). This is done in Arc/INFO, an older command-driven predecessor of ArcGIS. The application creates a new surface raster that can be classified into roughness categories.



Figure 4.5: Terrain Ruggedness Index analysis run in Arc/INFO Workstation.

Once the new TRI surface has been created, it can be added to an ArcGIS interface and categorized with the new classification derived in Table 4.1. The raster is then symbolized according to the classification categories developed for the 30-meter grid cell size. The result of the TRI analysis is shown in Figure 4.6.
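Applying the Table 4.1 categories to a TRI raster amounts to a simple reclassification. A minimal numpy sketch follows; the class breaks come from Table 4.1, while the handling of exact boundary values is an assumption (the text states only that 29.5 and above is out of range).

```python
# Reclassify a TRI raster into the seven categories of Table 4.1.
import numpy as np

# Upper bounds of classes 1-6 from Table 4.1; >= 29.5 is "Out of Range".
BREAKS = [3.0, 6.0, 10.0, 15.0, 20.0, 29.5]
LABELS = ["Level", "Nearly Level", "Slightly Rough", "Intermediately Rough",
          "Moderately Rough", "Highly Rough", "Out of Range"]

def classify(tri_surface):
    """Return a grid of class numbers 1-7 for a TRI raster."""
    tri_surface = np.asarray(tri_surface, dtype=float)
    cls = np.digitize(tri_surface, BREAKS, right=True) + 1
    cls[tri_surface >= 29.5] = 7  # per the text, 29.5 and up is excluded
    return cls
```

A class-number grid produced this way maps one-to-one onto the symbology used for Figure 4.6.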



Figure 4.6: Map of Borden County Terrain Ruggedness Index surface categorized and symbolized according to Table 4.1.

The next step is to clip the TRI surface to the Rocker Ranch study area using tools in ArcGIS. Figure 4.7 shows the clipped surface for the study area, categorized and symbolized as before.



Figure 4.7: Map of Rocker Ranch study area Terrain Ruggedness Index surface categorized and symbolized according to Table 4.1.

Next, TRI isolines for the ranch were created using the Spatial Analyst extension, contouring at values equal to the category breaks of the TRI classification. Figure 4.8 depicts the isolines created in this step from the Rocker Ranch TRI surface.



Figure 4.8: Map of isolines created in ArcGIS with the Terrain Ruggedness Index surface for the Rocker Ranch study area.

The desired isolines, in most cases, will correspond to the most nearly level terrain (Figure 4.9). Selecting the desired isolines is ultimately up to the prescribed burn professional. In this case, for the purposes of the study, the isolines chosen equal the first class of index values, 0.00-3.00 ft.



Figure 4.9: Map of isoline selection based on the first class of index values, 0.00-3.00 ft, on the Rocker Ranch study area.

Since this category occurs both in the valley and on top of the caprock, the most logical and cost-effective decision in this case is to select just the isolines in the valley (Figure 4.10). There is not enough area on the caprock to justify burning that portion of the ranch.



Figure 4.10: Map of preferred isolines based on a cost-effective decision for the Rocker Ranch study area.

Once the decision is made on the desired isolines, a map should be made to take to the field in order to compare the computer-aided decision to the real world. Figure 4.11 shows the selected isolines drawn on a 2006 aerial photograph from the National Agriculture Imagery Program (NAIP). This map will be very helpful in making any adjustments to the isolines needed to obtain the desired final fire line location.


Figure 4.11: Map of preferred isolines, ranch boundary, and NAIP image exported for field use.

Once in the field, any necessary changes to the isolines can be drawn directly on the map, and the desired changes can then be made in ArcGIS to produce a final fire line location. Figure 4.12 is a map of the Rocker Ranch with the updated fire lines. Primarily, the isoline has been smoothed to eliminate the costs associated with distance in fire line construction.
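The smoothing step can be illustrated with a simple moving average over the isoline's vertices. This is a hypothetical stand-in for the manual node edits made in ArcGIS, with an invented zigzag test line; the point is that smoothing shortens the line, which is the stated goal of reducing construction distance.

```python
# Smooth an isoline's vertex array with a moving average, keeping the
# endpoints fixed, and measure the resulting reduction in length.
import numpy as np

def smooth_line(xy, window=5):
    """Moving-average smoothing of an n x 2 vertex array; endpoints are
    kept fixed so the line still ties into its ends."""
    xy = np.asarray(xy, dtype=float)
    k = np.ones(window) / window
    inner = np.column_stack([np.convolve(xy[:, i], k, mode="valid")
                             for i in range(xy.shape[1])])
    return np.vstack([xy[:1], inner, xy[-1:]])

def line_length(xy):
    """Total length of a polyline given as an n x 2 vertex array."""
    return float(np.sum(np.linalg.norm(np.diff(xy, axis=0), axis=1)))
```

Running this on a jagged line flattens the zigzags, so the smoothed version is strictly shorter than the original.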



Figure 4.12: Map of final fire line location modified after field trip.

The final fire line location is ready to be exported to a GPS receiver. For this study, the receiver used was a Trimble GeoXT. Trimble's Pathfinder Office software was used to import the files from ArcGIS into the GPS receiver; an ArcGIS extension called GPS Analyst will do the same thing. Figure 4.13 shows the exported GPS file that will be used on a bulldozer as a navigation aid.



Figure 4.13: Map of final fire lines exported to GPS receiver for navigational purposes.

The complete procedure can be found in Appendix A as a set of detailed, easy-to-follow instructions.


Summary

There was a need to change the classification index proposed by Riley et al. (1999) to fit the scale of 30-meter resolution DEMs. After testing the TRI application on 25 Texas counties, a new set of classifications was developed for prescribed fire applications. Using the new classification with the TRI application, a procedure was then formulated to locate the fire lines. By using the index, the TRI surface was divided into the seven classes determined to be appropriate for prescribed burning. Finally, isolines of roughness can be created and adjusted to model the desired fire lines within the area of interest.


CHAPTER V

CONCLUSION

Introduction

The current method of planning a prescribed burn is a time-consuming and arduous task. By combining the power of technology with the knowledge of a burn expert, the time and effort currently required for planning an effective burn may be greatly reduced. This innovative approach to planning a burn incorporates existing programs to build a model that identifies the smoothest and most suitable path for fire lines.

Discussion

Research has indicated that terrain roughness applications have been developed for a wide range of uses, from habitat corridor analysis to Mars rover exploration. However, until now there has not been an application specifically designed for prescribed burning. The literature review revealed several methods of analyzing terrain, including directional terrain roughness (Hoffman and Krotkov 1989), drainage density analysis (Moreno et al. 2003), trafficability navigation (Donlon et al. 1999), and wildlife movement (Walker and Craighead 1997). After reviewing these studies and searching for the most relevant model, Moreno et al. (2003) provided the best available program, and research led to a version found on the ESRI Support website. The Terrain Ruggedness Index calculates terrain roughness throughout a grid by comparing a given cell to its eight surrounding cells. This program

is a simple comparison of elevation differences between cells; it does not account for other factors such as aspect or slope. Testing the TRI on 25 random counties throughout the state of Texas revealed a dilemma with the script and its classification of the produced raster. The range of index values tied to the TRI was too broad to be used effectively on any of these counties, or even at the state level. Therefore, the classification categories were reduced based on a different grid cell size (scale). The TRI script, as written in Arc Macro Language (AML), can only be run in Arc/INFO, an older command-driven platform that can be purchased separately from ArcGIS. Therefore, in order to use the TRI exclusively within the ArcGIS interface, the AML must be converted to a programming language such as VBA, Visual Basic, or C++. Although methods of converting languages exist, several attempts to use these processes yielded results that were not viable, so the last option of rewriting the code was determined necessary. After the AML was rewritten in Visual Basic, the program was tested again to ensure it processed the data in the same manner as the previous version. Once it was evident that the program was operational, steps were taken to proceed to the next objective, moving the VB code into the ArcGIS 9.x interface. This was the end for this objective, as the code was inoperative once inside the platform. The once-planned all-in-one program now had to be developed as a set of manual procedures. The TRI application is run on a selected DEM in Arc/INFO to create the new terrain roughness raster. From there, the new raster can be brought into ArcGIS 9.x and processed. The terrain roughness surface is then categorized manually to the suggested index of new classes derived from testing the application on several counties. Isolines are

then created, based on the user's preference. One of these contours is selected based on the criteria of the fire plan. Once this selection is made, the contour can be exported as GPS coordinates and checked in the field before cutting the fire lines with a bulldozer.

Summary

Prescribed burning is a science in itself. Carrying out a burn can be very difficult and time consuming, and much of that time is spent in the planning stages. This method is intended to make planning a burn more cost effective, not to change any of the well-established and proven techniques already in place. The new procedure generates a digital model of an area and draws the best possible fire lines around the perimeter of that area. This process should become a useful and simple tool for collecting information and quickly determining the most suitable fire line location, which can now be accomplished in considerably less time.


LITERATURE CITED

Berry, J. K. 2000. Applying spatial analysis and surface modeling in decision-making contexts. Proceedings of the 14th Annual Conference on Geographic Information Systems; 13-16 March 2000; Toronto, Ontario, Canada: Proceedings of GIS 2000. p. 1-7.

Donlon, J. and K. D. Forbus. 1999. Using a geographic information system for qualitative spatial reasoning about trafficability. Proceedings of the 13th International Workshop on Qualitative Reasoning; 6-9 June 1999; Loch Awe, Scotland: Qualitative Reasoning Workshop. p. 2-11.

Eichrodt, A. W. 2003. Development of a spatial trafficability evaluation system [dissertation]. Zurich, Switzerland: Swiss Federal Institute of Technology Zurich. 184 p.

Fish, E. B. 2004. Personal interview. 16 March 2004.

Hoffman, R. and E. Krotkov. 1989. Terrain roughness measurement from elevation maps. Society of Photographic Instrumentation Engineers, Mobile Robots IV 1195:104-114.

Li, Z., Q. Zhu, and C. Gold. 2005. Digital terrain modeling. Boca Raton, FL, USA: CRC Press. 323 p.

Moreno, M., S. Levachkine, M. Torres, and R. Quintero. 2003. Geomorphometric analysis of raster image data to detect terrain ruggedness and drainage density. Lecture Notes in Computer Science 2905:643-650.

Riley, S. J., S. D. DeGloria, and R. Elliot. 1999. A terrain ruggedness index that quantifies topographic heterogeneity. Intermountain Journal of Sciences 5:23-27.

Roma, C. and H. P. Simon. 1964. An investigation of terrain roughness. Army Transportation Research Command. Fort Eustis, Newport News, VA, USA: US Department of Defense, US Army. 30 p.

Slocum, K. R., J. Surdu, J. Sullivan, M. Rudak, N. Colvin, and C. Gates. 2002. Trafficability analysis engine. The Journal of Defense Software Engineering June:28-30.

Suvinen, A., M. Saarilahti, and T. Tokola. 2003. Terrain mobility model and determination of optimal off-road route. Proceedings of the 9th Scandinavian Research Conference on Geographical Information Sciences; 4-6 June 2003; Espoo, Finland: ScanGIS 2003 Conference. p. 251-259.


Walker, R. and L. Craighead. 1997. Least-cost-path corridor analysis: analyzing wildlife movement corridors in Montana using GIS. Proceedings of the 17th Annual ESRI International User Conference; 1997; San Diego, CA, USA: ESRI User Conference.


APPENDIX A

PROCEDURE

I. DATA AND TOOL ACQUISITION
A. Digital Elevation Model (Countywide, encompassing study area)
i. NRCS (National by County): http://datagateway.nrcs.usda.gov/
ii. GIS Data Depot (Countywide by State): http://data.geocomm.com/
iii. TNRIS (Countywide for Texas): http://www.tnris.state.tx.us/
B. Hillshade (Countywide, encompassing study area)
i. NRCS (National by County): http://datagateway.nrcs.usda.gov/
ii. GIS Data Depot (Countywide by State): http://data.geocomm.com/
iii. TNRIS (Countywide for Texas): http://www.tnris.state.tx.us/
C. DOQQ (Countywide, encompassing study area)
i. NRCS (National by County): http://datagateway.nrcs.usda.gov/
ii. GIS Data Depot (Countywide by State): http://data.geocomm.com/
iii. TNRIS (Countywide for Texas): http://www.tnris.state.tx.us/
D. TRI.aml (Program can be found in Appendix B)
i. ESRI Support: http://arcscripts.esri.com/details.asp?dbid=12435
ii. USDA: http://forum.manifold.net/Attachments/52/60498/TRI.AML.

II. RUNNING TRI.aml
A. Place all data and the TRI.aml program into C:\Workspace
B. Open Program: Start>>Programs>>ArcGIS>>ArcInfo Workstation>>Grid


C. Type: &run tri input-dem output-dem (for example, &run tri borden_dem borden_tri), then press Enter
D. After the TRI program has finished, type quit to exit Arc/INFO Workstation
III. ARCGIS 9.X
A. Add the county hillshade into a blank ArcGIS map
B. Right-click on the hillshade layer in the Table of Contents
C. Select Properties
a. Select the 4th tab, named Display

b. Select the drop-down box that currently says Nearest Neighbor and change it to Bilinear Interpolation; this will smooth the layer to make it look continuous
D. Add the new TRI layer into the map
E. Symbolize the new TRI layer (it enters the map in a black-to-white color scale)
i. Right-click on the TRI layer in the Table of Contents
ii. Select Properties
iii. Select the tab named Display
a. Set the drop-down box that currently says Nearest Neighbor to Bilinear Interpolation
b. Set the transparency to 30%
iv. Select the 5th tab from the left, named Symbology
a. On the left side, change the selection from Stretched to Classified


b. Under the Classification box, change the number of classes from 5 to 7; seven classes were chosen to emulate the seven classes used by Riley et al. (1999)
c. Leave the default classification at Natural Breaks
v. Under the Classification box lies the color ramp; use the drop-down arrow to find the ramp that runs from green to red
vi. Select OK
IV. CLIPPING TRI OUTPUT FILE
A. Add the pasture boundary layer into the map
B. Zoom into the area just surrounding the boundary layer
C. Add the Spatial Analyst toolbar to the map by right-clicking in the empty gray space surrounding the map and selecting Spatial Analyst
D. Select the Spatial Analyst button and go to Options
i. Under the first tab, named General, select a working directory; type C:\Workspace, or browse to any folder desired
ii. Set the Analysis Mask to the boundary layer; leave the other options at their default values
iii. Under the next tab, named Extent, set the drop-down box to Same as Layer BoundaryLayer
iv. Under the last tab, named Cell Size, leave the default value of Maximum of Inputs
v. Select OK
G. Go back to the Spatial Analyst button and select Raster Calculator

i. Double-click on the TRI layer to insert it into the calculator, then select the Evaluate button
ii. The new clipped TRI layer is automatically added into the map as a layer named Calculation
H. To make this new layer permanent:
i. Right-click on the new TRI layer and select Data
ii. Select Make Permanent
iii. Browse to the desired folder to save the clipped TRI layer
iv. Name it as desired
v. Select Save
I. To symbolize the layer as before, refer to Step III
V. CREATING ISOLINES
A. Select the Spatial Analyst button again, but this time select Surface Analysis, then Contour...
i. A new interface opens; make sure that the new clipped TRI surface is selected
ii. Change the contour interval to the maximum value of the first symbolized category (in this case, 3.00)
iii. Leave the other values at their defaults
iv. At the bottom of the box, browse, save, and name the contour layer where desired
B. Select OK


C. The set of contours created automatically comes into the map; for the example in this study, one of the isolines follows just along the edge of the first (green) symbolized category
VI. SELECTING ISOLINES
A. Right-click on the isoline layer and select Open Attribute Table
i. When the table opens, right-click on the heading named Contour
ii. Select Sort Ascending; this sorts the records from smallest to largest according to the values in the Contour column
iii. On the left-hand side, left-click and hold the gray box next to the first desired value (again, in this case 3.00)
iv. Drag the cursor down to the last record of the desired value; there may be just one record or a hundred records to select
v. Close the Attribute Table to view the selection(s) on the map
B. View the selected isolines on the map to make sure that the ones selected are the ones desired
VII. SELECTING PREFERRED ISOLINES
A. Depending on the burn plan, particular isolines are needed and others are not; based on the selection from the previous step (Step VI), only some of the selected isolines may be needed to proceed
B. The main isolines needed for this study are located in the lowlands below the canyon walls


i. To select the needed isoline, use the Selection button in the main toolbar
ii. Select the isoline(s) by dragging a small box over part of the isoline(s); if more than one isoline needs to be selected, hold the Shift key while selecting the others
C. Once the desired selection has been made, the isoline(s) are ready to be exported
VIII. EXPORT PREFERRED ISOLINES AND MAKE FIELD MAP
A. While the desired isolines are still selected, they are ready to be exported
i. Right-click on the isoline layer in the Table of Contents and select Data>>Export Data...
1. The Export: drop-down box at the top of the interface should read Selected Features; if it does not, set it so
2. Leave the coordinate system as the default (same as the layer's data source)
3. Browse to save the exported file wherever desired
4. Click OK (this exports the selected isoline(s) as a shapefile)
ii. Select Yes to add the data to the map
B. The map is now ready to be prepared for the field, where it will be checked for accuracy and usability
i. Add the preferred isolines to the map

ii. Add the corresponding county hillshade to the map
iii. Add the corresponding county DOQQ to the map
iv. Uncheck or remove unneeded layers from the map, such as the countywide TRI layer, the isolines not preferred for the final decision, and, if desired, the clipped TRI layer
v. Zoom into the boundary layer
C. Arrange layers in the Table of Contents as follows and symbolize
i. Boundary layer on top
ii. DOQQ in the middle
1. Go to Properties and select the Display tab
a. Set the Brightness level to 15%
b. Set the Transparency level to 30%
iii. Hillshade on the bottom
1. Set the Resampling box to Bilinear Interpolation
2. Leave the other settings at their default values
D. Getting the map ready for the field
i. Select the Layout button (this is similar to print preview)
ii. Select Properties of the DOQQ layer
1. Adjust the map area to the print margins on the layout
2. Select the Insert tab in the main menu of ArcMap
a. Insert a scale bar and set to desired specifications
b. Insert a north arrow and set properties as desired
c. Insert a title and labels as needed

iii. While in the field, make any adjustments to the isoline(s) and note them on the map as needed to comply with the burn plan
IX. EXPORTING FINAL FIRE LINES TO GPS RECEIVER
A. Open ArcMap
B. Make changes to the isolines as decided while in the field
i. Click the Editor button
ii. Select the isolines that need to be changed
1. Isolines can be redrawn by moving, adding, or removing the nodes connected to the contour lines
2. Click Save Edits
3. Click Stop Editing
C. Open Trimble's Pathfinder Office
i. Select Utilities, then Import
1. Select the file to import by browsing to the revised isoline(s)
2. Select the output file by browsing to the working directory
3. Name the file as desired
4. Select Properties
a. Select the Coordinate System tab to match the coordinate system of the imported file
i. Change System to UTM
ii. Change Zone to 14 North
iii. Change Datum to NAD 83 (Conus)

            b. Leave all other options at their default values
         5. Select OK
      ii. Plug the GPS unit into the power outlet
      iii. Plug the GPS unit into the computer
         1. ActiveSync will confirm the unit is connected correctly
         2. Select Utilities, then Data Transfer
            a. Select the Send tab

            b. Select the Add button
            c. Browse to the imported GPS file on the computer
            d. Select OK
            e. Select the Transfer All button
         3. The isoline data is now on the GPS receiver and ready for the field.


APPENDIX B
TRI.AML

/*------------------------------------------------------------------------------------
/* USDA Forest Service - Rocky Mountain Research Station FSL, Moscow, ID
/*------------------------------------------------------------------------------------
/* Program: TRI.AML
/* Purpose: Calculates Topographic Ruggedness Index
/*
/*------------------------------------------------------------------------------------
/* Usage: RUGGEDNESS <DEM> <OUTGRID> {CLASSIFY}
/*
/* Arguments: DEM - Digital Elevation Model
/*            TRI - Output name of final TRI grid
/*            CLASSIFY - Optional categorical reclass to the values below.
/*------------------------------------------------------------------------------------
/* Notes: Topographic Ruggedness Index
/* The topographic ruggedness index (TRI) is a measurement developed by Riley et al. (1999)
/* to express the amount of elevation difference between adjacent cells of a digital elevation
/* grid. The process essentially calculates the difference in elevation values from a center
/* cell and the eight cells immediately surrounding it. Then it squares each of the eight
/* elevation difference values to make them all positive and averages the squares. The
/* topographic ruggedness index is then derived by taking the square root of this average,
/* and corresponds to the average elevation change between any point on a grid and its
/* surrounding area. The authors of the TRI propose the following breakdown for the values
/* obtained for the index, where:
/*
/*   0-80 m     is considered to represent a level terrain surface (1)
/*   81-116 m   represents a nearly level surface (2)
/*   117-161 m  a slightly rugged surface (3)
/*   162-239 m  an intermediately rugged surface (4)
/*   240-497 m  a moderately rugged surface (5)

/*   498-958 m  a highly rugged surface (6)
/*   959-5000 m an extremely rugged surface (7)
/*   (reclass values)
/*
/* If you would like to retain the grid with the original standard differences,
/* save the %tmp1% grid.
/*
/*------------------------------------------------------------------------------------
/* Required Input: <Digital Elevation Model> <OUTGRID>
/* Optional Input: (Standard Elevation Difference)
/* Output: <Topographic Ruggedness Index>
/*
/*------------------------------------------------------------------------------------
/* History: Jeffrey Evans - Landscape Ecologist
/*          06/12/03 - Original coding
/*          1221 South Main, Moscow, ID 83843
/*          (208) 882-3557
/*          jevans02@fs.fed.us
/*======================================================================================
/* References:
/*
/* Riley, S. J., S. D. DeGloria and R. Elliot (1999). A terrain ruggedness index
/* that quantifies topographic heterogeneity. Intermountain Journal of Sciences,
/* vol. 5, no. 1-4, 1999.
/*
/* Blaszczynski, Jacek S., 1997. Landform characterization with Geographic Information
/* Systems. Photogrammetric Engineering and Remote Sensing, vol. 63, no. 2, February
/* 1997, pp. 183-191.
/*
/*======================================================================================
/* Check Arguments
/*======================================================================================
&args dem outgrid class
&if [show PROGRAM] <> GRID &then
  &do
    grid

    &type Can Only Be run From GRID
    &type Starting GRID
    &type Please re-run RUGGEDNESS
  &end
&if [NULL %dem%] = .TRUE. &then
  &return &inform Usage: TRI <DEM> <TRI> {CLASSIFY}
&if [NULL %outgrid%] = .TRUE. &then
  &return &inform Usage: TRI <DEM> <TRI> {CLASSIFY}
&if [exists %dem% -grid] = .FALSE. &then
  &return &inform Grid [upcase %dem%] does not exist!
&if [exists %outgrid% -grid] = .TRUE. &then
  &return &inform Grid [upcase %outgrid%] already exists!
/*======================================================================================
/* MAIN
/*======================================================================================
&s tmp1 [scratchname -prefix xx1]
setwindow %dem% %dem%
setcell %dem%
&type /& Calculating Standard Elevation Differences /&
DOCELL
%tmp1% = ( ( sqr ( %dem%(0,0) - %dem%(-1,-1) ) ) ~
         + ( sqr ( %dem%(0,0) - %dem%(0,-1) ) ) ~
         + ( sqr ( %dem%(0,0) - %dem%(1,-1) ) ) ~
         + ( sqr ( %dem%(0,0) - %dem%(1,0) ) ) ~
         + ( sqr ( %dem%(0,0) - %dem%(1,1) ) ) ~
         + ( sqr ( %dem%(0,0) - %dem%(0,1) ) ) ~
         + ( sqr ( %dem%(0,0) - %dem%(-1,1) ) ) ~
         + ( sqr ( %dem%(0,0) - %dem%(-1,0) ) ) )
END
/*======================================================================================
/* Cleaning Up
/*======================================================================================


&if [NULL %class%] = .TRUE. &then
  &do
    &type /& Calculating the Topographic Ruggedness Index /&
    %outgrid% = sqrt(%tmp1%)
    &messages &off
    kill %tmp1% all
    &messages &on
    &type /& Topographic Ruggedness Index GRID written to [upcase %outgrid%] /&
  &end
/*======================================================================================
/* Reclassifying
/*======================================================================================
&else
  &do
    &s tmp2 [scratchname -prefix xx2]
    &type /& Reclassifying Topographic Ruggedness Index /&
    &type 0-80    (1) Represents a level terrain surface.
    &type 81-116  (2) Represents a nearly level surface.
    &type 117-161 (3) Represents a slightly rugged surface.
    &type 162-239 (4) Represents an intermediately rugged surface.
    &type 240-497 (5) Represents a moderately rugged surface.
    &type 498-958 (6) Represents a highly rugged surface.
    &type >959    (7) Represents an extremely rugged surface. /&

    %outgrid% = sqrt( %tmp1% ) * 10
    %tmp2% = int ( sqrt( %tmp1% ) * 10 )
    %class% = con(%tmp2% >= 0 && %tmp2% <= 80, 1, ~
                  %tmp2% >= 81 && %tmp2% <= 116, 2, ~
                  %tmp2% >= 117 && %tmp2% <= 161, 3, ~
                  %tmp2% >= 162 && %tmp2% <= 239, 4, ~
                  %tmp2% >= 240 && %tmp2% <= 497, 5, ~
                  %tmp2% >= 498 && %tmp2% <= 958, 6, ~

                  %tmp2% >= 959, 7)
    ARC TABLES
    ADDITEM %class%.vat TRI 50 50 C
    SEL %class%.vat
    RESEL value = 1
    move '0-80 Level terrain surface' to TRI
    ASEL
    RESEL value = 2
    move '81-116 Nearly level surface' to TRI
    ASEL
    RESEL value = 3
    move '117-161 Slightly rugged surface' to TRI
    ASEL
    RESEL value = 4
    move '162-239 Intermediately rugged surface' to TRI
    ASEL
    RESEL value = 5
    move '240-497 Moderately rugged surface' to TRI
    ASEL
    RESEL value = 6
    move '498-958 Highly rugged surface' to TRI
    ASEL
    RESEL value = 7
    move '>959 Extremely rugged surface' to TRI
    QUIT
    &messages &off
    kill (!%tmp1% %tmp2%!) all
    &messages &on
    &type /& Topographic Ruggedness Index GRID written to [upcase %outgrid%] /&
    &type /& Classified Topographic Ruggedness Index GRID written to [upcase %class%] /&
  &end
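For readers without access to ArcInfo GRID, the neighborhood calculation in TRI.AML can be sketched in Python with NumPy. This is an illustrative translation only, not part of the thesis toolset; the function name and array layout are ours. Like the DOCELL block above, it sums the squared elevation differences between each cell and its eight neighbors and takes the square root, leaving edge cells undefined.

```python
import numpy as np

def tri(dem):
    """Topographic Ruggedness Index (Riley et al. 1999): square root of the
    sum of squared elevation differences between each cell and its eight
    neighbors. Edge cells are returned as NaN, mirroring the scripts'
    interior-only calculation."""
    out = np.full(dem.shape, np.nan)
    ss = np.zeros((dem.shape[0] - 2, dem.shape[1] - 2))
    center = dem[1:-1, 1:-1]
    # loop over the eight neighbor offsets relative to the center cell
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == 0 and dj == 0:
                continue
            neighbor = dem[1 + di:dem.shape[0] - 1 + di,
                           1 + dj:dem.shape[1] - 1 + dj]
            ss += (center - neighbor) ** 2
    out[1:-1, 1:-1] = np.sqrt(ss)
    return out
```

On a flat DEM the interior TRI is zero; a single raised cell surrounded by level terrain yields sqrt(8) times its height difference, matching the hand calculation from the DOCELL expression.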

APPENDIX C
TRI.VB

Sub TRI()
    'sInPath: path of the input raster dataset
    'sInRasDSName: name of the input raster dataset
    'sOutPath: path of the output raster dataset
    'sOutRasDSName: name of the output raster dataset

    'Open input raster dataset
    Dim sInPath As String
    sInPath = "Z:\MattCrawfordTerrainRoughness\TRIWorking"
    Dim sInRasDSName As String
    sInRasDSName = "armstrong_dem"
    Dim sOutPath As String
    sOutPath = "Z:\MattCrawfordTerrainRoughness\TRIWorking"
    Dim sOutRasDSName As String
    sOutRasDSName = "armstrong_tri"

    Dim pRWS As IRasterWorkspace2
    Dim pWSF As IWorkspaceFactory
    Set pWSF = New RasterWorkspaceFactory
    If Not pWSF.IsWorkspace(sInPath) Then
        Exit Sub
    End If
    Set pRWS = pWSF.OpenFromFile(sInPath, 0)
    Dim pInRasDS As IRasterDataset
    Set pInRasDS = pRWS.OpenRasterDataset(sInRasDSName)

    'Get the first band from the input raster
    Dim pInBand As IRasterBand
    Dim pInputBandCol As IRasterBandCollection
    Set pInputBandCol = pInRasDS
    Set pInBand = pInputBandCol.Item(0)

    'QI raster properties
    Dim pInRasProps As IRasterProps
    Set pInRasProps = pInBand

    'Create a default raster from the input raster dataset
    Dim pInputRaster As IRaster
    Set pInputRaster = pInRasDS.CreateDefaultRaster



    'Create output raster dataset
    Dim fsObj, fl
    Set fsObj = CreateObject("Scripting.FileSystemObject")
    If fsObj.FolderExists(sOutPath) Then
        Set fl = fsObj.GetFolder(sOutPath)
        If fl.Attributes And 1 Then Exit Sub
    Else
        Exit Sub
    End If
    Set pRWS = pWSF.OpenFromFile(sOutPath, 0)

    Dim pOrigin As IPoint
    Set pOrigin = New Point
    pOrigin.X = pInRasProps.Extent.XMin
    pOrigin.Y = pInRasProps.Extent.YMin

    'Find out the pixel type of the input raster dataset
    Dim outPixelType As rstPixelType
    Select Case pInRasProps.PixelType
        Case PT_CHAR
            outPixelType = PT_LONG
        Case PT_COMPLEX
            MsgBox "Operation not supported on complex data", vbOKOnly, "Terrain Roughness Index"
            Exit Sub
        Case PT_DCOMPLEX
            MsgBox "Operation not supported on complex data", vbOKOnly, "Terrain Roughness Index"
            Exit Sub
        Case PT_DOUBLE
            outPixelType = PT_FLOAT
        Case PT_FLOAT
            outPixelType = PT_FLOAT
        Case PT_LONG
            outPixelType = PT_LONG
        Case PT_SHORT
            outPixelType = PT_LONG
        Case PT_U1
            outPixelType = PT_LONG
        Case PT_U2
            outPixelType = PT_LONG
        Case PT_U4
            outPixelType = PT_LONG

        Case PT_UCHAR
            outPixelType = PT_LONG
        Case PT_ULONG
            outPixelType = PT_LONG
        Case PT_USHORT
            outPixelType = PT_LONG
    End Select

    Dim pOutRasDS As IRasterDataset
    Set pOutRasDS = pRWS.CreateRasterDataset(sOutRasDSName, "GRID", pOrigin, _
        pInRasProps.Width, pInRasProps.Height, pInRasProps.MeanCellSize.X, _
        pInRasProps.MeanCellSize.Y, 1, outPixelType, pInRasProps.SpatialReference, True)

    'Create a default raster from the output raster dataset
    Dim pOutRaster As IRaster
    Set pOutRaster = pOutRasDS.CreateDefaultRaster

    'Get the first band from the output raster
    Dim pOutBand As IRasterBand
    Dim pOutputBandCol As IRasterBandCollection
    Set pOutputBandCol = pOutRasDS
    Set pOutBand = pOutputBandCol.Item(0)

    'QI IRawPixels interface for input
    Dim pInRawPixels As IRawPixels
    Set pInRawPixels = pInBand

    'QI IRawPixels interface for output
    Dim pOutRawPixel As IRawPixels
    Set pOutRawPixel = pOutBand

    'Create a DblPnt to hold the PixelBlock size
    Dim pPnt As IPnt
    Set pPnt = New DblPnt
    pPnt.SetCoords pInRasProps.Width, pInRasProps.Height

    'Create PixelBlock from input
    Dim pInPixelBlock As IPixelBlock3
    Set pInPixelBlock = pInRawPixels.CreatePixelBlock(pPnt)

    'Create PixelBlock from output
    Dim pOutPixelBlock As IPixelBlock3
    Set pOutPixelBlock = pOutRawPixel.CreatePixelBlock(pPnt)



    'Read input PixelBlock
    pPnt.X = 0
    pPnt.Y = 0
    pInRawPixels.Read pPnt, pInPixelBlock

    'Get the SafeArray associated with the first band of the output
    Dim vSafeArray As Variant
    vSafeArray = pOutPixelBlock.PixelDataByRef(0)

    'QI RasterProps for output for NoData handling
    Dim pOutRasProps As IRasterProps
    Set pOutRasProps = pOutBand

    'Loop through the SafeArray and calculate each pixel value
    'according to the neighborhood notation
    Dim OutputPixelValue
    Dim i, j As Long
    For i = 0 To pInRasProps.Width - 1
        'MsgBox "i: " & i
        For j = 0 To pInRasProps.Height - 1
            'MsgBox "j: " & j
            If (i - 1) >= 0 And (i + 1) <= pInRasProps.Width - 1 And _
               (j - 1) >= 0 And (j + 1) <= pInRasProps.Height - 1 Then
                If (Not CDbl(pInPixelBlock.GetVal(0, i, j)) = CDbl(pInRasProps.NoDataValue)) And _
                   (Not CDbl(pInPixelBlock.GetVal(0, i - 1, j - 1)) = CDbl(pInRasProps.NoDataValue)) And _
                   (Not CDbl(pInPixelBlock.GetVal(0, i, j - 1)) = CDbl(pInRasProps.NoDataValue)) And _
                   (Not CDbl(pInPixelBlock.GetVal(0, i + 1, j - 1)) = CDbl(pInRasProps.NoDataValue)) And _
                   (Not CDbl(pInPixelBlock.GetVal(0, i + 1, j)) = CDbl(pInRasProps.NoDataValue)) And _
                   (Not CDbl(pInPixelBlock.GetVal(0, i + 1, j + 1)) = CDbl(pInRasProps.NoDataValue)) And _
                   (Not CDbl(pInPixelBlock.GetVal(0, i, j + 1)) = CDbl(pInRasProps.NoDataValue)) And _
                   (Not CDbl(pInPixelBlock.GetVal(0, i - 1, j + 1)) = CDbl(pInRasProps.NoDataValue)) And _
                   (Not CDbl(pInPixelBlock.GetVal(0, i - 1, j)) = CDbl(pInRasProps.NoDataValue)) Then


                    'vSafeArray(i, j) = CDbl(pInPixelBlock.GetVal(0, i, j)) + (CDbl(pInPixelBlock.GetVal(0, i - 1, j - 1)) - CDbl(pInPixelBlock.GetVal(0, i + 1, j + 1)))
                    OutputPixelValue = _
                        (CDbl(pInPixelBlock.GetVal(0, i, j)) - CDbl(pInPixelBlock.GetVal(0, i - 1, j - 1))) ^ 2 + _
                        (CDbl(pInPixelBlock.GetVal(0, i, j)) - CDbl(pInPixelBlock.GetVal(0, i, j - 1))) ^ 2 + _
                        (CDbl(pInPixelBlock.GetVal(0, i, j)) - CDbl(pInPixelBlock.GetVal(0, i + 1, j - 1))) ^ 2 + _
                        (CDbl(pInPixelBlock.GetVal(0, i, j)) - CDbl(pInPixelBlock.GetVal(0, i + 1, j))) ^ 2 + _
                        (CDbl(pInPixelBlock.GetVal(0, i, j)) - CDbl(pInPixelBlock.GetVal(0, i + 1, j + 1))) ^ 2 + _
                        (CDbl(pInPixelBlock.GetVal(0, i, j)) - CDbl(pInPixelBlock.GetVal(0, i, j + 1))) ^ 2 + _
                        (CDbl(pInPixelBlock.GetVal(0, i, j)) - CDbl(pInPixelBlock.GetVal(0, i - 1, j + 1))) ^ 2 + _
                        (CDbl(pInPixelBlock.GetVal(0, i, j)) - CDbl(pInPixelBlock.GetVal(0, i - 1, j))) ^ 2

                    'Take the square root of the summed squared differences
                    '(Riley et al. 1999; the original line raised the sum to
                    'the power 2, which double-squares the index)
                    OutputPixelValue = OutputPixelValue ^ 0.5
                    If OutputPixelValue > 5000 Then
                        OutputPixelValue = 5000
                    End If

                    Select Case OutputPixelValue
                        Case 0 To 80
                            OutputPixelValue = 1
                        Case 81 To 116
                            OutputPixelValue = 2
                        Case 117 To 161
                            OutputPixelValue = 3
                        Case 162 To 239
                            OutputPixelValue = 4
                        Case 240 To 497
                            OutputPixelValue = 5

                        Case 498 To 958
                            OutputPixelValue = 6
                        Case 959 To 5000
                            OutputPixelValue = 7
                        Case Else
                            OutputPixelValue = 0
                    End Select

vSafeArray(i, j) = OutputPixelValue

                Else
                    vSafeArray(i, j) = CDbl(pOutRasProps.NoDataValue)
                End If
            Else
                'vSafeArray(i, i) = CDbl(pOutRasProps.NoDataValue)
            End If
        Next j
    Next i

    'Write out the result
    pOutRawPixel.Write pPnt, pOutPixelBlock
End Sub
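The seven-class breakdown applied by both scripts can be expressed compactly. The Python helper below is illustrative only (the function name and list are ours); note that it treats the breaks as continuous, whereas the VB macro's integer Case ranges would send a fractional value such as 80.5 to the Case Else class of 0.

```python
# Upper bounds (meters) of the seven TRI classes from Riley et al. (1999);
# both scripts cap the index at 5000 before classifying.
TRI_BREAKS = [80, 116, 161, 239, 497, 958, 5000]

def classify_tri(value):
    """Map a TRI value to its ruggedness class, 1 (level terrain)
    through 7 (extremely rugged)."""
    v = min(value, 5000)  # cap, as in both scripts
    for cls, upper in enumerate(TRI_BREAKS, start=1):
        if v <= upper:
            return cls
    return 7  # unreachable after the cap, kept for safety
```

For example, a TRI of 300 m falls in the 240-497 m range and classifies as 5, a moderately rugged surface.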


PERMISSION TO COPY

In presenting this thesis in partial fulfillment of the requirements for a master's degree at Texas Tech University or Texas Tech University Health Sciences Center, I agree that the Library and my major department shall make it freely available for research purposes. Permission to copy this thesis for scholarly purposes may be granted by the Director of the Library or my major professor. It is understood that any copying or publication of this thesis for financial gain shall not be allowed without my further written permission and that any user may be liable for copyright infringement.

Agree (Permission is granted.)

Matthew Allan Crawford


Student Signature

05/01/2008
Date

Disagree (Permission is not granted.)

_______________________________________________ Student Signature

_________________ Date
