Remote sensing has long been used to monitor vegetation health and to detect change in rangeland ecosystems. Recently, small unmanned aerial systems (sUAS) have provided a tremendous improvement in image resolution and in the ability to discriminate surface characteristics. The purpose of this research is to compare the effectiveness of object-based versus spectral-based classification in distinguishing vegetation (species, total cover), percent bare ground, litter, and rock using very high resolution imagery acquired from sUAS (drones). Images were obtained from sagebrush and annual grasslands in central Nevada (west of Elko). Flight missions were flown 100 ft above ground level using automated flight paths, and individual images were processed into orthomosaics using Pix4D software. Features were classified using either supervised spectral classification (maximum likelihood) or object-based classification in eCognition. Ground-based measurements were collected in the field to compare rangeland structure with the output of each classification technique. Results indicate that very high resolution sUAS imagery can be used effectively to characterize rangeland ecosystems and to aid rangeland assessment and monitoring.
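The supervised maximum-likelihood approach mentioned above can be sketched as follows. This is a minimal illustration, not the authors' actual workflow: it assumes each cover class has approximately Gaussian band statistics, and the function names and synthetic two-band data are purely illustrative.

```python
import numpy as np

def fit_ml_classifier(X, y):
    """Estimate per-class Gaussian parameters from labeled training
    pixels. X is (n_pixels, n_bands); y holds class labels.
    Returns, per class: mean vector, inverse covariance, log-determinant."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        cov = np.cov(Xc, rowvar=False)
        params[c] = (Xc.mean(axis=0),
                     np.linalg.inv(cov),
                     np.linalg.slogdet(cov)[1])
    return params

def classify_ml(X, params):
    """Assign each pixel to the class with the highest Gaussian
    log-likelihood (equivalently, smallest penalized Mahalanobis distance)."""
    classes = sorted(params)
    scores = []
    for c in classes:
        mu, inv_cov, logdet = params[c]
        d = X - mu
        # log-likelihood up to a constant shared by all classes
        ll = -0.5 * (logdet + np.einsum('ij,jk,ik->i', d, inv_cov, d))
        scores.append(ll)
    return np.array(classes)[np.argmax(np.stack(scores), axis=0)]

# Illustrative use on synthetic two-band "pixels" for two cover classes
rng = np.random.default_rng(0)
bare = rng.normal([0.45, 0.30], 0.02, size=(200, 2))   # hypothetical bare-ground spectra
veg = rng.normal([0.10, 0.55], 0.02, size=(200, 2))    # hypothetical vegetation spectra
X = np.vstack([bare, veg])
y = np.array([0] * 200 + [1] * 200)
labels = classify_ml(X, fit_ml_classifier(X, y))
```

In practice the same per-pixel logic runs over every pixel of the orthomosaic, with training statistics drawn from analyst-digitized polygons; the object-based (eCognition) approach instead classifies segments using per-object statistics.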
Oral presentation and poster titles, abstracts, and authors from the Society for Range Management (SRM) Annual Meetings and Tradeshows, from 2013 forward.