Abstract

The use of unmanned aerial system (UAS)-based remote sensing methods in precision agriculture (PA) has developed rapidly in recent years. These technologies are expected to revolutionize crop management by capturing imagery with high spatial, temporal, and spectral resolution, thereby enabling decisions about farm inputs at the sub-field level and on a near-daily basis. In real-world operational applications, however, the potential of UAS-based remote sensing methods has not yet been fully exploited. One of the main research avenues is the structural characterization of crops, i.e., assessing plant density and leaf density as indicators of overall crop health and, ultimately, crop yield. Using a UAS-based imaging system, we concurrently collected multi-source imagery data, employing structure-from-motion (SfM; photogrammetry) and light detection and ranging (LiDAR) point clouds to observe snap bean fields across two years. We hypothesized that the 3D point clouds encode essential structural information about the crop and that, by extracting various features from the oversampled (dense) 3D data, we could retrieve critical structural characteristics of the crops and eventually relate them to high-level objectives, including disease risk and yield modeling. We further explored the effectiveness of feature-level data fusion between LiDAR point clouds and multispectral imagery (MSI), coupled with machine learning algorithms, for yield modeling and disease detection applications. We found that both SfM and LiDAR point clouds achieved similarly high accuracies for assessment of crop height (CH) and row width (RW) (RMSE of ~0.02 m for CH and ~0.05 m for RW). For measuring the leaf area index (LAI), the LiDAR-derived models achieved the highest accuracy (R² = 0.61, nRMSE = 19%), while the SfM-derived models exhibited slightly lower values, with a predicted R² ≈ 0.5 and nRMSE ≈ 22%.
We found that the fusion of LiDAR and MSI data yielded good results for prediction of snap bean yield, with an adjusted R² = 0.827 and nRMSE = 9.4%. This work demonstrates the potential of 3D point cloud data in PA applications and the performance of a UAS-based remote sensing system for monitoring short broadacre crops, such as snap bean.
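To illustrate the kind of point-cloud feature extraction the abstract refers to (this is a minimal sketch, not the dissertation's actual pipeline), a plot-level crop height estimate can be derived from height-above-ground values by filtering out near-ground returns and taking a high percentile of the remaining canopy points; the threshold and percentile here are hypothetical choices:

```python
import numpy as np

def crop_height(z_agl, ground_thresh=0.05, pct=95):
    """Estimate crop height (m) from height-above-ground values of a
    plot's point cloud.

    A high percentile of the above-ground points is used rather than
    the maximum, to suppress spurious high returns (e.g., birds, noise).
    """
    z = np.asarray(z_agl, dtype=float)
    canopy = z[z > ground_thresh]          # drop near-ground returns
    if canopy.size == 0:
        return 0.0                         # bare-soil plot
    return float(np.percentile(canopy, pct))

# Synthetic example: ground returns near 0 m, canopy around 0.4 m.
rng = np.random.default_rng(0)
pts = np.concatenate([
    rng.normal(0.00, 0.01, 500),   # ground
    rng.normal(0.40, 0.03, 500),   # canopy
])
print(f"Estimated crop height: {crop_height(pts):.2f} m")
```

Row width and LAI proxies can be sketched in a similar spirit (e.g., horizontal extent of canopy-classified points, or return-density ratios), though the models reported in the abstract are regression-based and trained against field reference measurements.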

Publication Date

10-27-2022

Document Type

Dissertation

Student Type

Graduate

Degree Name

Imaging Science (Ph.D.)

Department, Program, or Center

Chester F. Carlson Center for Imaging Science (COS)

Advisor

David Ross

Advisor/Committee Member

Jan Van Aardt

Advisor/Committee Member

Carl Salvaggio

Comments

This dissertation has been embargoed. The full-text will be available on or around 11/30/2023.

Campus

RIT – Main Campus
