DEEP PHENOTYPING CONSIDERING TILE DRAINAGE FROM UAS-BASED MULTISPECTRAL IMAGERY BY CONVOLUTIONAL NEURAL NETWORKS

Subsurface agricultural tile lines can greatly impact plant phenotypic characteristics through spatial variation in soil moisture, plant nutrients, and plant rooting depth. Therefore, the location of subsurface tile lines plays a critical role in supporting above-ground plant phenotyping and needs to be considered in plant phenotyping analysis. Unmanned Aerial Systems (UAS) imagery together with deep learning methods can establish strong relations between vegetation spectra and soil parameters. Here, we consider the capability of deep convolutional neural networks (CNN) to evaluate crop quality based on biomass production derived from soil moisture differences, using UAS-based multispectral imagery over soybean breeding fields. Results are still being evaluated, with particular attention to the temporal and spatial resolution of the data required to apply our approach.


INTRODUCTION
Recent advances in sensor technology have created great opportunities for UAS (Unmanned Aerial Systems) as a low-cost platform to derive high-throughput and precise quantitative phenotyping datasets (Araus et al., 2018). Regarding precision farming, UAS have the ability to efficiently capture data at high spectral, temporal and spatial resolution (Holman et al., 2016). This is their greatest advantage over satellite or airborne platforms, where the temporal resolution cannot be adjusted as easily and the spatial resolution is coarser. Compared with multispectral LiDAR (Light Detection and Ranging), the lower cost of photogrammetry is the key factor, as well as the lighter weight of the sensor to be carried onboard. Another vital characteristic underlying the suitability of close-range remote sensing for vegetation analysis is that its procedures are non-destructive and non-invasive, while providing accuracy similar to destructive field methods (Herrero-Huerta et al., 2018). UAS and powerful image analysis algorithms allow plant breeders to measure phenotypic variability; thus, soil properties can be estimated with effective image pre-processing. One significant property is soil moisture, which is clearly affected by subsurface tile lines (tile drainage pipes). These tile lines are extensively installed in agricultural fields of the Midwestern U.S. to remove excess surface water and are made of clay, concrete or plastic pipes. Tile drainage also facilitates early access to the farmland for conducting timely field operations.

Deep learning methods offer more advanced statistical techniques for phenotyping qualitative states than regression algorithms built on vegetation indices (Gholizadeh and Rahman, 2015). This means that they have the representational capacity to learn complex models of plant phenotypes (Ubbens and Stavness, 2017). However, their robustness depends on the quantity and quality of the training data. Of the deep learning algorithms, CNNs are often employed to find patterns where the input data exhibits local connectedness, such as spatially local features in images (Ubbens and Stavness, 2017). Thereby, the goal of this study is to generate an approach to assess crop quality, measured by biomass production, derived from soil moisture variations based on the tile line locations in the field, from UAS imagery by CNN.
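To illustrate the spatially local connectedness that makes CNNs suitable for image data, the following is a minimal NumPy sketch of a single convolution layer with a ReLU activation. It is a toy example, not the network used in this study; the band, kernel and function names are illustrative.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Single-channel 2D convolution ('valid' padding), the core CNN operation:
    each output value depends only on a local, kernel-sized neighbourhood."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Rectified linear activation, as applied after each layer."""
    return np.maximum(x, 0.0)

# Toy 6x6 "band" with a vertical edge between columns 2 and 3; an edge
# kernel responds strongly along the edge and is zero elsewhere,
# i.e. it detects a spatially local feature.
band = np.zeros((6, 6))
band[:, 3:] = 1.0
edge_kernel = np.array([[-1.0, 1.0],
                        [-1.0, 1.0]])
feature_map = relu(conv2d_valid(band, edge_kernel))
```

The feature map is non-zero only in the column straddling the edge, which is exactly the locality property the text describes: each trainable kernel learns a pattern that is detected wherever it occurs in the image.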

MATERIALS AND METHODS
The equipment used for the data acquisition is listed below:
- A four-narrowband passive sensor: the Parrot Sequoia multispectral camera with the incorporated Sunshine sensor to radiometrically calibrate the images. The camera specifications are defined in Table 1. The filter wavelengths are specifically tuned to evaluate the spectral behavior of vegetation while avoiding regions of atmospheric absorption. Table 2 shows the channel specifications.
- A general-purpose GER 1500 spectroradiometer to acquire spectral measurements of the calibration targets. The main technical specifications of the spectroradiometer are shown in Table 3.
- The senseFly eBee, a fixed-wing UAS designed for precision agriculture applications with incorporated GPS, IMU and magnetometer. It weighs 700 g and carries a payload of 150 g. The multispectral camera is controlled by the senseFly eBee autopilot during the flight (Figure 1).
AGB samples were destructively collected the day after the UAS flight by cutting soybean stems from 1 m of row length in each of two neighboring rows per plot, roughly 2 cm above the ground surface. These samples were processed in a drying oven at 60.0 °C until their weights stabilized, then weighed, and the values were aggregated per plot.
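As an illustrative sketch of the per-plot AGB computation described above: the two dried row-sample weights are averaged and scaled by the sampled ground area. The function name and the row spacing value are assumptions for the example, not values from the paper.

```python
def plot_agb_g_per_m2(dry_weights_g, row_length_m=1.0, row_spacing_m=0.76):
    """Per-plot above-ground biomass in g/m2 from dried row samples.

    Mean dry weight of the 1-m row samples, divided by the ground area
    each sampled row represents (row length x row spacing). The 0.76 m
    row spacing is an illustrative assumption, not from the paper.
    """
    sampled_area_m2 = row_length_m * row_spacing_m
    mean_weight_g = sum(dry_weights_g) / len(dry_weights_g)
    return mean_weight_g / sampled_area_m2

# Example: two dried samples of 180 g and 200 g from neighboring rows
agb = plot_agb_g_per_m2([180.0, 200.0])
```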

Figure 1. UAS platform: senseFly eBee
The ground truth data on the location of the tile lines was produced by Rahmani and Schulze (2020). Aerial imagery was used to manually locate tile lines based on the spectral differences between wet and dry soil: soil over tile lines dries faster than soil between tiles, which causes a higher reflectance in the visible and near-infrared regions of the electromagnetic spectrum. Next, the tile probe method was carried out to ground-validate the mapped tile lines. Since the pipes are 10 cm in diameter, the ground was probed every 7 cm. Once tile lines were identified by probing, their locations were recorded with real-time kinematic GPS. On average, tile lines were predicted within ±1.23 m spatial accuracy (Rahmani and Schulze, 2020).
For the photogrammetric processing, the Pix4Dmapper software package (Pix4D SA, Lausanne, Switzerland) was employed, obtaining a georeferenced and radiometrically calibrated orthomosaic per band. Next, a deep learning image analysis was carried out with Convolutional Neural Networks using the TensorFlow and TFLearn libraries, coded in Python. The CNN architecture is designed as layers with discrete trainable parameter settings. The output of each layer is passed through a non-linear activation such as the sigmoid or ReLU function. Finally, the network outputs a score that represents the semantic class label of the input data (Namin et al., 2018). In our application, we fed the CNN with pixel values per band of the multispectral imagery, weighted by the Euclidean distance to the underground tiles. The 'label' data used to train the network is the biomass measured per plot, which is directly related to crop quality. The testing split is 20% of the data. The goal is to quantify the crop quality affected by soil moisture differences based on the tile line locations within the study field.
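The input construction described above can be sketched in NumPy as follows. The function names, the inverse-distance weighting form, and the example coordinates and reflectances are illustrative assumptions; the paper does not specify the exact weighting function.

```python
import numpy as np

def distance_to_nearest_tile(pixel_xy, tile_points_xy):
    """Euclidean distance (m) from a pixel to the closest surveyed tile-line point."""
    return np.linalg.norm(tile_points_xy - pixel_xy, axis=1).min()

def weighted_band_features(pixel_xy, band_values, tile_points_xy):
    """Weight the band reflectances by tile-line distance.

    The inverse-distance form 1/(1+d) is an illustrative choice: pixels
    directly over a tile keep full weight, far pixels are attenuated.
    """
    dist = distance_to_nearest_tile(pixel_xy, tile_points_xy)
    return band_values * (1.0 / (1.0 + dist))

tiles = np.array([[0.0, 0.0], [0.0, 10.0]])   # surveyed tile-line points (m)
pixel = np.array([3.0, 4.0])                  # pixel ground position (m)
bands = np.array([0.05, 0.08, 0.30, 0.45])    # example G, R, RE, NIR reflectance
features = weighted_band_features(pixel, bands, tiles)

# 80/20 train/test split of plot indices, matching the 20% testing rate
rng = np.random.default_rng(0)
idx = rng.permutation(900)                    # 900 plots in the study field
train_idx, test_idx = idx[:720], idx[720:]
```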

EXPERIMENTAL RESULTS AND DISCUSSION
The experiment was performed at the Agronomy Center for Research and Education (ACRE) during the 2018 growing season in West Lafayette (Indiana, USA). The study area has an extent of 282.4 × 109.5 m², consisting of 900 plots with 6 rows each. The flight was performed in autonomous flying mode on July 23rd (DAP 61), with 75% overlap at an altitude of 65 m. Six GCPs were measured with GNSS, using the RTKNAVI software (Takasu, 2009). The tiles in the study site are of the concrete type. Figure 1 shows the study field with tile lines overlaid in red.
The multispectral data captured is illustrated in Figure 2. One issue to overcome is that deep learning models are susceptible to overfitting in the case of small datasets, or of large datasets with an insufficient level of variation (Krizhevsky et al., 2012).
To robustly analyze the model, several statistical metrics were calculated: the coefficient of determination (R²), the root mean square error (RMSE), the relative RMSE (RRMSE), the average systematic error (ASE) and the mean percent standard error (MPSE). These metrics were computed as follows:

$R^2 = 1 - \frac{\sum_{i=1}^{n}(x^{r}_{i} - x^{AGB}_{i})^2}{\sum_{i=1}^{n}(x^{AGB}_{i} - \bar{x}^{AGB})^2}$

$RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(x^{r}_{i} - x^{AGB}_{i})^2}$, $\quad RRMSE = \frac{RMSE}{\bar{x}^{AGB}} \times 100$

$ASE = \frac{1}{n}\sum_{i=1}^{n}(x^{r}_{i} - x^{AGB}_{i})$, $\quad MPSE = \frac{100}{n}\sum_{i=1}^{n}\frac{|x^{r}_{i} - x^{AGB}_{i}|}{x^{AGB}_{i}}$

where $x^{r}_{i}$ is the predicted AGB of the i-th plot, $x^{AGB}_{i}$ is the measured AGB within the i-th plot, $\bar{x}^{AGB}$ is the mean of the measured AGB, and n is the number of plots in the testing dataset. Table 4 shows the error metrics reached. At this point, results are still being evaluated, with particular attention to the temporal and spatial resolution of the data required to apply our approach.
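As a quick numerical check, the five metrics can be computed with NumPy using their conventional definitions; the function name and the small example values are illustrative only.

```python
import numpy as np

def error_metrics(predicted, measured):
    """R2, RMSE, RRMSE, ASE and MPSE between predicted and measured AGB,
    using the conventional definitions of these metrics."""
    predicted = np.asarray(predicted, dtype=float)
    measured = np.asarray(measured, dtype=float)
    resid = predicted - measured                 # per-plot residuals
    mean_meas = measured.mean()
    rmse = np.sqrt(np.mean(resid ** 2))
    return {
        "R2": 1.0 - np.sum(resid ** 2) / np.sum((measured - mean_meas) ** 2),
        "RMSE": rmse,
        "RRMSE": 100.0 * rmse / mean_meas,       # RMSE relative to the mean, in %
        "ASE": np.mean(resid),                   # average systematic error (bias)
        "MPSE": 100.0 * np.mean(np.abs(resid) / measured),
    }

# Example with three hypothetical plots (predicted vs. measured AGB)
m = error_metrics([210.0, 250.0, 190.0], [200.0, 260.0, 200.0])
```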

OUTLOOK
In this study, we generate an approach to model biomass considering soil moisture differences, from UAS-based multispectral imagery over soybean breeding fields, by Convolutional Neural Networks. It gives a significant indication that soil moisture needs to be considered in crop genetic analysis. At the same time, the potential of UAS in phenotyping analysis is clearly demonstrated, specifically the power of high spatial, temporal and spectral resolution imagery as low-cost and reliable data. Additionally, more comprehensive studies are necessary, including studies at different dates during the soybean growing season. A comparison among different crop species, and quantification of how the tile lines affect biomass production, will be addressed as future work.
The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume XLIII-B3-2020, 2020 XXIV ISPRS Congress (2020 edition)