Remote Sensing | Article

Banana Mapping in Heterogenous Smallholder Farming Systems Using High-Resolution Remote Sensing Imagery and Machine Learning Models with Implications for Banana Bunchy Top Disease Surveillance

Tunrayo R. Alabi 1, Julius Adewopo 2, Ojo Patrick Duke 3 and P. Lava Kumar 1,*

1 International Institute of Tropical Agriculture (IITA), Oyo Road, Ibadan PMB 5320, Nigeria
2 International Institute of Tropical Agriculture (IITA), Kacyiru, Kigali P.O. Box 1269, Rwanda
3 Department of Natural and Applied Sciences, TERI School of Advanced Studies, New Delhi 110070, India
* Correspondence: l.kumar@cgiar.org

Remote Sens. 2022, 14, 5206. https://doi.org/10.3390/rs14205206
Academic Editors: Anna Jarocińska, Adriana Marcinkowska-Ochtyra and Adrian Ochtyra
Received: 3 August 2022; Accepted: 13 October 2022; Published: 18 October 2022
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. Open access under the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

Abstract: Banana (and plantain, Musa spp.), in sub-Saharan Africa (SSA), is predominantly grown as a mixed crop by smallholder farmers in backyards and on small farmlands, typically ranging from 0.2 ha to 3 ha. The crop is affected by several pests and diseases, including the invasive banana bunchy top virus (BBTV, genus Babuvirus), which is emerging as a major threat to banana production in SSA. The BBTV outbreak in West Africa was first recorded in the Benin Republic in 2010 and has spread to the adjoining territories of Nigeria and Togo. Regular surveillance, conducted as part of the containment efforts, requires the identification of banana fields for disease assessment. However, small and fragmented production spread across large areas complicates the identification of all banana farms using conventional field survey methods, which are also time-consuming and expensive. In this study, we developed a remote sensing approach and machine learning (ML) models that can be used to identify banana fields for targeted BBTV surveillance.
We used medium-resolution synthetic aperture radar (SAR), Sentinel 2A satellite imagery, and high-resolution RGB and multispectral aerial imagery from an unmanned aerial vehicle (UAV) to develop an operational banana mapping framework by combining the UAV, SAR, and Sentinel 2A data with the Support Vector Machine (SVM) and Random Forest (RF) machine learning algorithms. The ML algorithms performed comparatively well in classifying the land cover, with a mean overall accuracy (OA) of about 93% and a Kappa coefficient (KC) of 0.89 for the UAV data. The model using fused SAR and Sentinel 2A data gave an OA of 90% and a KC of 0.86. The user accuracy (UA) and producer accuracy (PA) for the banana class were 83% and 78%, respectively. The BBTV surveillance teams used the banana mapping framework to identify banana fields in the BBTV-affected southwest Ogun State of Nigeria, which helped in detecting 17 sites with BBTV infection. These findings suggest that the prediction of banana and other crops in heterogeneous smallholder farming systems is feasible, with the precision necessary to guide BBTV surveillance over large areas in SSA.

Keywords: Musa; banana; plantain; smallholder farms; remote sensing; drones; machine learning; disease surveillance; banana bunchy top virus; Africa

1. Introduction

Banana (and plantain, Musa spp.) is an important staple food for nearly 100 million
people and supports rural livelihoods and food security in sub-Saharan Africa (SSA) [1]. The crop is predominantly grown as a semi-perennial mixed crop along field boundaries and in backyards by smallholder farmers whose farmland holdings typically range from 0.2 ha to 3 ha [2]. FAO estimates suggest that the crop is cultivated across 6.7 million hectares of land in SSA, representing 59.7% of the total global banana and plantain production area [3]. However, due to the low productivity of 6.7 t/ha, the total annual production of 48.5 million tonnes accounts for only 29.8% of the global output [3]. Cultivation under subsistence farming conditions, low soil fertility, and pests and diseases have contributed to this low productivity.

Out of the many diseases affecting banana production, the bunchy top disease caused by the banana bunchy top virus (BBTV, genus Babuvirus) has emerged as a major constraint on banana production in SSA [4]. Within the past decade, BBTV spread has been reported in at least six countries, with an outbreak in West Africa first reported in Benin in 2011 [5], in Nigeria in 2012 [6], and in Togo in 2018 [7]. Recently, BBTV has spread to East Africa, mainly Uganda [8] and Tanzania [9], indicating the further expansion of the virus across the continent. The ground-level scouting and extensive surveillance required to identify BBTV-infected banana stands across a large area are a tall order because of the fragmented nature of the plantations, which include backyards, unmanaged habitats, and abandoned plantations, and because many farms are difficult to access due to a lack of roads.
BBTV surveillance under such conditions requires local individuals to guide the survey teams to the banana fields. However, this approach has proven difficult and time-consuming, and it is often marked by the unintentional omission of banana farms, with implications for the representativeness of the incidence mapping and the full remediation of the disease within the target geography. This suggests a critical need for alternative, reliable approaches to rapidly map banana lands in order to support efficient BBTV surveillance, such as the use of high-resolution satellite imagery and unmanned aerial vehicles (UAVs) to map banana fields prior to disease scouting.

Satellite imagery has been used extensively in combination with machine learning (ML) models to map vegetation and crop types on local and regional scales [10–13]. For instance, MODIS satellite data are available at a spatial resolution of 250 m and are well suited for regional-scale mapping, whereas Landsat and Sentinel 1 and Sentinel 2 data possess an intermediate resolution (10–30 m) and have been widely used for landscape mapping [13,14]. Commercial satellite technologies, such as IKONOS, QuickBird, and WorldView multispectral imagery, have been successfully utilized for more detailed crop mapping in precision agriculture [15,16].

The successful application of remote sensing products depends on the data characteristics and the type of analytics applied. There are diverse ML techniques, comprising conventional models, such as Random Forest (RF), Support Vector Machine (SVM), decision trees (DT), and k-nearest neighbors (KNN), as well as deep learning and neural network models, such as the convolutional neural network (CNN) and the multi-layer perceptron (MLP). ML algorithms accept various input predictor data, without assumptions about the data distribution, in order to classify land cover types in remotely sensed imagery [1,17].
RF and SVM are the two models most commonly applied to cropland mapping in various contexts, owing to their ability to handle high-dimensional spectral data for crop and landcover classification [18–21]. Several studies have combined ML with satellite and drone imagery to classify crops [22,23]. For instance, ML algorithms and Sentinel 2A data were used to identify crops such as soybean, rice, and maize in the Jilin Province of China, with a high classification accuracy of >90% [20]. Multispectral Sentinel 2A data were used to classify crop types and accurately distinguish between crops on the landscape scale in India [24]. Airborne orthophotos, combined with object-based image analysis, were used to detect banana plants in order to aid the BBTV eradication program in Australia [25]. A deep learning model was trained on drone imagery acquired in the smallholder systems of Rwanda to identify maize, beans, and bananas [26]. A combination of UAV and other optical satellite imagery was used to detect bananas for banana disease surveillance in Benin and the Democratic Republic of Congo (DRC) [27]. Recently, UAV-based multispectral data were used for the automated detection of individual banana plants in monocultures in Australia [12].

Although these studies have demonstrated the utility of UAV and satellite imagery data for crop mapping, limitations associated with the spatial resolution persist, especially for smallholders' patchy farms in heterogeneous landscapes [28]. In the banana-growing areas of West Africa, the operational use of optical satellite imagery is limited by cloudy weather conditions, which degrade the quality of the images [29,30]. Persistent cloud cover during the cropping season prevents the acquisition of useable optical imagery that captures the crop phenology for accurate mapping.
On the other hand, synthetic aperture radar (SAR) remains usable throughout the crop cycle. SAR imagery shows potential because it is not affected by weather and clouds and can discriminate between crop structural and geometric features. Many studies have demonstrated an improved classification accuracy by integrating SAR imagery with optical data [30,31]. Yet, limited knowledge exists regarding the potential value of combining this technique with UAV-derived and Sentinel 2 imagery for land cover classification in complex smallholder farming systems. Therefore, this study was conducted to develop an operational banana mapping framework by combining UAV, SAR, and Sentinel 2 imagery with RF and SVM analytics to identify bananas in heterogeneous smallholder farming systems in SSA and to use the maps to guide rapid and efficient BBTV surveillance in SSA.

2. Materials and Methods

2.1. Study Area

This research was carried out in an area of about 32,500 ha in four local government areas (LGAs) of Ogun State, Nigeria, where BBTV occurrence was first recognized in 2012 [6] (Figure 1). The main tree and arable crops cultivated in the region include cocoa, oil palm, oranges, maize, cassava, banana/plantain, cowpea, and vegetables. Farmers predominantly practice intercropping or mixed cropping, and monocropping is rare. The land in this area includes evergreen lowland forest and deciduous woodland, with pockets of agricultural land. The topography varies from nearly flat to moderately high slopes, with a mean elevation of about 60 m above sea level (masl) and a mean gradient of 16%. The major soil types in this area are Lixisols, Nitisols, and Fluvisols [32]. The site is characterized by a sub-humid tropical climate with a mean annual rainfall of about 1200–1300 mm, a mean maximum temperature ranging from 31.2 to 32.0 °C, and a mean minimum temperature of 22.3 to 23.1 °C.
The rain starts around March and continues until the end of October [33].

2.2. UAV Data Acquisition

A senseFly eBee X fixed-wing UAV (senseFly, Cheseaux-Lausanne, Switzerland) was used to acquire ultra-high-resolution images of the seven geographic sites (Figure 1c). The UAV has a 116 cm wingspan and weighs between 1.1 and 1.4 kg. It is fitted with a Parrot Sequoia RGB and multispectral camera combined with four radiometric self-calibrating sensors (green: 530–570 nm, red: 640–680 nm, red edge: 730–740 nm, and near-infrared: 770–810 nm), an integrated irradiance sensor (sunshine sensor) to synchronize the irradiance values with the onboard GPS, an inertial measurement unit (IMU), and a magnetometer. The ancillary real-time kinematic (RTK) positioning satellite navigation tool was activated to enhance the precision of the position data derived from the satellite positioning systems during each UAV flight. RTK activation achieved an accuracy of about 3 cm without ground control points (GCPs). Preliminary surveys conducted in August 2020 identified seven UAV flight mission sites based on the banana cultivation diversity and intensity (Table 1). Subsequent flight missions were conducted between 9 and 12 December 2020, coinciding with the end of the rainy season, when the fields contained the most arable crops mature for harvest. Of the seven UAV flight areas, the largest was the Olokuta site (390 ha) and the smallest was Ipaja Road (117 ha) (Table 1).

Figure 1. (a) Geographic location of the study area in Nigeria. (b)
Map of four local government areas (LGAs) affected by the banana bunchy top virus (BBTV) in Ogun State, Nigeria, where the areas used for banana and BBTV assessment are indicated by a black rectangle. (c) Natural color composite of the Sentinel 2A map of the study area with markings of the seven UAV flight sites (red outlines): (1) Okoeye, (2) Olujere, (3) Erimi, (4) Olokuta, (5) Ipaja Road, (6) Ipaja Town, and (7) Igbeji.

Table 1. Details of the UAV flight sites, land area covered, and images collected during flight missions.

S. No. | Flight Site | Coverage (ha) | Number of Multispectral (MS) and RGB Images | Pixels of the Mosaiced MS Image (12 cm/Pixel)
1 | Okoeye | 140.7 | 5095 | 100,271,680
2 | Olujere | 234.3 | 6620 | 178,918,529
3 | Erimi | 132.5 | 5225 | 99,285,096
4 | Olokuta | 390.0 | 11,160 | 298,709,636
5 | Ipaja Road | 117.3 | 4125 | 75,646,024
6 | Ipaja Town | 243.9 | 6540 | 187,536,630
7 | Igbeji | 192.9 | 5265 | 142,513,653
Total | | 1451.50 | 44,030 | 1,082,881,248

Flight parameters similar to those described in Böhler et al. [34] were adopted, with lateral and longitudinal overlaps set at 60% and 80%, respectively, to ensure optimal UAV image overlap. The resolution per pixel ranged from 10 cm to 15 cm, while the flight altitude ranged from 74.3 m/AED to 106.1 m/AED. The same standard protocol was used for all the flight missions, and we took images of the physical radiometric targets before each flight. The eMotion software version 3.16 (senseFly, Cheseaux-Lausanne, Switzerland) was used for the flight planning and mission control. The UAV flew to pre-determined waypoints according to flight plans pre-programmed using the eMotion software (AgEagle Aerial Systems Inc., Wichita, KS, USA). The total area covered by the drone flight missions for the seven sites was about 1450 ha, and about 44,030 multispectral and RGB images were acquired during the various flight missions (Table 1).

2.3.
UAV Image and Ancillary Data Processing

All images collected during the UAV flight missions were processed using the eMotion flight data manager. The optimized images were further processed using the photogrammetric imagery processing software Pix4D Mapper (Pix4D SA, Lausanne, Switzerland) for image geo-tagging and the correction of terrain and platform distortions. The irradiance values obtained from the sunshine sensor were used to generate orthomosaics of the reflectance data [16]. The reflectance bands produced were mosaiced at a spatial resolution of 12 cm. A similar procedure was used to create ortho-rectified RGB mosaic images at a 2.5 to 3.5 cm spatial resolution. The digital terrain model (DTM) and digital surface model (DSM) were produced as ancillary data from the structure-from-motion (SfM) point cloud by Pix4D Mapper (Pix4D S.A., Prilly, Switzerland). The difference between the DSM and the DTM was used to obtain the above-ground object height raster, which was used to discriminate between objects such as trees, buildings, and crops [35].

2.4. Sentinel 1 Image and Preprocessing

Eight C-band (wavelength ~6 cm) interferometric wide (IW)-swath single- and dual-polarization images, acquired in all light and weather conditions by the European Space Agency Copernicus program Sentinel 1 SAR satellites 1A and 1B between June and December 2020, were downloaded from the source website (https://scihub.copernicus.eu/dhus/#/home (accessed on 25 March 2021)), and the ground range detected (GRD) level-1C products were used (Table 2). We selected eight corresponding SAR images of the banana-growing study area at an incidence angle (θ) of about 30.9–46° and processed each image using the Sentinel Application Platform (SNAP) software version 8.
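The above-ground height raster of Section 2.3 reduces to a per-pixel difference of the two Pix4D rasters (DSM minus DTM). A minimal NumPy sketch, with small arrays standing in for the exported rasters (illustrative only, not the authors' pipeline):

```python
import numpy as np

def object_height(dsm, dtm, nodata=-9999.0):
    """Per-pixel above-ground height: DSM (surface) minus DTM (terrain).
    Pixels flagged as nodata in either raster stay nodata."""
    dsm = np.asarray(dsm, dtype=float)
    dtm = np.asarray(dtm, dtype=float)
    height = dsm - dtm
    mask = (dsm == nodata) | (dtm == nodata)
    height[mask] = nodata
    # Negative heights (noise in the SfM point cloud) are clamped to zero.
    height[(~mask) & (height < 0)] = 0.0
    return height

# Toy 2x2 example: a 3 m canopy over flat 60 m terrain, one nodata pixel.
dsm = np.array([[63.0, 60.0], [60.5, -9999.0]])
dtm = np.array([[60.0, 60.0], [60.0, -9999.0]])
print(object_height(dsm, dtm))
```

The resulting height layer is what separates tall woody canopies (trees, banana) from low herbaceous cover at the same reflectance.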
First, we conducted the precise orbit determination (POD) using the orbit file and then performed the terrain correction using a subset of the image within the extent of the study area. After removing the thermal noise, radiometric calibration, geometric correction, and speckle filtering were performed [30,36]. The final image was converted from linear to dB (logarithmic) format and exported to GeoTIFF format for further analysis.

Table 2. Details of the Sentinel 1 SAR and Sentinel 2A data acquired between June and December 2020.

Sentinel 1 (SAR) parameters: azimuth resolution, 10 m; polarization, dual (VV-VH); mode, IW; incidence angle, ascending, 30.9–46°.
Dates of acquisition: 11 June 2020, 17 July 2020, 22 August 2020, 15 September 2020, 27 September 2020, 21 October 2020, and 8 December 2020.

Sentinel 2A: Band ID | Central Wavelength | Spatial Resolution (m)
1 (Coastal) | 0.443 µm | 60
2 (Blue) | 0.490 µm | 10
3 (Green) | 0.560 µm | 10
4 (Red) | 0.665 µm | 10
5 (Red Edge) | 0.705 µm | 20
6 (Red Edge) | 0.740 µm | 20
7 (Red Edge) | 0.783 µm | 20
8 (NIR) | 0.842 µm | 10
8A (NIR) | 0.865 µm | 20
9 (Water) | 0.940 µm | 60
10 (SWIR) | 1.375 µm | 60
11 (SWIR) | 1.610 µm | 20
12 (SWIR) | 2.190 µm | 20

VV-VH = vertical transmit–vertical receive and vertical transmit–horizontal receive; IW = interferometric wide; NIR = near-infrared; SWIR = short-wave infrared band.

2.5. Sentinel 2 Image and Preprocessing

The Sentinel 2A (S2A) images used were captured on 26 December 2020, the closest cloud-free temporal match to the UAV data obtained on 9–12 December 2020. S2A is one of two polar-orbiting satellites placed in the same sun-synchronous orbit, phased at 180° to each other. S2A has 13 spectral bands at three spatial resolutions (10, 20, and 60 m) (Table 2). Ten bands were used for the subsequent analysis, excluding bands 1, 9, and 10, which are intended for aerosol, water vapor, and cloud monitoring.
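The linear-to-dB conversion that closes the SAR preprocessing chain in Section 2.4 is a base-10 logarithmic transform. The study performed this step in SNAP; a minimal Python sketch of the same arithmetic:

```python
import numpy as np

def linear_to_db(sigma0_linear, floor=1e-6):
    """Convert calibrated backscatter from linear power to decibels.
    A small floor avoids taking the log of zero in shadowed pixels."""
    arr = np.maximum(np.asarray(sigma0_linear, dtype=float), floor)
    return 10.0 * np.log10(arr)

print(linear_to_db([1.0, 0.1, 0.01]))  # -> 0, -10, -20 dB
```

The dB scale compresses the large dynamic range of SAR backscatter, which makes the VV/VH features better behaved as classifier inputs.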
Using the Sen2Cor toolbox in SNAP, we performed an atmospheric correction and resampled the bands to a 10 m/pixel resolution using the bilinear interpolation method.

2.6. Processing of Vegetation Indices

Vegetation indices (VIs) are crucial in image analysis for crop identification and land cover classification [34–36]. VIs provide information beyond the original spectral reflectance bands to better discriminate vegetation and land cover types, capturing characteristics such as soil brightness, crop stress, water content, and crop chlorophyll [37,38]. We generated thirty VIs from the original spectral reflectance bands [35,36]. Due to the limited spectral characteristics of the UAV data, only eleven UAV-based VIs were computed (marked with asterisks in Supplementary Table S1). We used the spectralIndices function from the RStoolbox package within the R software (Vienna, Austria) to calculate all the VIs [39,40].

2.7. Image Classification

The image classification was implemented based on the integration of various features and data fusion. We assessed the performance of the ML classification models with regard to the spectral bands, the vegetation indices, and combinations of bands and indices derived from the UAV, S2A, and SAR data (Table 3); the numbers of predictor variables used for the different datasets are shown in Table 3. Geotagged photos were collected as reference field data during the UAV flight missions. In addition, on-screen digitization of the UAV RGB mosaics was performed with ArcGIS 10.7 software (ESRI, Redlands, CA, USA) [41] to obtain training polygons for the different crop and landcover classes.
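As an illustration of the index computation in Section 2.6, the normalized difference vegetation index (NDVI), one of the thirty VIs, can be sketched for a NIR/red reflectance pair. The study used RStoolbox in R; this Python analogue is illustrative only:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index from reflectance bands.
    eps guards against division by zero over dark pixels."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Dense canopy reflects strongly in NIR; bare soil does not.
print(ndvi([0.45, 0.20], [0.05, 0.15]))
```

Each VI band computed this way is stacked alongside the raw reflectance bands to form the predictor layers listed in Table 3.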
Training sample polygons for each UAV flight site were obtained for seven landcover and crop type classes: banana, cassava, maize, forest, grassland, buildings, and bare ground/dirt road. The crop type and landcover classifications were performed using the caret package [42] within the R software [43]. The ground truth data were split 70:30 into training and testing datasets using the createDataPartition function of the caret package.

Table 3. List of parameters used for the image classification.

Data Series | Data Combination | Abbreviation | Number of Predictor Variables Used
1 | UAV spectral bands and height | UAV-B | 5
2 | UAV spectral indices and height | UAV-VI | 12
3 | UAV spectral bands, indices, and height | UAV-BVI | 16
4 | UAV spectral bands and indices, excluding height | BVI-H | 15
5 | S2A spectral bands | S2B | 10
6 | S2A spectral indices | S2VI | 27
7 | S2A spectral bands and indices | S2BVI | 37
8 | SAR data | SAR | 16
9 | S2A spectral bands, indices, and SAR data | S2BVI-SAR | 53

2.8. Machine Learning Algorithms

Two commonly employed ML algorithms, RF and SVM, were explored for the banana and landcover classification in the target area using the UAV, Sentinel 2, and SAR data. The banana mapping workflow is shown in Figure 2.

Figure 2. Workflow of unmanned aerial vehicle (UAV), synthetic aperture radar (SAR), and Sentinel 2 data processing for banana mapping.

2.8.1.
Random Forest Classifier

The RF classification algorithm forms predictions by training several decision trees in parallel through bagging [44], a process of stepwise bootstrapping followed by aggregation [45]. The R caret package was used to implement the ML methods and systematically compare different algorithms. RF has two major parameters, namely mtry (the number of variables randomly sampled at each split) and ntree (the number of trees to grow), which are tuned to obtain an accurate model. Parameter tuning was executed in order to optimize the RF model performance within the R caret package; the tuning involved a 10-fold cross-validation repeated three times for each model. In addition, the ranger package was used for the RF model. The tuning was used to optimize the mtry, while the ntree was held constant at 500 and the minimum node size at 1. The optimal mtry was the value achieving the best accuracy and varied according to the number of input variables.
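Transposed from the R workflow above into Python's scikit-learn (an illustrative stand-in for caret/ranger, with synthetic data replacing the pixel-level band/index predictors), the RF tuning scheme can be sketched as follows. Here mtry corresponds to max_features, ntree to n_estimators (held at 500), and the minimum node size to min_samples_leaf:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic stand-in for the pixel samples (16 predictors, 3 classes).
X, y = make_classification(n_samples=400, n_features=16, n_informative=8,
                           n_classes=3, n_clusters_per_class=1, random_state=1)

# 70:30 stratified split, as done with caret's createDataPartition.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=1)

# 10-fold CV over candidate mtry values; ntree fixed at 500, node size 1.
grid = GridSearchCV(
    RandomForestClassifier(n_estimators=500, min_samples_leaf=1, random_state=1),
    param_grid={"max_features": [2, 4, 8]},
    cv=10, n_jobs=-1)
grid.fit(X_tr, y_tr)

pred = grid.predict(X_te)
print("best mtry:", grid.best_params_["max_features"])
print("OA:", round(accuracy_score(y_te, pred), 3),
      "KC:", round(cohen_kappa_score(y_te, pred), 3))
```

The held-out OA and Kappa computed here mirror the accuracy assessment of Section 2.9; the study repeated the cross-validation three times, which is omitted for brevity.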
The optimal mtryS VwMasis aacmhiaecvheinde wleaitrhni nthgem beethsot dacbcauserdaocyn tahnedle avranriinegdt haecocroyrodfisntagt itsoti ctsh,ew here decision boundaries accounting for the maximum separation between features are estab- number of input valrisiahebdle. sF.o r a two-feature problem, the margin or separation equates to the sum distances to the hyperplane from the closest points of the two features [46]. The points closest to the 2.8.2. Support Vectodre cMisiaocnhbinouen Cdalaryssairfeiecra l led support vectors. SVM works seamlessly for linearly sepa- rable classes. However, the kernel concept is introduced for non-linear cases. The kernel SVM is a machtrinanes floeramrns itnhegc mlasestehsoindto baahsiegdh eornd itmheen lseioanrntoinengh tahneceorthye olifn esatar tsiesptaicrasb, iwlithye[4r6e, 47]. decision boundaries accTohuenSVtiMngc ofomrp trhisees mfoauxricmomumo nsleypeamraptloioyned bkeetrwneeletynp fees:atthuerreasd aiarleb aessitsafbun‐ ction lished. For a two‐fe(aRtBuFr)ea pndrotbhleelmine, atrh,ep omlyanrogminia lo, ra nsdepsiagrmatoiiodnf uenqcutiaontess.  Ttoh itshsetu sduymus dedistthaenGceaus ssian to the hyperplane frroadmia tlhbeas cislokseernste lpfuonincttiso nofw tihthei ntwthoe Rfecaatruetrpesac [k4a6g]e.. TThweo ppoarianmtse tcelross, ethset ctoos tth(Ce ) and decision boundary are called support vectors. SVM works seamlessly for linearly separa‐ ble classes. However,  the kernel concept  is  introduced  for non‐linear cases. The kernel  transforms the classes into a higher dimension to enhance the linear separability [46,47].    The SVM comprises four commonly employed kernel types: the radial basis function  (RBF) and the linear, polynomial, and sigmoid functions. This study used the Gaussian  radial basis kernel function within the R caret package. Two parameters, the cost (C) and  gamma (γ), were tuned using a 10‐fold cross‐validation and repeated three times to select  the best model.    Remote Sens. 
2.9. Accuracy Assessment

We employed the confusionMatrix function in the caret package to assess the performance of the models. The procedure generated the overall accuracy (OA), producer accuracy (PA), user accuracy (UA), and Kappa coefficient (KC) required to evaluate the model performance and the accuracy of the resulting maps. The OA and KC are defined as:

OA = (TP + TN) / (TP + TN + FN + FP)

KC = 1 − (1 − OA) / (1 − Pr)

where TP and TN are the samples predicted as true positives and true negatives, FN and FP are false negatives and false positives, and Pr is the probability of chance agreement.

The Wilcoxon rank-sum test, a nonparametric alternative to the two-sample t-test, was performed to compare the RF and SVM models across all datasets [48]. The Wilcoxon signed-rank test with continuity correction was carried out using the R software. In addition, the Kruskal–Wallis rank sum test, followed by pairwise comparisons, was performed to identify the significant pairings among the accuracies of the different datasets [49].

Variable importance, identifying the features contributing significantly to the models' performance, was computed using the Boruta package in the R software [50]. This procedure wraps the RF algorithm and iteratively eliminates the features considered statistically insignificant for the classification performance [51]. Several studies have shown that the Boruta package is one of the most accurate feature selection methods [52,53].

2.10. Evaluation of the Banana Land Cover Maps for BBTV Surveillance

The utility of the banana land cover map generated by the model established in this study was assessed for the purpose of BBTV surveillance.
The GPS coordinates of the plantations evaluated in 2019 and 2020 were mapped onto the landcover map to identify banana fields in the vicinity of disease-affected plantations for the BBTV assessment. A survey team identified about 40 plantations for BBTV assessment by visually examining the plants for typical BBTV symptoms in August 2021. The data collected were used to map the virus-affected and unaffected plantations.

3. Results

3.1. UAV Classification Performance across Locations and Datasets with the Two ML Models

The classification performance parameters of the four UAV datasets, namely UAV-B, UAV-VI, UAV-BVI, and BVI-H, were compared across four UAV flight missions using the RF and SVM models (Tables 4 and 5). The overall accuracies (OAs) exceeded 89% (mean = 93%) for the datasets that included height for both classifiers. The Kappa coefficients (KCs) were equally high, with a mean of 0.89 and a range of 0.85 to 0.93 for the datasets that included the vegetation height. However, when height was excluded from the input features of the two ML models, the changes in the OA values were substantial: the OA values decreased by 8% to about 20% for the BVI-H dataset across all the locations compared to the other UAV-derived datasets. The location appeared to affect the performance of the models as well. For instance, the KC values for the RF classifier declined from 0.87 to 0.69 at Olokuta and from 0.91 to 0.75 at Igbeji (comparing UAV-BVI to BVI-H) (Table 4). We observed a similar declining trend in the OA and KC values using the SVM algorithm at all locations for the datasets in which height was excluded.

Table 4. Performance of the Random Forest (RF) and Support Vector Machine (SVM) models applied to UAV datasets in four sites in Nigeria.
Site | Metric | RF UAV-B | RF UAV-VI | RF UAV-BVI | RF BVI-H | SVM UAV-B | SVM UAV-VI | SVM UAV-BVI | SVM BVI-H
Olokuta | OA | 89.9 | 88.9 | 90.0 | 77.8 | 89.4 | 89.3 | 89.4 | 77.0
Olokuta | KC | 0.87 | 0.85 | 0.87 | 0.69 | 0.86 | 0.86 | 0.86 | 0.68
Igbeji | OA | 95.0 | 95.1 | 95.3 | 87.2 | 95.3 | 95.4 | 95.3 | 88.0
Igbeji | KC | 0.91 | 0.91 | 0.91 | 0.75 | 0.91 | 0.91 | 0.91 | 0.76
Ipaja Road | OA | 91.9 | 92.4 | 91.9 | 70.7 | 92.4 | 92.4 | 92.4 | 73.9
Ipaja Road | KC | 0.87 | 0.88 | 0.87 | 0.50 | 0.88 | 0.88 | 0.88 | 0.54
Ipaja Town | OA | 95.1 | 94.4 | 95.1 | 81.1 | 95.0 | 94.2 | 94.7 | 72.6
Ipaja Town | KC | 0.93 | 0.91 | 0.92 | 0.71 | 0.92 | 0.91 | 0.92 | 0.59
Mean | OA | 93.0 | 92.7 | 93.1 | 79.2 | 93.0 | 92.8 | 93.0 | 77.9
Mean | KC | 0.89 | 0.89 | 0.89 | 0.64 | 0.89 | 0.89 | 0.89 | 0.65

UAV spectral bands (UAV-B); UAV spectral indices (UAV-VI); spectral bands and spectral indices (UAV-BVI); spectral bands and indices excluding height (BVI-H). All datasets, except BVI-H, include crop height. OA = overall accuracy; KC = Kappa coefficient.

3.2. Class-Specific Classification Performance for the UAV Datasets Using the Two ML Models

Using the BVI dataset for the crop classes (banana, cassava, and maize), we obtained moderately high UA values, ranging from 77.4 to 83.3% for the RF model and from 54.1 to 74.7% for the SVM at the Olokuta site (Table 5). The range of the UA values at Ipaja Town also showed a moderately accurate prediction, varying from 69.3 to 91.8% using the RF classifier and from 52.8 to 91.4% for the SVM. Furthermore, the PA values across the crop classes exhibited a similar pattern for the two classifiers on the BVI dataset. The UA and PA values obtained for the banana crop were consistently higher than 69% for the RF classifier, outperforming the SVM classifier. Using the complete dataset (BVI), maize was the most accurately classified of the three crops, followed by cassava. For banana, the exclusion of the canopy height (BVI-H) decreased the UA values obtained from the RF model from 77.4 to 49.1% at Olokuta and from 69.3 to 14.1% at Ipaja Town.
Similarly, without the height (BVI-H), the UA values obtained for the SVM classifier also declined, from 74.7 to 35.4% at Olokuta and from 52.8 to 6.6% at Ipaja Town. Similar decreases were noted in the cassava class across all the reference locations. The inclusion of the UAV-estimated vegetation height data considerably improved the crop type classification in the study area.

Table 5. UAV user accuracy (UA) and producer accuracy (PA) for Random Forest (RF) and Support Vector Machine (SVM) model performance.

Site        Model  Metric  Dataset  Banana  Building  Cassava  Forest  Grassland  Maize  Bare Ground/Road
Olokuta     RF     UA      BVI      77.4    98.3      78.0     91.5    53.5       83.3   96.2
                           BVI-H    49.1    95.6      36.3     89.2    35.1       52.9   60.7
                   PA      BVI      70.9    99.3      72.0     88.8    71.7       83.1   96.5
                           BVI-H    61.8    88.2      62.6     72.4    55.6       61.1   82.2
            SVM    UA      BVI      74.7    97.6      71.5     91.0    61.4       54.1   96.5
                           BVI-H    35.4    90.1      34.9     94.8    27.6       44.5   75.2
                   PA      BVI      74.3    99.2      75.2     88.4    63.0       81.1   94.0
                           BVI-H    68.1    91.9      68.6     68.0    60.5       56.5   72.4
Ipaja Town  RF     UA      BVI      69.3    99.9      80.4     97.2    95.5       91.8   99.6
                           BVI-H    14.1    96.5      32.6     89.9    75.6       93.4   98.4
                   PA      BVI      77.5    100.0     81.9     97.3    94.6       91.9   99.8
                           BVI-H    36.6    96.3      61.2     81.2    80.8       80.7   98.6
            SVM    UA      BVI      52.8    99.7      79.6     97.2    95.7       91.4   99.7
                           BVI-H    6.6     97.8      31.3     86.6    55.2       97.1   70.7
                   PA      BVI      80.6    99.9      80.2     97.1    94.2       90.3   99.8
                           BVI-H    21.2    57.6      45.9     77.5    73.1       60.1   98.8

BVI = spectral bands and indices; BVI-H = spectral bands and indices excluding height. All datasets, except BVI-H, include crop height.

The two ML models performed well in predicting the three non-crop classes (buildings, forests, and bare ground/roads), with UA and PA values always higher than 95% based on the BVI dataset. Among the non-crop classes, grassland was the most difficult to classify. The UA and PA values ranged between 53.5% and 71.7% at Olokuta for the BVI dataset and declined further after excluding the vegetation height (BVI-H). The spectral profile of the UAV multispectral reflectance bands extracted for the major landcover types is presented in Supplementary Figure S2.

3.3. UAV RF and SVM Confusion Matrices by Crop Type and Other Land Use Types

The classification performance according to the class categories for the Olokuta site was representative of the other sites. Therefore, only the confusion matrix for this site is presented here (Table 6). For the crop classes (banana, cassava, and maize), misclassifications were uncommon with respect to the building and bare ground/road classes. However, the RF and SVM models often misclassified the forest class as bananas. This was not a surprise, considering that the cultivation of bananas often takes place under or near forest canopies with other tree crops, such as oil palm, citrus, cocoa, and evergreen and deciduous tree species. The prediction maps generated by the RF and SVM classifiers are comparable (Figure 3). The relative intensity of the banana presence per location corresponds to the ground-level observations during the field mission. Both the RF and SVM classifiers predicted a similar areal range of bananas at all sites. For the Igbeji site, the RF estimated the banana land area as 44.74 ha, while the SVM estimated it as 43.18 ha. At Ipaja Town, the RF estimated an area of 11.49 ha for the banana class, while the SVM estimated 11.06 ha (Table 7; Figure 3).

Table 6. Confusion matrices of RF and SVM for the complete UAV dataset at the Olokuta site in Nigeria.
Random Forest (RF)
                  Banana  Building  Cassava  Forest  Grassland  Maize  Bare Ground/Road  PA
Banana            18,832  598       203      3134    3532       8      241               0.71
Building          27      81,934    0        23      28         0      464               0.99
Cassava           62      0         6186     1060    1270       11     0                 0.72
Forest            4703    35        555      57,573  1737       0      249               0.89
Grassland         622     136       990      1097    8076       253    96                0.72
Maize             26      3         1        0       256        1551   29                0.83
Bare ground/road  45      680       0        43      184        38     26,987            0.96
UA                0.77    0.98      0.78     0.91    0.54       0.83   0.96
OA = 90.0, KC = 0.87

Support Vector Machine (SVM)
                  Banana  Building  Cassava  Forest  Grassland  Maize  Bare Ground/Road  PA
Banana            18,173  206       87       2527    3300       0      168               0.74
Building          16      81,418    0        6       56         0      570               0.99
Cassava           36      5         5677     984     845        1      0                 0.75
Forest            4822    133       1075     57,256  1364       0      93                0.88
Grassland         1150    331       1086     1959    9267       796    131               0.63
Maize             40      86        10       3       61         1007   35                0.81
Bare ground/road  80      1207      0        195     190        57     27,069            0.94
UA                0.75    0.98      0.72     0.91    0.61       0.54   0.96
OA = 89.4, KC = 0.86

PA = producer accuracy, UA = user accuracy, OA = overall accuracy, KC = Kappa coefficient.

Table 7. Estimated banana area (ha) based on UAV and Sentinel 2 + SAR data.

Site        UAV RF  UAV SVM  S2SAR RF  S2SAR SVM
Igbeji      44.7    43.2     14.7      13.0
Olokuta     55.3    63.3     59.8      66.3
Ipaja Road  10.7    7.7      7.4       7.8
Ipaja Town  11.5    11.1     22.7      24.8

UAV RF and S2SAR RF = Random Forest (RF) models of UAV and S2SAR data, respectively. UAV SVM and S2SAR SVM = Support Vector Machine (SVM) models of UAV and S2SAR data, respectively.
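The accuracy figures in Table 6 follow directly from the confusion matrix; a short check that recomputes UA, PA, OA, and the Kappa coefficient from the RF matrix above:

```python
# Recompute UA, PA, OA, and Kappa from the RF confusion matrix in Table 6.
# Rows = reference classes, columns = predicted classes, in the order:
# banana, building, cassava, forest, grassland, maize, bare ground/road.
import numpy as np

cm = np.array([
    [18832,   598,  203,  3134, 3532,    8,   241],
    [   27, 81934,    0,    23,   28,    0,   464],
    [   62,     0, 6186,  1060, 1270,   11,     0],
    [ 4703,    35,  555, 57573, 1737,    0,   249],
    [  622,   136,  990,  1097, 8076,  253,    96],
    [   26,     3,    1,     0,  256, 1551,    29],
    [   45,   680,    0,    43,  184,   38, 26987],
])

n = cm.sum()
diag = np.diag(cm)
pa = diag / cm.sum(axis=1)          # producer accuracy (per reference class)
ua = diag / cm.sum(axis=0)          # user accuracy (per predicted class)
oa = diag.sum() / n                 # overall accuracy
pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n**2   # chance agreement
kappa = (oa - pe) / (1 - pe)        # Kappa coefficient

print(f"banana UA={ua[0]:.2f} PA={pa[0]:.2f} OA={oa:.3f} KC={kappa:.2f}")
# → banana UA=0.77 PA=0.71 OA=0.900 KC=0.87
```

The recomputed values match the UA, PA, OA, and KC reported in the table.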
Figure 3. Model prediction maps for (a) Igbeji, (b) Olokuta, (c) Ipaja Road, and (d) Ipaja Town using Random Forest (1) and Support Vector Machine (2) at four UAV flight sites. The predicted banana crop area (ha) is indicated on the maps.

3.4.
Random Forest and Support Vector Machine Classification Performance for Different Sentinel 2A and SAR Datasets

Among the classification results achieved with the Sentinel 2A and SAR datasets, the S2BVI-SAR dataset performed best, with OA values of 89.8% and 89.0% for the RF and SVM, respectively (Table 8). This was followed by the classification performance based on the S2B datasets, with 88% and 86% OA values for the RF and SVM, respectively. Although the Kruskal–Wallis test suggested that the OA values were significantly different between the five Sentinel 1 and 2 datasets (p < 0.00424), the observed significance was mainly due to the OA differences between S2BVI-SAR and SAR. The RF classification algorithm generally outperformed the SVM for the five datasets with respect to the OA and KC, and the differences were significant according to the Wilcoxon signed-rank test (p < 0.005761). The lowest prediction performance was observed when only the SAR dataset was processed as an input to the ML models. Overall, the S2BVI-SAR dataset generated the best classification performance (OA = 89.8%). The classification accuracies per class varied, with UA values ranging from 100% for the water class to 55.7% for bananas. Using the S2BVI-SAR dataset, the banana class was the most accurately predicted among the three crop classes, with UA values of 83% and 74% and PA values of 77.7% and 72.8% for the RF and SVM classifiers, respectively. The cassava class under this dataset configuration was the second most successfully classified crop type. The SAR dataset generated low classification performance metrics. Specifically, the SVM classified the banana class poorly, as the UA decreased from 74% to 28% and the PA from 72% to 49%, relative to the accuracy achieved using the S2BVI-SAR dataset. The performance was better with the RF classifier, as the UA value decreased from 83% to 64.3%.
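The Kruskal–Wallis and Wilcoxon signed-rank comparisons reported in this section can be run with SciPy. The values below are hypothetical stand-ins, since the per-run accuracy samples behind the paper's p-values are not tabulated here:

```python
# Illustrative significance testing of classifier accuracies:
# Kruskal-Wallis across datasets, Wilcoxon signed-rank between paired RF/SVM OAs.
from scipy.stats import kruskal, wilcoxon

# Hypothetical per-run OA samples (%) for three of the datasets.
oa_s2b       = [88.1, 87.6, 88.4, 87.9, 88.2]
oa_sar       = [77.5, 76.8, 77.1, 77.9, 76.4]
oa_s2bvi_sar = [89.6, 90.1, 89.9, 89.5, 90.2]

h_stat, p_kw = kruskal(oa_s2b, oa_sar, oa_s2bvi_sar)
print(f"Kruskal-Wallis: H={h_stat:.2f}, p={p_kw:.4f}")

# Hypothetical paired RF vs. SVM OAs across datasets/sites.
oa_rf  = [88.0, 87.3, 87.6, 77.3, 89.8, 90.0]
oa_svm = [85.9, 85.9, 85.8, 74.1, 89.0, 88.9]
w_stat, p_w = wilcoxon(oa_rf, oa_svm)
print(f"Wilcoxon signed-rank: W={w_stat:.1f}, p={p_w:.4f}")
```

A small p-value from `kruskal` indicates that at least one dataset's accuracies differ; the paired `wilcoxon` test then checks whether one classifier consistently outperforms the other across the same datasets.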
Similar to the results of the UAV data, the confusion matrix (Supplementary Table S2) shows that banana was confused with the forest class. The spectral profiles of the Sentinel 2A multispectral reflectance bands for the major landcover types are presented in Supplementary Figure S3. Our models accurately delineated roads, bare ground, water, and built-up areas on the maps. Using the two maps (Figure 4), clusters of banana farms were identified around the northeastern and southeastern parts of the study region. Banana plantations were frequently detected as being concentrated around built-up areas, in correlation with our field-observed knowledge, as backyard banana farming was common in the study area. The estimated banana area predicted by the RF model was 2210 ha, slightly lower than the 2262 ha predicted by the SVM classifier for the entire study area of 32,500 ha (Figure 4). Maize and cassava were the two most essential arable crops in the study area and were scattered throughout the study sites. Both models performed accurately in the discrimination of crop and landcover types in the study area based on the visual inspection of both predicted maps. To illustrate the mapping accuracy, we conducted a comparison of the predictions of RF and SVM using the UAV and combined Sentinel 2A and SAR data in the Olokuta site, where banana production is high compared to the other sites, as assessed using UAV (Figure 5).

Table 8. User accuracy (UA) and producer accuracy (PA) of the Sentinel 2A and SAR integrated dataset experiments. OA = overall accuracy; KC = Kappa coefficient.
Dataset    Model  Metric  Banana  Building  Cassava  Forest  Grassland  Maize  Bare Ground/Road  Water  OA    KC
S2B        RF     UA      72.8    67.2      75.5     92.9    81.4       67.0   62.6              100    88.0  0.85
                  PA      76.0    85.1      75.5     91.9    80.3       62.4   60.4              100
           SVM    UA      51.5    64.1      66.0     93.2    79.2       70.2   60.4              100    85.9  0.82
                  PA      59.0    79.6      71.7     91.5    78.3       59.8   54.9              100
S2VI       RF     UA      76.1    68.0      70.5     93.9    79.4       58.7   61.9              100    87.3  0.84
                  PA      —       87.0      69.5     92.4    82.2       58.9   58.1              100
           SVM    UA      57.0    64.8      61.5     92.7    77.7       72.8   62.6              100    85.9  0.82
                  PA      60.4    83.0      66.5     90.9    80.2       58.2   60.4              100
S2BVI      RF     UA      74.5    64.1      75.0     93.6    79.0       61.8   63.3              100    87.6  0.84
                  PA      68.4    84.5      71.1     92.6    82.8       63.8   56.1              100
           SVM    UA      55.7    64.1      63.5     93.1    77.3       70.7   61.2              100    85.8  0.82
                  PA      60.6    80.4      69.8     90.7    79.4       58.2   57.0              100
SAR        RF     UA      64.3    46.9      65.5     84.6    48.8       54.5   24.5              100    77.3  0.70
                  PA      76.3    83.3      63.9     69.5    51.0       77.6   35.1              100
           SVM    UA      28.1    49.2      55.5     84.4    48.6       50.3   27.3              99.5   74.1  0.66
                  PA      49.3    62.4      53.4     67.4    51.7       70.6   35.5              99.9
S2BVI-SAR  RF     UA      83.0    68.0      78.5     93.7    81.8       76.4   65.5              100    89.8  0.87
                  PA      77.7    84.5      78.1     93.0    84.8       72.3   61.9              100
           SVM    UA      74.0    64.8      76.5     94.1    82.2       75.4   67.6              100    89.0  0.86
                  PA      72.8    80.6      73.2     92.9    86.4       64.0   63.6              100

Figure 4. Prediction maps of the combination of the optical and SAR datasets by Random Forest (left) and Support Vector Machine (right) models in the Idologun region, Ogun state, in Nigeria.

Figure 5. Comparison of the predictions of the Random Forest (RF) and Support Vector Machine (SVM) models for the UAV and Sentinel 2A data, fused with SAR data, in a high-density banana production site (Olokuta, Nigeria).
3.5. Use of a Banana Predictor Map for BBTV Surveys

The GPS coordinates of the locations of the banana farms investigated for BBTV occurrence between 2019 and 2020 were mapped onto the predicted map, developed by combining the optical and SAR datasets using the banana mapping workflow established in this study (Figure 6). The predicted map revealed several banana fields in the landscape, many of which were not investigated for BBTV during the previous surveys. Surveys conducted in about 40 new sites, selected using the banana predicted map (marked with black circles in Figure 6), in August 2021 revealed BBTV occurrence in 17 new sites (Table 9). This case illustrates the benefits of the predicted map for the rational planning of surveillance locations, including the identification of sites in the vicinity of the disease with the greatest need for the implementation of containment measures.

Table 9. Summary of banana field sites surveyed for the occurrence of BBTV in the Idologun region of Ogun State, Nigeria.
Year    Plantations with BBTV  Plantations without BBTV  Total Plantations
2021 *  17                     23                        40
2020    117                    93                        210
2019    37                     13                        50
Total   171                    129                       300

* Survey sites identified using the banana map generated from the mapping framework established in this study.

The ML models established in this study were inadequate for identifying BBTV-infected shoots using UAV or satellite imagery due to insignificant differences in the spectral profiles. The BBTV symptoms constitute the severe shortening of the pseudostem and petioles and the narrowing of the leaf lamina, with pale yellow margins (Figure 7). The infected shoots often coexist with a mix of asymptomatic or moderately symptomatic shoots, which are shrouded by the canopy of tall-growing shoots and escape ready detection by satellite and drone imagery (Figure 8). However, images of the symptomatic plants captured at the ground level clearly expose the symptoms (Figure 7), and capturing images from this ground angle is impossible with the UAV flight path used in the study.
Therefore, we limited our efforts to the accurate identification of the banana plants and the use of crop maps to guide the disease surveillance.

Figure 6. Prediction of banana and other landcover types on the sites of BBTV occurrence in the Idologun region in Nigeria. New survey areas for BBTV surveillance identified using the banana mapping framework developed in this study are shown in black circles.

3.6. Feature Importance of the Predictor Variables

The ranking of the predictor variables of RF (Supplementary Figure S1a–d) shows that the dual-polarization image acquired in November (NOV-VH) was the best predictor of the crop types and landcover in the study area, followed by the images obtained in August (AUG-VV and AUG-VH). The RED band was the most influential predictor when Sentinel 2A bands were processed as the input data for the RF model (Supplementary Figure S1b). The shortwave infrared bands (SWIR1 (1.55–1.75 µm) and SWIR2 (2.08–2.35 µm)) were the second-best predictive indicators for the crop type and landcover mapping of the study area. Following these were the GREEN and REDEDGE1 bands. Unexpectedly, the BLUE band ranked higher than the remaining two REDEDGE bands, probably due to water bodies in the study area.

Figure 7.
A few examples of banana bunchy top virus (BBTV)-infected banana mats in the farmers' fields and backyards in the study area, captured in December 2020. Pseudostems with typical BBTV symptoms are indicated in red circles, and asymptomatic/uninfected banana shoots are indicated in yellow rectangles. BBTV-symptomatic plants are severely stunted, with narrow leaf lamina, and resemble suckers (young side shoots emerging from the pseudostem base). Images were taken manually using a 12-megapixel RGB camera from the ground level.

Figure 8. UAV RGB images of banana fields captured at an altitude of ~100 m during UAV flight missions in December 2020 in the Olokuta site in the Idologun region of Ogun State, Nigeria. The BBTV-infected plants were identified based on the ground survey in December 2020. The locations of BBTV-infected plants are shown with red circles.
Supplementary Figure S1c presents the importance scores of the vegetation indices derived from the Sentinel 2A optical data as the input to the RF model. The two most significant features in classifying the landcover types in the study area were the two shortwave-based vegetation indices, the normalized burn ratio index (NBRI) and the normalized difference water index 2 (NDWI2). The third most important indicator was the modified chlorophyll absorption ratio index (MCARI), based on the red, red-edge, and green bands. This result was not unexpected, since the red, green, and red-edge bands held the top ranks among the influential Sentinel 2A bands. Closely following the first three ranks, the subsequent two vegetation indices substantially contributing to the model predictions were the soil-adjusted total vegetation index (SATVI) and the specific leaf area vegetation index (SLAVI). Both are similarly related to the shortwave and red bands. Firstly, these results demonstrate the significance of the two shortwave infrared bands (SWIR1 and SWIR2) included in the Sentinel 2A satellite. Secondly, the red band played a critical role in the model performance in the study area. Consequently, these bands' importance featured prominently in their indices. Finally, the normalized difference vegetation index (NDVI) was moderately beneficial to the land use and crop type classification in the study area.

Importance ranks among the UAV-data-derived features are shown in Supplementary Figure S1d. The vegetation height was the most significant layer that influenced the crop type prediction performance in the study area by a wide margin. Following this were the REDEDGE, NIR, and GREEN bands, in that order of importance. The most influential vegetation index, MCARI, derived from the REDEDGE, GREEN, and RED bands, ranked fifth among the key predictive indicators.
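The indices named above follow standard formulations from the literature. The sketch below computes a few of them from Sentinel 2A reflectance arrays; the band variable names are placeholders, and the authors' exact parameterizations may differ:

```python
# Common formulations of spectral indices discussed in Section 3.6,
# computed from Sentinel 2A reflectance arrays (values scaled to 0-1).
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red)

def nbri(nir, swir2):
    """Normalized burn ratio index (NIR vs. SWIR2, 2.08-2.35 um)."""
    return (nir - swir2) / (nir + swir2)

def ndwi2(nir, swir1):
    """Gao-type normalized difference water index (NIR vs. SWIR1)."""
    return (nir - swir1) / (nir + swir1)

def mcari(rededge1, red, green):
    """Modified chlorophyll absorption ratio index."""
    return ((rededge1 - red) - 0.2 * (rededge1 - green)) * (rededge1 / red)

# Toy 2x2 reflectance patches standing in for Sentinel 2A bands.
nir   = np.array([[0.45, 0.50], [0.40, 0.48]])
red   = np.array([[0.10, 0.08], [0.12, 0.09]])
green = np.array([[0.09, 0.10], [0.11, 0.10]])
re1   = np.array([[0.25, 0.28], [0.22, 0.27]])
swir1 = np.array([[0.20, 0.18], [0.22, 0.19]])
swir2 = np.array([[0.15, 0.12], [0.17, 0.13]])

print(ndvi(nir, red)[0, 0])   # dense green vegetation gives NDVI near 0.6-0.7
```

In a full pipeline, these per-pixel index rasters would be stacked with the raw bands to form the "bands + indices" feature sets evaluated in the tables.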
Surprisingly, the popular NDVI was not among the top ten crop type and landcover predictors in this study, as it ranked 12th. Recently, authors have noted the superior significance of other spectral indices compared to the traditionally known NDVI [26,54].

4. Discussion

Banana is a crop of social and economic importance in SSA. However, its production is adversely affected by many emerging pests and pathogens, such as BBTV, banana bacterial wilt, and Fusarium wilt tropical race 4. Timely and accurate surveillance is needed to prevent the spread of these emerging diseases in Africa. However, this requires a methodological workflow that can be used to identify and detect banana plantations in heterogeneous smallholder farming systems for targeted surveillance, risk prediction, and modeling. In this study, we explored UAV data and other satellite products, such as Sentinel 2A and SAR, to predict the location of banana crops, leveraging the ML classifier models.

4.1. Banana Detection with UAV Data Using RF and SVM Models

The multispectral UAV data classification showed that banana can be mapped accurately using ML models, alongside other landcover classes, such as buildings, cassava, forests, grassland, maize, and bare ground/roads. The classification outputs are reliable and usable, notwithstanding the different combinations of UAV data (including the vegetation indices, spectral bands, and crop heights derived from the UAV DSM and DTM). For instance, the mean OA, with the inclusion of the vegetation height, was 93% for both classifiers, while the KC was around 0.89. These results are consistent with previous research findings [26], which reported a high mean accuracy (86%) when identifying crop classes with deep neural networks and transfer learning using UAV-based imagery acquired in smallholder agricultural lands in Rwanda.
Similarly, a high OA value (97%) was reported after applying ML models combined with UAV and other satellite imagery products to detect banana plants in mixed, complex African landscapes [27]. Additionally, the convolutional neural network (CNN) technique was used to detect bananas in Thailand, with accuracies ranging from 75.8 to 96.4% [55]. Deep learning methods were used to extract apple tree crowns with UAV data, with 91 and 94% accuracies [56]. The accuracies obtained in our study are comparable to those of these studies. For instance, the OAs of the different variants of the UAV data varied from 89 to 95% for both RF and SVM, whereas the integration of Sentinel 2A and SAR resulted in OAs of 89.8% and 89.0% for RF and SVM, respectively.

Our findings suggest that any of the datasets could be utilized to achieve accurate UAV-based predictions of banana and other crop types in the diverse smallholder farming system in the study area. However, the processing times increased significantly as the number of bands increased for the model training and the prediction of very-high-spatial-resolution (10–15 cm) UAV data. Using the minimum dataset of UAV-B (four layers of UAV data and height) would likely be more efficient for the operational use of UAV data for banana plant detection compared to the UAV-VI (15 layers of spectral indices and canopy heights) or the combination of both spectral bands and VI (19 bands). However, ML models with only spectral bands and indices, without height data, failed to produce a reliable accuracy, as the OA and KC reduced significantly. The mean OA across the four sites decreased from around 93% to 78%, and the KC values likewise declined from 0.89 to about 0.64 when the vegetation height was excluded from the input features for the RF and SVM classifiers.
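The height feature whose removal causes this accuracy drop is typically obtained by differencing the photogrammetric surface and terrain models. A minimal sketch of a canopy height model (CHM), using toy arrays in place of the UAV-derived DSM and DTM rasters:

```python
# Canopy height model (CHM) as the difference between a digital surface
# model (DSM, top of canopy) and a digital terrain model (DTM, bare earth).
# Toy 3x3 elevation grids (meters) stand in for UAV photogrammetry rasters.
import numpy as np

dsm = np.array([[101.2, 103.5, 100.8],
                [102.9, 106.1, 101.0],
                [100.4, 100.5, 100.3]])
dtm = np.array([[100.1, 100.2, 100.3],
                [100.0, 100.1, 100.2],
                [100.4, 100.3, 100.2]])

chm = np.clip(dsm - dtm, 0, None)   # clamp small negative photogrammetric noise
print(round(float(chm.max()), 2))   # tallest canopy in the patch: 6.0 m
```

The CHM layer can then be appended to the spectral band and index stack as one more per-pixel feature for the classifiers.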
Furthermore, the performance of the two models showed no remarkable differences between the three datasets that incorporated height measurements, suggesting that using UAV multispectral bands and vegetation indices separately or in combination does not improve the classification accuracies, as demonstrated in this study. However, the integration of the height measurement into the UAV multispectral imagery significantly enhanced the performance of the two classifiers. Most of the previous studies that have used UAV-derived data for crop or land use classification have focused on spectral and vegetation indices, with little or no emphasis on the canopy height data. However, this study found that the canopy height data derived from the UAV DSM and DTM proved remarkably significant in improving the classification accuracy. Similar observations were reported in previous reports on the use of height and coverage indicators for crop growth monitoring [16,57]. Kedia et al. [54] reported that incorporating the UAV-derived canopy height feature considerably increased the OA from 80 to 93% while mapping invasive vegetation species in arid regions of the USA.

There is a noticeable variation in the canopy height of bananas compared to other crops and landcover categories in the study area. Such contrasts can influence the reflectance signals from the canopy, with pronounced discriminating features that can complement the color spectra of the vegetation. This supports the notion that structural features and spectral profiles are essential for banana detection and rapid mapping. This study's feature importance ranking shows that the most widespread vegetation index (NDVI) was not a critical discriminator of the crop and landcover types. This is likely due to the saturation of the red spectral band when the vegetation classes are at the peak green period during the cropping season [58].
Therefore, the other spectral indices, such as MCARI, SAVI, and GNDVI, based on a combination of the green, red-edge, and red spectral bands, were better predictors for the landcover classification.

4.2. Crop Type and Landcover Classification with Sentinel 2A and SAR Data

We successfully utilized the different spectral and vegetation indices of Sentinel 1 and 2 data to detect and classify bananas in heterogenous agro-ecological landscapes. By isolating the shortwave infrared, red, and green bands as critical spectral features for the crop classification in the study area, we narrowed down the relevant indices (such as the NBRI, SLAVI, SATVI, and NDWI2) for the prediction of crop/landcover classes. Several studies have highlighted the efficacy of the shortwave infrared bands in landcover discrimination [59]. SWIR-based indices are sensitive to vegetation structures and can highlight substantial dynamic changes [60]. Although all the S2A bands and indices tested were applicable to the classification of crops in the study, the NIR and associated indices were among the least important indicators.

The potential of high-resolution UAV-RGB aerial images for simultaneous banana localization and disease classification, with an accuracy of 90 to 99%, was demonstrated in Benin and DR Congo [27], which shows the feasibility of remote sensing approaches to disease detection in the field. However, the model developed was insufficient for identifying banana plantations with BBTV-symptomatic shoots using the UAV and SAR data. The canopy cover of healthy banana plants or weeds and wild plants often shrouds the severely stunted BBTV-infected shoots, leading to an insufficient resolution for infected plant detection based on aerial imagery. The detection of infected plants using UAV or SAR imagery may be possible in monoculture plantations due to the better exposure of symptomatic plants.
However, we could not test this hypothesis due to the lack of monoculture farms with BBTV infection in the study area. The best overall classification accuracy of the banana crops in this large area was achieved using optical Sentinel 2A and SAR data. A similar approach was used to classify winter wheat in southern China, with a 98% accuracy [51]. Regarding the model performance, the classification metrics of the RF classifiers were slightly better than those of SVM. This could be associated with the model's capacity to generate multiple paths with different variables (as tree ensembles) to optimize the prediction and discrimination within and between classes [52]. The application of these tools and the integration of S2A and SAR datasets provide a promising outlook for the monitoring of the banana production area in order to target relevant areas for banana disease surveillance on a regional or national scale. The banana mapping model developed aided in the selection of survey sites and guided the surveillance efforts for the early detection and eradication of BBTV in Togo [7], as well as the subsequent surveillance design used to verify that other banana production regions were free of BBTV.

5. Conclusions

In this study, we developed a mapping framework for banana detection in a smallholder complex system using UAV, Sentinel 2A, and SAR data. UAV images were used to create spectral orthomosaics and develop a digital surface model, a digital terrain model, and a canopy height model. From the UAV spectral features, we derived a suite of vegetation indices and developed the RF and SVM models with or without the canopy height in order to distinguish between banana, cassava, maize, and other landcover types. The RF and SVM models with vegetation height features performed with an average OA of 93%, while the model without the canopy height exhibited a much lower OA of 78%.
From this observation, we conclude that structural height features are essential for crop delineation using the UAV-based predictors. We used Sentinel 2A optical and SAR data to improve banana detection on a regional scale. We computed several vegetation indices and developed various RF and SVM models from the suite of resulting datasets. The SAR data alone resulted in a classification accuracy of around 76%, compared to the 90% accuracy achieved by integrating the optical and SAR data. These findings suggest that the prediction of banana, along with other crops, in mixed, complex smallholder systems is feasible, with a reasonable level of precision necessary to guide targeted BBTV surveillance. Further studies are necessary in order to improve the model capacity, so as to differentiate between BBTV-symptomatic and asymptomatic plantations.

Supplementary Materials: The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/rs14205206/s1. Refs. [61–82] are cited in the Supplementary Materials file.

Author Contributions: Conceptualization, methodology, and investigation, T.R.A. and P.L.K.; software, data curation, and analysis, T.R.A., J.A., O.P.D. and P.L.K.; original draft preparation, review, and editing, T.R.A., J.A. and P.L.K.; project administration and funding acquisition, P.L.K. All authors have read and agreed to the published version of the manuscript.

Funding: This work was supported by the CGIAR Research Program on Roots, Tubers, and Banana (CRP-RTB) and the CGIAR Plant Health Initiative, supported by the CGIAR Trust Fund Donors, and the University of Queensland Project on “BBTV mitigation: Community Management in Nigeria and Screening Wild Banana Progenitors for Resistance (OPP1130226)”, funded by the Bill & Melinda Gates Foundation (BMGF).

Data Availability Statement: Not applicable.
Acknowledgments: The authors thank Oviasuyi Taiwo of the Virology Unit of IITA, Ibadan, Nigeria, for his assistance with the drone flying missions in the field. The authors gratefully acknowledge the University of Queensland and the Bill & Melinda Gates Foundation for supporting the open-access fees through BMGF grant No. OPP1130226. We thank the anonymous reviewers for their insightful comments and suggestions that helped us to clarify the manuscript.

Conflicts of Interest: The authors declare no conflict of interest.

References
1. Brown, A.; Tumuhimbise, R.; Amah, D.; Uwimana, B.; Nyine, M.; Mduma, H.; Talengera, D.; Karamura, D.; Kuriba, J.; Swennen, R. Bananas and Plantains (Musa Spp.). In Genetic Improvement of Tropical Crops; Springer: Cham, Switzerland, 2017; pp. 219–240. [CrossRef]
2. UNPD. Household: Size and Composition 2018—Countries. Available online: https://population.un.org/household/#/countries/840 (accessed on 25 July 2022).
3. FAO. FAOSTAT. Available online: https://www.fao.org/faostat/en/#data/ (accessed on 25 July 2022).
4. Kumar, P.L.; Selvarajan, R.; Iskra-Caruana, M.L.; Chabannes, M.; Hanna, R. Biology, Etiology, and Control of Virus Diseases of Banana and Plantain. In Advances in Virus Research; Elsevier: Amsterdam, The Netherlands, 2015; Volume 91, pp. 229–269. [CrossRef]
5. Lokossou, B.; Gnanvossou, D.; Ayodeji, O.; Akplogan, F.; Safiore, A.; Migan, D.Z.; Pefoura, A.M.; Hanna, R.; Kumar, P.L. Occurrence of Banana Bunchy Top Virus in Banana and Plantain (Musa Spp.) in Benin. New Dis. Rep. 2012, 25, 13. [CrossRef]
6. Adegbola, R.O.; Ayodeji, O.; Awosusi, O.O.; Atiri, G.I.; Kumar, P.L. First Report of Banana Bunchy Top Virus in Banana and Plantain (Musa Spp.) in Nigeria. Plant Dis. 2013, 97, 290. [CrossRef] [PubMed]
7. Kolombia, Y.; Oviasuyi, T.; Ayisah, K.D.; Ale Gonh-Goh, A.; Atsu, T.; Oresanya, A.; Ogunsanya, P.; Alabi, T.; Kumar, P.L. First Report of Banana Bunchy Top Virus in Banana (Musa Spp.) and Its Eradication in Togo. Plant Dis.
2021, 105, 3312. [CrossRef] [PubMed]
8. Ocimati, W.; Tazuba, A.F.; Tushemereirwe, W.K.; Tugume, J.; Omondi, B.A.; Acema, D.; Were, E.; Onyilo, F.; Ssekamate, A.M.; Namanya, P.; et al. First Report of Banana Bunchy Top Disease Caused by Banana Bunchy Top Virus in Uganda. New Dis. Rep. 2021, 44, e12052. [CrossRef]
9. Shimwela, M.M.; Mahuku, G.; Mbanzibwa, D.R.; Mkamilo, G.; Mark, D.; Mosha, H.I.; Pallangyyo, B.; Fihavango, M.; Oresanya, A.; Ogunsanya, P.; et al. First Report of Banana Bunchy Top Virus in Banana and Plantain (Musa Spp.) in Tanzania. Plant Dis. 2022, 106, 1312. [CrossRef]
10. Pu, R.; Landry, S. A Comparative Analysis of High Spatial Resolution IKONOS and WorldView-2 Imagery for Mapping Urban Tree Species. Remote Sens. Environ. 2012, 124, 516–533. [CrossRef]
11. Hossain, M.D.; Chen, D. Segmentation for Object-Based Image Analysis (OBIA): A Review of Algorithms and Challenges from Remote Sensing Perspective. ISPRS J. Photogramm. Remote Sens. 2019, 150, 115–134. [CrossRef]
12. Aeberli, A.; Johansen, K.; Robson, A.; Lamb, D.W.; Phinn, S. Detection of Banana Plants Using Multi-Temporal Multispectral UAV Imagery. Remote Sens. 2021, 13, 2123. [CrossRef]
13. Karydas, C.; Dimic, G.; Filchev, L.; Chabalala, Y.; Adam, E.; Adem Ali, K. Machine Learning Classification of Fused Sentinel-1 and Sentinel-2 Image Data towards Mapping Fruit Plantations in Highly Heterogenous Landscapes. Remote Sens. 2022, 14, 2621. [CrossRef]
14. Tariq, A.; Yan, J.; Gagnon, A.S.; Khan, M.R.; Mumtaz, F. Mapping of Cropland, Cropping Patterns and Crop Types by Combining Optical Remote Sensing Images with Decision Tree Classifier and Random Forest. Geo-Spat. Inf. Sci. 2022. [CrossRef]
15. Mei, W.; Wang, H.; Fouhey, D.; Zhou, W.; Hinks, I.; Gray, J.M.; van Berkel, D.; Jain, M. Using Deep Learning and Very-High-Resolution Imagery to Map Smallholder Field Boundaries. Remote Sens. 2022, 14, 3046. [CrossRef]
16.
Maimaitijiang, M.; Sagan, V.; Sidike, P.; Daloye, A.M.; Erkbol, H.; Fritschi, F.B. Crop Monitoring Using Satellite/UAV Data Fusion and Machine Learning. Remote Sens. 2020, 12, 1357. [CrossRef]
17. Maxwell, A.E.; Warner, T.A.; Fang, F. Implementation of Machine-Learning Classification in Remote Sensing: An Applied Review. Int. J. Remote Sens. 2018, 39, 2784–2817. [CrossRef]
18. Rodriguez-Galiano, V.F.; Ghimire, B.; Rogan, J.; Chica-Olmo, M.; Rigol-Sanchez, J.P. An Assessment of the Effectiveness of a Random Forest Classifier for Land-Cover Classification. ISPRS J. Photogramm. Remote Sens. 2012, 67, 93–104. [CrossRef]
19. Saini, R.; Ghosh, S.K. Crop Classification on Single Date Sentinel-2 Imagery Using Random Forest and Support Vector Machine. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, XLII–5, 20–23.
20. Feng, S.; Zhao, J.; Liu, T.; Zhang, H.; Zhang, Z.; Guo, X. Crop Type Identification and Mapping Using Machine Learning Algorithms and Sentinel-2 Time Series Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 3295–3306. [CrossRef]
21. Sun, C.; Bian, Y.; Zhou, T.; Pan, J. Using of Multi-Source and Multi-Temporal Remote Sensing Data Improves Crop-Type Mapping in the Subtropical Agriculture Region. Sensors 2019, 19, 2401. [CrossRef]
22. Al-Najjar, H.A.H.; Kalantar, B.; Pradhan, B.; Saeidi, V.; Halin, A.A.; Ueda, N.; Mansor, S. Land Cover Classification from Fused DSM and UAV Images Using Convolutional Neural Networks. Remote Sens. 2019, 11, 1461. [CrossRef]
23. Iqbal, N.; Mumtaz, R.; Shafi, U.; Zaidi, S.M.H. Gray Level Co-Occurrence Matrix (GLCM) Texture Based Crop Classification Using Low Altitude Remote Sensing Platforms. PeerJ Comput. Sci. 2021, 7, e536. [CrossRef]
24. Gumma, M.K.; Tummala, K.; Dixit, S.; Collivignarelli, F.; Holecz, F.; Kolli, R.N.; Whitbread, A.M. Crop Type Identification and Spatial Mapping Using Sentinel-2 Satellite Data with Focus on Field-Level Information. Geocarto Int.
2020, 37, 1833–1849. [CrossRef]
25. Johansen, K.; Sohlbach, M.; Sullivan, B.; Stringer, S.; Peasley, D.; Phinn, S. Mapping Banana Plants from High Spatial Resolution Orthophotos to Facilitate Plant Health Assessment. Remote Sens. 2014, 6, 8261–8286. [CrossRef]
26. Chew, R.; Rineer, J.; Beach, R.; O’Neil, M.; Ujeneza, N.; Lapidus, D.; Miano, T.; Hegarty-Craver, M.; Polly, J.; Temple, D.S. Deep Neural Networks and Transfer Learning for Food Crop Identification in UAV Images. Drones 2020, 4, 7. [CrossRef]
27. Selvaraj, G.M.; Vergara, A.; Montenegro, F.; Alonso Ruiz, H.; Safari, N.; Raymaekers, D.; Ocimati, W.; Ntamwira, J.; Tits, L.; Omondi, A.B.; et al. Detection of Banana Plants and Their Major Diseases through Aerial Images and Machine Learning Methods: A Case Study in DR Congo and Republic of Benin. ISPRS J. Photogramm. Remote Sens. 2020, 169, 110–124. [CrossRef]
28. Hall, O.; Dahlin, S.; Marstorp, H.; Bustos, M.F.A.; Öborn, I.; Jirström, M. Classification of Maize in Complex Smallholder Farming Systems Using UAV Imagery. Drones 2018, 2, 22. [CrossRef]
29. Eberhardt, I.D.R.; Schultz, B.; Rizzi, R.; Sanches, I.D.A.; Formaggio, A.R.; Atzberger, C.; Mello, M.P.; Immitzer, M.; Trabaquini, K.; Foschiera, W.; et al. Cloud Cover Assessment for Operational Crop Monitoring Systems in Tropical Areas. Remote Sens. 2016, 8, 219. [CrossRef]
30. Abubakar, G.A.; Wang, K.; Shahtahamssebi, A.; Xue, X.; Belete, M.; Gudo, A.J.A.; Shuka, K.A.M.; Gan, M. Mapping Maize Fields by Using Multi-Temporal Sentinel-1A and Sentinel-2A Images in Makarfi, Northern Nigeria, Africa. Sustainability 2020, 12, 2539. [CrossRef]
31. Blaes, X.; Vanhalle, L.; Defourny, P. Efficiency of Crop Identification Based on Optical and SAR Image Time Series. Remote Sens. Environ. 2005, 96, 352–365. [CrossRef]
32. Batjes, N.H. Overview of Procedures and Standards in Use at ISRIC WDC-Soils; ISRIC—World Soil Information: Wageningen, The Netherlands, 2017.
Available online: https://www.isric.org/sites/default/files/isric_report_2017_01doi.pdf (accessed on 3 March 2022).
33. Fick, S.E.; Hijmans, R.J. WorldClim 2: New 1-Km Spatial Resolution Climate Surfaces for Global Land Areas. Int. J. Climatol. 2017, 37, 4302–4315. [CrossRef]
34. Böhler, J.E.; Schaepman, M.E.; Kneubühler, M. Crop Classification in a Heterogeneous Arable Landscape Using Uncalibrated UAV Data. Remote Sens. 2018, 10, 1282. [CrossRef]
35. Kucharczyk, M.; Hay, G.J.; Ghaffarian, S.; Hugenholtz, C.H. Geographic Object-Based Image Analysis: A Primer and Future Directions. Remote Sens. 2020, 12, 2012. [CrossRef]
36. Lee, J.; Pottier, E. Polarimetric Radar Imaging: From Basics to Applications; CRC Press: Boca Raton, FL, USA, 2009; pp. 1–398. Available online: https://www.taylorfrancis.com/books/mono/10.1201/9781420054989/polarimetric-radar-imaging-jong-sen-lee-eric-pottier (accessed on 12 February 2022).
37. Yeom, J.; Jung, J.; Chang, A.; Ashapure, A.; Maeda, M.; Maeda, A.; Landivar, J. Comparison of Vegetation Indices Derived from UAV Data for Differentiation of Tillage Effects in Agriculture. Remote Sens. 2019, 11, 1548. [CrossRef]
38. Sonobe, R.; Yamaya, Y.; Tani, H.; Wang, X.; Kobayashi, N.; Mochizuki, K. Crop Classification from Sentinel-2-Derived Vegetation Indices Using Ensemble Learning. J. Appl. Remote Sens. 2018, 12, 026019. [CrossRef]
39. Leutner, B.; Horning, N.; Schwalb-Willmann, J.; Hijmans, R.J. Tools for Remote Sensing Data Analysis—Package ‘RStoolbox’; CRAN, R Project: Vienna, Austria, 2019.
40. Suab, S.A.; Avtar, R. Unmanned Aerial Vehicle System (UAVS) Applications in Forestry and Plantation Operations: Experiences in Sabah and Sarawak, Malaysian Borneo. In Unmanned Aerial Vehicle: Applications in Agriculture and Environment; Avtar, R., Watanabe, T., Eds.; Springer: Cham, Switzerland, 2020. [CrossRef]
41. ESRI. ArcGIS Desktop: Release 10.7.1; Environmental Systems Research Institute: Redlands, CA, USA, 2019.
42.
Kuhn, M.; Wing, J.; Weston, S.; Williams, A.; Keefer, C.; Engelhardt, A.; Cooper, T.C.; Mayer, Z.; Kenkel, B.; Benesty, M.; et al. Package ‘Caret’—Classification and Regression Training, Version 6.0-93; 2022. Available online: https://cran.r-project.org/web/packages/caret/caret.pdf (accessed on 12 February 2022).
43. R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2020.
44. Misra, S.; Li, H. Noninvasive Fracture Characterization Based on the Classification of Sonic Wave Travel Times. In Machine Learning for Subsurface Characterization; Elsevier: Amsterdam, The Netherlands, 2019; pp. 243–287. [CrossRef]
45. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32. [CrossRef]
46. Pal, M.; Mather, P.M. Support Vector Machines for Classification in Remote Sensing. Int. J. Remote Sens. 2005, 26, 1007–1011. [CrossRef]
47. Duke, O.P.; Alabi, T.; Neeti, N.; Adewopo, J. Comparison of UAV and SAR Performance for Crop Type Classification Using Machine Learning Algorithms: A Case Study of Humid Forest Ecology Experimental Research Site of West Africa. Int. J. Remote Sens. 2022, 43, 4259–4286. [CrossRef]
48. Haynes, W. Wilcoxon Rank Sum Test. In Encyclopedia of Systems Biology; Springer: New York, NY, USA, 2013; pp. 2354–2355. [CrossRef]
49. Kruskal–Wallis Test. In The Concise Encyclopedia of Statistics; Springer: New York, NY, USA, 2008; pp. 288–290. [CrossRef]
50. Kursa, M.B.; Rudnicki, W.R. Package ‘Boruta’—Wrapper Algorithm for All Relevant Feature Selection; 2022. Available online: https://cran.r-project.org/web/packages/Boruta/Boruta.pdf (accessed on 12 February 2022).
51. Kursa, M.B.; Rudnicki, W.R. Feature Selection with the Boruta Package. J. Stat. Softw. 2010, 36, 1–13. [CrossRef]
52. Sanchez-Pinto, L.N.; Venable, L.R.; Fahrenbach, J.; Churpek, M.M. Comparison of Variable Selection Methods for Clinical Predictive Modeling. Int. J. Med. Inform.
2018, 116, 10–17. [CrossRef]
53. Speiser, J.L.; Miller, M.E.; Tooze, J.; Ip, E. A Comparison of Random Forest Variable Selection Methods for Classification Prediction Modeling. Expert Syst. Appl. 2019, 134, 93–101. [CrossRef]
54. Kedia, A.C.; Kapos, B.; Liao, S.; Draper, J.; Eddinger, J.; Updike, C.; Frazier, A.E. An Integrated Spectral–Structural Workflow for Invasive Vegetation Mapping in an Arid Region Using Drones. Drones 2021, 5, 19. [CrossRef]
55. Neupane, B.; Horanont, T.; Hung, N.D. Deep Learning Based Banana Plant Detection and Counting Using High-Resolution Red-Green-Blue (RGB) Images Collected from Unmanned Aerial Vehicle (UAV). PLoS ONE 2019, 14, e0223906. [CrossRef]
56. Wu, J.; Yang, G.; Yang, H.; Zhu, Y.; Li, Z.; Lei, L.; Zhao, C. Extracting Apple Tree Crown Information from Remote Imagery Using Deep Learning. Comput. Electron. Agric. 2020, 174, 105504. [CrossRef]
57. Alabi, T.R.; Abebe, A.T.; Chigeza, G.; Fowobaje, K.R. Estimation of Soybean Grain Yield from Multispectral High-Resolution UAV Data with Machine Learning Models in West Africa. Remote Sens. Appl. 2022, 27, 100782. [CrossRef]
58. Gitelson, A.A. Wide Dynamic Range Vegetation Index for Remote Quantification of Biophysical Characteristics of Vegetation. J. Plant Physiol. 2004, 161, 165–173. [CrossRef]
59. Boonprong, S.; Cao, C.; Chen, W.; Bao, S. Random Forest Variable Importance Spectral Indices Scheme for Burnt Forest Recovery Monitoring—Multilevel RF-VIMP. Remote Sens. 2018, 10, 807. [CrossRef]
60. Nioti, F.; Xystrakis, F.; Koutsias, N.; Dimopoulos, P. A Remote Sensing and GIS Approach to Study the Long-Term Vegetation Recovery of a Fire-Affected Pine Forest in Southern Greece. Remote Sens. 2015, 7, 7712–7731. [CrossRef]
61. Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between Leaf Chlorophyll Content and Spectral Reflectance and Algorithms for Non-Destructive Chlorophyll Assessment in Higher Plant Leaves. J. Plant Physiol. 2003, 160, 271–282. [CrossRef] [PubMed]
62.
Perry, C.R.; Lautenschlager, L.F. Functional Equivalence of Spectral Vegetation Indices. Available online: https://agris.fao.org/agris-search/search.do?recordID=US19850043085 (accessed on 16 August 2021).
63. Richardson, A.J.; Weigand, C. Distinguishing Vegetation from Soil Background Information. Photogramm. Eng. Remote Sens. 1977, 43, 1541–1552.
64. Huete, A.; Justice, C. MODIS Vegetation Index (MOD13) Algorithm Theoretical Basis Document. Available online: https://modis.gsfc.nasa.gov/data/atbd/atbd_mod13.pdf (accessed on 12 February 2022).
65. Pinty, B.; Verstraete, M.M. GEMI: A Non-Linear Index to Monitor Global Vegetation from Satellites. Vegetatio 1992, 101, 15–20. [CrossRef]
66. Gitelson, A.A.; Viña, A.; Ciganda, V.; Rundquist, D.C.; Arkebauer, T.J. Remote Estimation of Canopy Chlorophyll Content in Crops. Geophys. Res. Lett. 2005, 32, L08403. [CrossRef]
67. Gitelson, A.A.; Merzlyak, M.N. Remote Sensing of Chlorophyll Concentration in Higher Plant Leaves. Adv. Space Res. 1998, 22, 689–692. [CrossRef]
68. Daughtry, C.S.T.; Walthall, C.L.; Kim, M.S.; De Colstoun, E.B.; McMurtrey, J.E. Estimating Corn Leaf Chlorophyll Concentration from Leaf and Canopy Reflectance. Remote Sens. Environ. 2000, 74, 229–239. [CrossRef]
69. Xu, H. Modification of Normalised Difference Water Index (NDWI) to Enhance Open Water Features in Remotely Sensed Imagery. Int. J. Remote Sens. 2007, 27, 3025–3033. [CrossRef]
70. Qi, J.; Chehbouni, A.; Huete, A.R.; Kerr, Y.H.; Sorooshian, S. A Modified Soil Adjusted Vegetation Index. Remote Sens. Environ. 1994, 48, 119–126. [CrossRef]
71. García, M.J.L.; Caselles, V. Mapping Burns and Natural Reforestation Using Thematic Mapper Data. Geocarto Int. 2008, 6, 31–37. [CrossRef]
72. Barnes, E.M.; Clarke, T.R.; Richards, S.E.; Colaizzi, P.D.; Haberland, J.; Kostrzewski, M.; Waller, P.; Choi, C.; Riley, E.; Thompson, T.; et al.
Coincident Detection of Crop Water Stress, Nitrogen Status and Canopy Density Using Ground Based Multispectral Data. In Proceedings of the Fifth International Conference on Precision Agriculture, Bloomington, MN, USA, 16–19 July 2000.
73. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring Vegetation Systems in the Great Plains with ERTS. In Proceedings of the Third ERTS-1 Symposium, NASA Goddard Space Flight Center, 1974; Volume 1, Section A. Available online: https://ntrs.nasa.gov/api/citations/19740022614/downloads/19740022614.pdf (accessed on 16 August 2021).
74. McFeeters, S.K. The Use of the Normalized Difference Water Index (NDWI) in the Delineation of Open Water Features. Int. J. Remote Sens. 2007, 17, 1425–1432. [CrossRef]
75. Gao, B.C. NDWI—A Normalized Difference Water Index for Remote Sensing of Vegetation Liquid Water from Space. Remote Sens. Environ. 1996, 58, 257–266. [CrossRef]
76. Baret, F.; Guyot, G. Potentials and Limits of Vegetation Indices for LAI and APAR Assessment. Remote Sens. Environ. 1991, 35, 161–173. [CrossRef]
77. Marsett, R.C.; Qi, J.; Heilman, P.; Biedenbender, S.H.; Watson, M.C.; Amer, S.; Weltz, M.; Goodrich, D.; Marsett, R. Remote Sensing for Grassland Management in the Arid Southwest. Rangel. Ecol. Manag. 2006, 59, 530–540. [CrossRef]
78. Huete, A.R. A Soil-Adjusted Vegetation Index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [CrossRef]
79. Lymburner, L.; Beggs, P.J.; Jacobson, C.R. Estimation of Canopy-Average Surface-Specific Leaf Area Using Landsat TM Data. Photogramm. Eng. Remote Sens. 2000, 66, 183–191.
80. Birth, G.S.; McVey, G.R. Measuring the Color of Growing Turf with a Reflectance Spectrophotometer. Agron. J. 1968, 60, 640–643. [CrossRef]
81. Thiam, A.K. Geographic Information Systems and Remote Sensing Methods for Assessing and Monitoring Land Degradation in the Sahel: The Case of Southern Mauritania. Ph.D. Thesis, Clark University, Worcester, MA, USA, 1997.
82.
Deering, D.W.; Rouse, J.W.; Haas, R.H.; Schell, J.A. Measuring “Forage Production” of Grazing Units from Landsat MSS Data. In Proceedings of the 10th International Symposium on Remote Sensing of Environment, Ann Arbor, MI, USA, 6 October 1975; pp. 1169–1178.