
Review Articles

Vol. 12 No. sp4 (2025): Recent Advances in Agriculture by Young Minds - III

Enhancing agronomical attribute detection through RPAS-based image analysis - A review

DOI
https://doi.org/10.14719/pst.10261
Submitted
27 June 2025
Published
31 October 2025

Abstract

The integration of Remotely Piloted Aircraft Systems (RPAS) into agronomic research has revolutionized the collection and analysis of spatial and temporal crop data. RPAS equipped with high-resolution multispectral, thermal and visible cameras provide a cost-effective and flexible alternative to traditional laboratory and manual methods. This review synthesizes recent advances in RPAS-based image analysis for agronomic applications, focusing on crop monitoring, weed detection, biomass estimation and yield prediction. A critical evaluation of the published literature reveals that most studies use low-altitude flights with commercial drones carrying sensors that capture data at high spatial resolution. Image-processing techniques such as vegetation indices, machine learning algorithms and object-based image analysis are commonly employed to extract biophysical and biochemical parameters. The reviewed studies demonstrate a strong correlation between RPAS-derived metrics and ground-based measurements, validating their utility in precision agriculture. However, variability in sensor calibration, flight parameters and environmental conditions challenges reproducibility and scalability. Overall, RPAS-based image analysis offers a promising avenue for enhancing data-driven decision-making in agriculture, contributing to more sustainable and efficient farming practices.
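To illustrate the vegetation-index techniques the abstract refers to, the sketch below computes the widely used Normalized Difference Vegetation Index (NDVI) from co-registered red and near-infrared reflectance bands of a multispectral RPAS image. This is a generic, hypothetical example (the arrays and function name are illustrative, not drawn from any study cited here); real workflows would first apply radiometric calibration and orthomosaicking.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), with zero where the denominator vanishes.

    `nir` and `red` are co-registered reflectance bands (same shape,
    values typically in [0, 1] after radiometric calibration).
    """
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    out = np.zeros_like(denom)
    mask = denom > 1e-6  # avoid division by zero on no-data pixels
    out[mask] = (nir[mask] - red[mask]) / denom[mask]
    return out

# Toy 2x2 "image": healthy vegetation reflects strongly in NIR,
# so the first pixels score high and the bare-soil-like pixel scores low.
nir_band = np.array([[0.6, 0.5], [0.4, 0.1]])
red_band = np.array([[0.1, 0.1], [0.2, 0.1]])
print(ndvi(nir_band, red_band))
```

NDVI values close to 1 indicate dense green canopy, values near 0 indicate soil or senescent material; such per-pixel maps are what studies in this review typically correlate with ground-based measurements of biomass, nitrogen status or leaf area index.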

References

  1. Agarwal I. Dynamics of labour use in cotton farming in India: An economic appraisal. Asian J Agric Ext Econ Sociol. 2023;41(9):887-95. https://doi.org/10.9734/ajaees/2023/v41i92116
  2. Food and Agriculture Organization of the United Nations (FAO). Maize crop profile; 2020.
  3. Letsoin SMA, Guth D, Herak D, Purwestri RC. Analysing Zea mays plant height using unmanned aerial vehicle (UAV) RGB based on digital surface models (DSM). In: IOP Conference Series: Earth and Environmental Science. Bristol, UK: IOP Publishing; 2023. 1187(1):012028. https://doi.org/10.1088/1755-1315/1187/1/012028
  4. Sweet DD, Tirado SB, Springer NM, Hirsch CN, Hirsch CD. Opportunities and challenges in phenotyping row crops using drone-based RGB imaging. Plant Phenom J. 2022;5(1):e20044. https://doi.org/10.1002/ppj2.20044
  5. de Andrade Junior AS, da Silva SP, Setúbal IS, de Souza HA, de Paula PF, Vieira PF, et al. Remote detection of water and nutritional status of Glycine max using UAV-based images. Eng Agr. 2022;4430:9-23.
  6. Yang S, Li J, Li J, Zhang X, Ma C, Liu Z, et al. Estimating the canopy nitrogen content in Zea mays by using the transform-based dynamic spectral indices and random forest. Sustainability. 2024;16(18):8011. https://doi.org/10.3390/su16188011
  7. Li F, Miao Y, Feng G, Yuan F, Yue S, Gao X, et al. Improving estimation of summer Zea mays nitrogen status with red edge-based spectral vegetation indices. Field Crops Res. 2014;157:111-23. https://doi.org/10.1016/j.fcr.2013.12.018
  8. Tunca E, Köksal E, Akay H, Öztürk E, Taner S. Novel machine learning framework for high-resolution Sorghum bicolor biomass estimation using multi-temporal UAV imagery. Int J Environ Sci Technol. 2025:1-16. https://doi.org/10.1007/s13762-025-06498-y
  9. Dai J, König M, Jamalinia E, Hondula KL, Vaughn NR, Heckler J, et al. Canopy-level spectral variation and classification of diverse crop species with fine spatial resolution imaging spectroscopy. Remote Sens. 2024;16(8):1447. https://doi.org/10.3390/rs16081447
  10. Mehedi IM, Bilal M, Hanif MS, Palaniswamy T, Vellingiri MT. Leveraging hyperspectral remote sensing imaging for agricultural crop classification using coot bird optimization with entropy-based feature fusion model. IEEE Access. 2024;12:130214-27. https://doi.org/10.1109/ACCESS.2024.3459793
  11. Czech M, Le Moan S, Hernández-Andrés J, Müller B. Estimation of daylight spectral power distribution from uncalibrated hyperspectral radiance images. Opt Express. 2024;32(6):10392-407. https://doi.org/10.1364/OE.514991
  12. Tan C, Chen Z, Liao A, Zeng X, Cao J. Accuracy analysis of UAV aerial photogrammetry based on RTK mode, flight altitude and number of GCPs. Meas Sci Technol. 2024;35(10):106310. https://doi.org/10.1088/1361-6501/ad5dd7
  13. Michelon TB, Vieira ESN, Panobianco M. Spectral imaging and chemometrics applied at phenotyping in seed science studies: a systematic review. Seed Sci Res. 2023;33(1):9-22. https://doi.org/10.1017/S0960258523000028
  14. Lu B, Dao PD, Liu J, He Y, Shang J. Recent advances of hyperspectral imaging technology and applications in agriculture. Remote Sens. 2020;12(16):2659. https://doi.org/10.3390/rs12162659
  15. Rozenstein O, Cohen Y, Alchanatis V, Behrendt K, Bonfil DJ, Eshel G, et al. Data-driven agriculture and sustainable farming: friends or foes? Precis Agric. 2024;25(1):520-31. https://doi.org/10.1007/s11119-023-10061-5
  16. Sishodia RP, Ray RL, Singh SK. Applications of remote sensing in precision agriculture: a review. Remote Sens. 2020;12(19):3136. https://doi.org/10.3390/rs12193136
  17. Omia E, Bae H, Park E, Kim MS, Baek I, Kabenge I, et al. Remote sensing in field crop monitoring: a comprehensive review of sensor systems, data analyses and recent advances. Remote Sens. 2023;15(2):354. https://doi.org/10.3390/rs15020354
  18. Maimaitijiang M, Sagan V, Sidike P, Daloye AM, Erkbol H, Fritschi FB. Crop monitoring using satellite/UAV data fusion and machine learning. Remote Sens. 2020;12(9):1357. https://doi.org/10.3390/rs12091357
  19. Nielsen KM, Duddu HS, Bett KE, Shirtliffe SJ. UAV image-based crop growth analysis of 3D-reconstructed crop canopies. Plants. 2022;11(20):2691. https://doi.org/10.3390/plants11202691
  20. Ma S, Zhou Y, Gowda PH, Dong J, Zhang G, Kakani VG, et al. Application of the water-related spectral reflectance indices: a review. Ecol Indic. 2019;98:68-79. https://doi.org/10.1016/j.ecolind.2018.10.049
  21. Jiang J, Zhang Z, Cao Q, Liang Y, Krienke B, Tian Y, et al. Use of an active canopy sensor mounted on an unmanned aerial vehicle to monitor the growth and nitrogen status of Triticum aestivum. Remote Sens. 2020;12(22):3684. https://doi.org/10.3390/rs12223684
  22. Maimaitijiang M, Ghulam A, Sidike P, Hartling S, Maimaitiyiming M, Peterson K, et al. Unmanned aerial system (UAS)-based phenotyping of Glycine max using multi-sensor data fusion and extreme learning machine. ISPRS J Photogramm Remote Sens. 2017;134:43-58. https://doi.org/10.1016/j.isprsjprs.2017.10.011
  23. Maimaitijiang M, Sagan V, Erkbol H, Adrian J, Newcomb M, LeBauer D, et al. UAV-based Sorghum bicolor growth monitoring: a comparative analysis of lidar and photogrammetry. ISPRS Ann Photogramm Remote Sens Spat Inf Sci. 2020;3:489-96. https://doi.org/10.5194/isprs-annals-V-3-2020-489-2020
  24. Gilliot J-M, Michelin J, Hadjard D, Houot S. An accurate method for predicting spatial variability of maize yield from UAV-based plant height estimation: a tool for monitoring agronomic field experiments. Precis Agric. 2021;22(3):897-921. https://doi.org/10.1007/s11119-020-09764-w
  25. Falco N, Wainwright HM, Dafflon B, Ulrich C, Soom F, Peterson JE, et al. Influence of soil heterogeneity on Glycine max plant development and crop yield evaluated using time-series of UAV and ground-based geophysical imagery. Sci Rep. 2021;11(1):7046. https://doi.org/10.1038/s41598-021-86480-z
  26. Luo S, Liu W, Zhang Y, Wang C, Xi X, Nie S, et al. Zea mays and Glycine max heights estimation from unmanned aerial vehicle (UAV) LiDAR data. Comput Electron Agric. 2021;182:106005. https://doi.org/10.1016/j.compag.2021.106005
  27. Sapkota S, Paudyal DR. Growth monitoring and yield estimation of Zea mays using unmanned aerial vehicle (UAV) in a hilly region. Sensors. 2023;23(12):5432. https://doi.org/10.3390/s23125432
  28. Kümmerer R, Noack PO, Bauer B. Using high-resolution UAV imaging to measure canopy height of diverse cover crops and predict biomass. Remote Sens. 2023;15(6):1520. https://doi.org/10.3390/rs15061520
  29. Sun X, Yang Z, Su P, Wei K, Wang Z, Yang C, et al. Non-destructive monitoring of Zea mays LAI by fusing UAV spectral and textural features. Front Plant Sci. 2023;14:1158837. https://doi.org/10.3389/fpls.2023.1158837
  30. Kobe M, Elias M, Merbach I, Schädler M, Bumberger J, Pause M, et al. Automated workflow for high-resolution 4D vegetation monitoring using stereo vision. Remote Sens. 2024;16(3):541. https://doi.org/10.3390/rs16030541
  31. Rana S, Gerbino S, Akbari Sekehravani E, Russo MB, Carillo P. Crop growth analysis using automatic annotations and transfer learning in multi-date aerial images and ortho-mosaics. Agronomy. 2024;14(9):2052. https://doi.org/10.3390/agronomy14092052
  32. Kaur S, Kakani VG, Carver B, Jarquin D, Singh A. Hyperspectral imaging combined with machine learning for high-throughput phenotyping in winter Triticum aestivum. Plant Phenom J. 2024;7(1):e20111. https://doi.org/10.1002/ppj2.20111
  33. Cheng Q, Ding F, Xu H, Guo S, Li Z, Chen Z. Quantifying Zea mays LAI using machine learning and UAV multispectral imaging. Precis Agric. 2024;25(4):1777-99. https://doi.org/10.1007/s11119-024-10134-z
  34. Zhang L, Zhang B, Zhang H, Yang W, Hu X, Cai J, et al. Multi-source feature fusion network for LAI estimation from UAV multispectral imagery. Agronomy. 2025;15(4):988. https://doi.org/10.3390/agronomy15040988
  35. Hilty J, Muller B, Pantin F, Leuzinger S. Plant growth: the what, the how and the why. New Phytol. 2021;232(1):25-41. https://doi.org/10.1111/nph.17610
  36. He W, Ye Z, Li M, Yan Y, Lu W, Xing G. Extraction of Glycine max plant trait parameters based on SfM-MVS algorithm combined with GRNN. Front Plant Sci. 2023;14:1181322. https://doi.org/10.3389/fpls.2023.1181322
  37. Lu C, Gehring K, Kopfinger S, Bernhardt H, Beck M, Walther S, et al. Weed instance segmentation from UAV orthomosaic images based on deep learning. Smart Agric Technol. 2025:100966. https://doi.org/10.1016/j.atech.2025.100966
  38. Vikram AK, Agrawal A, Ranjan A, Shinde S, Jalihale S, Singh A, et al. A comprehensive ortho-mosaic-based pipeline for enhanced automated Triticum aestivum ear-head detection and crop yield estimation. IEEE Trans AgriFood Electron; 2025. https://doi.org/10.1109/TAFE.2025.3563211
  39. Watanabe K, Guo W, Arai K, Takanashi H, Kajiya-Kanegae H, Kobayashi M, et al. High-throughput phenotyping of Sorghum bicolor plant height using an unmanned aerial vehicle and its application to genomic prediction modeling. Front Plant Sci. 2017;8:421. https://doi.org/10.3389/fpls.2017.00421
  40. Shu M, Li Q, Ghafoor A, Zhu J, Li B, Ma Y. Using the plant height and canopy coverage to estimate Zea mays aboveground biomass with UAV digital images. Eur J Agron. 2023;151:126957. https://doi.org/10.1016/j.eja.2023.126957
  41. Chehreh B, Moutinho A, Viegas C. Latest trends on tree classification and segmentation using UAV data - a review of agroforestry applications. Remote Sens. 2023;15(9):2263. https://doi.org/10.3390/rs15092263
  42. Goldsmith A, Austin R, Cahoon CW, Leon RG. Predicting Zea mays yield loss with crop-weed leaf cover ratios determined with UAS imagery. Weed Sci. 2025;73:e22. https://doi.org/10.1017/wsc.2025.3
  43. Li W, Niu Z, Chen H, Li D. Characterizing canopy structural complexity for the estimation of Zea mays LAI based on ALS data and UAV stereo images. Int J Remote Sens. 2017;38(8-10):2106-16. https://doi.org/10.1080/01431161.2016.1235300
  44. Herrero-Huerta M, Bucksch A, Puttonen E, Rainey KM. Canopy roughness: a new phenotypic trait to estimate aboveground biomass from unmanned aerial system. Plant Phenomics. 2020;2020:10. https://doi.org/10.34133/2020/6735967
  45. Crusiol LGT, Sun L, Sun Z, Chen R, Wu Y, Ma J, et al. In-season monitoring of Zea mays leaf water content using ground-based and UAV-based hyperspectral data. Sustainability. 2022;14(15):9039. https://doi.org/10.3390/su14159039
  46. Falcioni R, Antunes WC, Demattê JAM, Nanni MR. Reflectance spectroscopy for the classification and prediction of pigments in agronomic crops. Plants. 2023;12(12):2347. https://doi.org/10.3390/plants12122347
  47. Crusiol LGT, Nanni MR, Furlanetto RH, Sibaldelli RNR, Cezar E, Mertz-Henning LM, et al. UAV-based thermal imaging in the assessment of water status of Glycine max plants. Int J Remote Sens. 2020;41(9):3243-65. https://doi.org/10.1080/01431161.2019.1673914
  48. Cummings C, Miao Y, Paiao GD, Kang S, Fernández FG. Zea mays nitrogen status diagnosis with an innovative multi-parameter crop circle phenom sensing system. Remote Sens. 2021;13(3):401. https://doi.org/10.3390/rs13030401
  49. Shao G, Han W, Zhang H, Liu S, Wang Y, Zhang L, et al. Mapping Zea mays crop coefficient Kc using random forest algorithm based on leaf area index and UAV-based multispectral vegetation indices. Agr Water Manag. 2021;252:106906. https://doi.org/10.1016/j.agwat.2021.106906
  50. Setiyono T. Precise positioning in nitrogen fertility sensing in Zea mays. Sensors. 2024;24(16):5322. https://doi.org/10.3390/s24165322
  51. Zhao H, Wang J, Guo J, Hui X, Wang Y, Cai D, et al. Detecting water stress in winter Triticum aestivum based on multifeature fusion from UAV remote sensing and stacking ensemble learning method. Remote Sens. 2024;16(21):4100. https://doi.org/10.3390/rs16214100
  52. Lopes AdS, Andrade ASd, Bastos EA, Sousa CAd, Casari RAdC, Moura MSd. Assessment of Zea mays hybrid water status using aerial images from an unmanned aerial vehicle. Rev Caatinga. 2024;37:e11701. https://doi.org/10.1590/1983-21252024v3711701rc
  53. Shi H, Liu Z, Li S, Jin M, Tang Z, Sun T, et al. Monitoring Glycine max soil moisture content based on UAV multispectral and thermal-infrared remote-sensing information fusion. Plants. 2024;13(17):2417. https://doi.org/10.3390/plants13172417
  54. Niu H, Landivar J, Duffield N. Classification of Gossypium hirsutum water stress using convolutional neural networks and UAV-based RGB imagery. Adv Mod Agric. 2024;5(1):2457. https://doi.org/10.54517/ama.v5i1.2457
  55. Chen Z, Chen H, Dai Q, Wang Y, Hu X. Estimation of soil moisture during different growth stages of summer Zea mays under various water conditions using UAV multispectral data and machine learning. Agronomy. 2024;14(9):2008. https://doi.org/10.3390/agronomy14092008
  56. Augusti ML, Melo VF, Uchoa SCP, Francelino MR, Adandonon AV, Sounou AHG. Monitoring the spectral and agronomic behaviour of Zea mays in response to nitrogen fertilisation. Rev Ciênc Agron. 2024;56:e202391758. https://doi.org/10.5935/1806-6690.20250026
  57. Gao Y, Zhao T, Zheng Z, Liu D. Gossypium hirsutum leaf water potential prediction based on UAV visible light images and multi-source data. Irrig Sci. 2025;43(1):121-34. https://doi.org/10.1007/s00271-024-00962-2
  58. Wang Y, Wang J, Li J, Wang J, Xu H, Liu T, et al. Estimating Zea mays leaf water content using machine learning with diverse multispectral image features. Plants. 2025;14(6):973. https://doi.org/10.3390/plants14060973
  59. Jay S, Baret F, Dutartre D, Malatesta G, Héno S, Comar A, et al. Exploiting the centimeter resolution of UAV multispectral imagery to improve remote-sensing estimates of canopy structure and biochemistry in Beta vulgaris. Remote Sens Environ. 2019;231:110898. https://doi.org/10.1016/j.rse.2018.09.011
  60. Falcioni R, Antunes WC, Demattê JA, Nanni MR. Biophysical, biochemical and photochemical analyses using reflectance hyperspectroscopy and chlorophyll a fluorescence kinetics in variegated leaves. Biology. 2023;12(5):704. https://doi.org/10.3390/biology12050704
  61. Ropelewska E. Application of imaging and artificial intelligence for quality monitoring of stored black currant (Ribes nigrum L.). Foods. 2022;11(22):3589. https://doi.org/10.3390/foods11223589
  62. Wang D, Cao W, Zhang F, Li Z, Xu S, Wu X. A review of deep learning in multiscale agricultural sensing. Remote Sens. 2022;14(3):559. https://doi.org/10.3390/rs14030559
  63. Guardado Yordi E, Koelig R, Matos MJ, Pérez Martínez A, Caballero Y, Santana L, et al. Artificial intelligence applied to flavonoid data in food matrices. Foods. 2019;8(11):573. https://doi.org/10.3390/foods8110573
  64. Kior A, Sukhov V, Sukhova E. Application of reflectance indices for remote sensing of plants and revealing actions of stressors. Photonics. 2021;8(12):582. https://doi.org/10.3390/photonics8120582
  65. Elazab A, Ordóñez RA, Savin R, Slafer GA, Araus JL. Detecting interactive effects of N fertilization and heat stress on Zea mays productivity by remote sensing techniques. Eur J Agron. 2016;73:11-24. https://doi.org/10.1016/j.eja.2015.11.010
  66. Wang H, Singh KD, Poudel HP, Natarajan M, Ravichandran P, Eisenreich B. Forage height and above-ground biomass estimation by comparing UAV-based multispectral and RGB imagery. Sensors. 2024;24(17):5794. https://doi.org/10.3390/s24175794
  67. Cilia C, Panigada C, Rossini M, Meroni M, Busetto L, Amaducci S, et al. Nitrogen status assessment for variable rate fertilization in Zea mays through hyperspectral imagery. Remote Sens. 2014;6(7):6549-65. https://doi.org/10.3390/rs6076549
  68. Abulaiti Y, Sawut M, Maimaitiaili B, Chunyue M. A possible fractional order derivative and optimized spectral indices for assessing total nitrogen content in Gossypium hirsutum. Comput Electron Agric. 2020;171:105275. https://doi.org/10.1016/j.compag.2020.105275
  69. Elmetwalli AH, Tyler AN. Estimation of Zea mays properties and differentiating moisture and nitrogen deficiency stress via ground-based remotely sensed data. Agr Water Manag. 2020;242:106413. https://doi.org/10.1016/j.agwat.2020.106413
  70. Inoue Y, Guérif M, Baret F, Skidmore A, Gitelson A, Schlerf M, et al. Simple and robust methods for remote sensing of canopy chlorophyll content: a comparative analysis of hyperspectral data for different types of vegetation. Plant Cell Environ. 2016;39:2609-23. https://doi.org/10.1111/pce.12815
  71. Guan K, Wu J, Kimball JS, Anderson MC, Frolking S, Li B, et al. The shared and unique values of optical, fluorescence, thermal and microwave satellite data for estimating large-scale crop yields. Remote Sens Environ. 2017;199:333-49. https://doi.org/10.1016/j.rse.2017.06.043
  72. Martínez-Fernández J, González-Zamora A, Sánchez N, Gumuzzio A, Herrero-Jiménez C. Satellite soil moisture for agricultural drought monitoring: assessment of the SMOS derived soil water deficit index. Remote Sens Environ. 2016;177:277-86. https://doi.org/10.1016/j.rse.2016.02.064
  73. Trifilò P, Abate E, Petruzzellis F, Azzarà M, Nardini A. Critical water contents at leaf, stem and root level leading to irreversible drought-induced damage in two woody and one herbaceous species. Plant Cell Environ. 2023;46(1):119-32. https://doi.org/10.1111/pce.14469
  74. Yang FF, Tao L, Wang QY, Du MZ, Yang TL, Liu DZ, et al. Rapid determination of leaf water content for monitoring waterlogging in winter Triticum aestivum based on hyperspectral parameters. J Integr Agric. 2021;20(10):2613-26. https://doi.org/10.1016/S2095-3119(20)63306-8
  75. Zhang Y, Wu J, Wang A. Comparison of various approaches for estimating leaf water content and stomatal conductance in different plant species using hyperspectral data. Ecol Indic. 2022;142:109278. https://doi.org/10.1016/j.ecolind.2022.109278
  76. Wang J, Chen C, Huang S, Wang H, Zhao Y, Wang J, et al. Monitoring of agricultural progress in rice-wheat rotation area based on UAV RGB images. Front Plant Sci. 2025;15:1502863. https://doi.org/10.3389/fpls.2024.1502863
  77. Singh R, Krishnan P, Singh VK, Sah S, Das B. Combining biophysical parameters with thermal and RGB indices using machine learning models for predicting yield in yellow rust affected Triticum aestivum. Sci Rep. 2023;13(1):18814. https://doi.org/10.1038/s41598-023-45682-3
  78. Han X, Wei Z, Chen H, Zhang B, Li Y, Du T. Inversion of winter Triticum aestivum growth parameters and yield under different water treatments based on UAV multispectral remote sensing. Front Plant Sci. 2021;12:609876. https://doi.org/10.3389/fpls.2021.609876
  79. Kumar A, Desai SV, Balasubramanian VN, Rajalakshmi P, Guo W, Naik BB, et al. Efficient Zea mays tassel-detection method using UAV-based remote sensing. Remote Sens Appl Soc Environ. 2021;23:100549. https://doi.org/10.1016/j.rsase.2021.100549
  80. Sharma P, Leigh L, Chang J, Maimaitijiang M, Caffé M. Above-ground biomass estimation in Avena sativa using UAV remote sensing and machine learning. Sensors. 2022;22(2):601. https://doi.org/10.3390/s22020601
  81. Dhakal R, Maimaitijiang M, Chang J, Caffé M. Utilizing spectral, structural and textural features for estimating Avena sativa above-ground biomass using UAV-based multispectral data and machine learning. Sensors. 2023;23(24):9708. https://doi.org/10.3390/s23249708
  82. Zhai W, Li C, Cheng Q, Mao B, Li Z, Li Y, et al. Enhancing Triticum aestivum above-ground biomass estimation using UAV RGB images and machine learning: multi-feature combinations, flight height and algorithm implications. Remote Sens. 2023;15(14):3653. https://doi.org/10.3390/rs15143653
  83. Corti M, Cavalli D, Cabassi G, Bechini L, Pricca N, Paolo D, et al. Improved estimation of herbaceous crop aboveground biomass using UAV-derived crop height combined with vegetation indices. Precis Agric. 2023;24(2):587-606. https://doi.org/10.1007/s11119-022-09960-w
  84. Ranđelović P, Đorđević V, Miladinović J, Prodanović S, Ćeran M, Vollmann J. High-throughput phenotyping for non-destructive estimation of Glycine max fresh biomass using a machine learning model and temporal UAV data. Plant Methods. 2023;19(1):89. https://doi.org/10.1186/s13007-023-01054-6
  85. Bareth G, Hütt C, Jenal A, Bolten A, Kleppert I, Firl H, et al. Using UAV-derived plant height as an estimator for biomass and N-uptake. Int Arch Photogramm Remote Sens Spat Inf Sci. 2023;48:1867-72. https://doi.org/10.5194/isprs-archives-XLVIII-1-W2-2023-1867-2023
  86. Liu X, Du R, Xiang Y, Chen J, Zhang F, Shi H, et al. Estimating winter Brassica napus aboveground biomass from hyperspectral images using narrowband spectra-texture features and machine learning. Plants. 2024;13(21):2978. https://doi.org/10.3390/plants13212978
  87. Guo Y, Hao F, Zhang X, He Y, Fu YH. Improving Zea mays yield estimation by assimilating UAV-based LAI into WOFOST model. Field Crops Res. 2024;315:109477. https://doi.org/10.1016/j.fcr.2024.109477
  88. Urquizo J, Ccopi D, Ortega K, Castañeda I, Patricio S, Passuni J, et al. Estimation of forage biomass in Avena sativa using agronomic variables through UAV multispectral imaging. Remote Sens. 2024;16(19):3720. https://doi.org/10.3390/rs16193720
  89. Hu Z, Fan S, Li Y, Tang Q, Bao L, Zhang S, et al. Estimating stratified biomass in Gossypium hirsutum fields using UAV multispectral remote sensing and machine learning. Drones. 2025;9(3):186. https://doi.org/10.3390/drones9030186
  90. Liao M, Wang Y, Chu N, Li S, Zhang Y, Lin D. Mature Oryza sativa biomass estimation using UAV-derived RGB vegetation indices and growth parameters. Sensors. 2025;25(9):2798. https://doi.org/10.3390/s25092798
  91. Gašparović M, Zrinjski M, Barković Đ, Radočaj D. An automatic method for weed mapping in Avena sativa fields based on UAV imagery. Comput Electron Agric. 2020;173:105385. https://doi.org/10.1016/j.compag.2020.105385
  92. Torres-Sánchez J, Mesas-Carrascosa FJ, Jiménez-Brenes FM, de Castro AI, López-Granados F. Early detection of broad-leaved and grass weeds in wide row crops using artificial neural networks and UAV imagery. Agronomy. 2021;11(4):749. https://doi.org/10.3390/agronomy11040749
  93. Genze N, Ajekwe R, Güreli Z, Haselbeck F, Grieb M, Grimm DG. Deep learning-based early weed segmentation using motion blurred UAV images of Sorghum fields. Comput Electron Agric. 2022;202:107388. https://doi.org/10.1016/j.compag.2022.107388
  94. Gallo I, Rehman AU, Dehkordi RH, Landro N, La Grassa R, Boschetti M. Deep object detection of crop weeds: performance of YOLOv7 on a real case dataset from UAV images. Remote Sens. 2023;15(2):539. https://doi.org/10.3390/rs15020539
  95. Lin J, Zhang X, Qin Y, Yang S, Wen X, Cernava T, et al. FG-UNet: fine-grained feature-guided UNet for segmentation of weeds and crops in UAV images. Pest Manag Sci. 2025;81(2):856-66. https://doi.org/10.1002/ps.8489
  96. Mesías-Ruiz GA, Borra-Serrano I, Peña J, de Castro AI, Fernández-Quintanilla C, Dorado J. Weed species classification with UAV imagery and standard CNN models: assessing the frontiers of training and inference phases. Crop Prot. 2024;182:106721. https://doi.org/10.1016/j.cropro.2024.106721
  97. Mesías-Ruiz GA, Peña JM, de Castro AI, Borra-Serrano I, Dorado J. Cognitive computing advancements: improving precision crop protection through UAV imagery for targeted weed monitoring. Remote Sens. 2024;16(16):3026. https://doi.org/10.3390/rs16163026
  98. Singh V, Singh D, Kumar H. Efficient application of deep neural networks for identifying small and multiple weed patches using drone images. IEEE Access. 2024;12:71982-96. https://doi.org/10.1109/ACCESS.2024.3402213
  99. López-Granados F, Torres-Sánchez J, Serrano-Pérez A, de Castro AI, Mesas-Carrascosa FJ, Peña JM. Early season weed mapping in Helianthus annuus using UAV technology: variability of herbicide treatment maps against weed thresholds. Precis Agric. 2016;17:183-99. https://doi.org/10.1007/s11119-015-9415-8
  100. Gao J, Nuyttens D, Lootens P, He Y, Pieters JG. Recognising weeds in a Zea mays crop using a random forest machine-learning algorithm and near-infrared snapshot mosaic hyperspectral imagery. Biosyst Eng. 2018;170:39-50. https://doi.org/10.1016/j.biosystemseng.2018.03.006
