Automatic Scoring of Rhizoctonia Crown and Root Rot Affected Sugar Beet Fields from Orthorectified UAV Images Using Machine Learning

Rhizoctonia crown and root rot (RCRR), caused by Rhizoctonia solani, can cause severe yield and quality losses in sugar beet. The most common strategy to control the disease is the development of resistant varieties. In the breeding process, field experiments with artificial inoculation are carried out to evaluate the performance of genotypes and varieties. The phenotyping process in breeding trials requires constant monitoring and scoring by skilled experts. This work is time demanding, and the scores are biased and heterogeneous depending on the experience and capacity of each individual rater. Optical sensors and artificial intelligence have demonstrated great potential to achieve higher accuracy than human raters and to standardize phenotyping applications. A workflow combining red-green-blue and multispectral imagery acquired from an unmanned aerial vehicle (UAV) with machine learning techniques was applied to score diseased plants and plots affected by RCRR. Georeferenced annotation of UAV-orthorectified images was carried out. With the annotated images, five convolutional neural networks were trained to score individual plants. The training was carried out with different image analysis strategies and data augmentation. A custom convolutional neural network trained from scratch and a pretrained MobileNet showed the best precision in scoring RCRR (0.73 to 0.85). The per-plot average of the spectral information was used to score the plots, and the benefit of adding the information obtained from the scores of individual plants was compared. For this purpose, machine learning models were trained together with data management strategies, and the best-performing model was chosen. A combined pipeline of random forest and k-nearest neighbors showed the best weighted precision (0.67). This research provides a reliable workflow for detecting and scoring RCRR based on aerial imagery. RCRR is often distributed heterogeneously in trial plots; therefore, considering the information from the individual plants of each plot yielded a significant improvement in UAV-based automated monitoring routines.
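The abstract describes a two-stage workflow: convolutional networks (including a pretrained MobileNet) score individual plants from UAV image patches, and a combined random forest / k-nearest-neighbors model then scores whole plots from averaged spectral information plus the aggregated plant scores. The paper does not publish code, so the sketch below is only a rough illustration of how such a pipeline could be wired up with TensorFlow/Keras and scikit-learn; the directory layout, number of severity classes, feature columns, and the soft-voting combination of the two plot-level classifiers are assumptions, not the authors' implementation.

# Hypothetical sketch of the two-stage RCRR scoring workflow described above.
# Stage 1: fine-tune a pretrained MobileNet to score individual plants.
# Stage 2: classify plots from plot-level features with a random forest /
#          k-nearest-neighbors ensemble. All paths and labels are illustrative.

import numpy as np
import tensorflow as tf
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import precision_score

NUM_CLASSES = 5          # e.g. RCRR severity scores 1-5 (assumption)
IMG_SIZE = (224, 224)    # MobileNet's default input resolution

# ----- Stage 1: plant-level scoring with a pretrained MobileNet -----
base = tf.keras.applications.MobileNet(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False   # train only the new classification head at first

plant_model = tf.keras.Sequential([
    # MobileNet expects inputs scaled to [-1, 1]
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1.0,
                              input_shape=IMG_SIZE + (3,)),
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
plant_model.compile(optimizer="adam",
                    loss="sparse_categorical_crossentropy",
                    metrics=["accuracy"])

# Plant image patches cropped from the orthorectified UAV mosaics, organized
# into one sub-directory per severity class (assumed layout, not the paper's).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "plants/train", image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "plants/val", image_size=IMG_SIZE, batch_size=32)
plant_model.fit(train_ds, validation_data=val_ds, epochs=10)

# ----- Stage 2: plot-level scoring with an RF + kNN ensemble -----
# X_plots: per-plot mean spectral bands plus aggregated plant scores
# (e.g. the share of plants in each severity class); y_plots: expert plot
# scores. Random placeholders stand in for the real, unavailable data.
rng = np.random.default_rng(0)
X_plots = rng.random((200, 10))
y_plots = rng.integers(0, NUM_CLASSES, 200)

plot_model = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=300, random_state=0)),
                ("knn", KNeighborsClassifier(n_neighbors=5))],
    voting="soft")
plot_model.fit(X_plots, y_plots)
print("weighted precision:",
      precision_score(y_plots, plot_model.predict(X_plots), average="weighted"))

In practice the plot-level features would come from the trained Stage 1 model (for instance, the fraction of plants per plot in each severity class together with the per-plot spectral means), and evaluation would use held-out plots or cross-validation rather than the training data reused here for brevity.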

Source: http://dx.doi.org/10.1094/PDIS-04-23-0779-RE (DOI Listing)

Publication Analysis

Top Keywords

machine learning (12); individual plants (12); rhizoctonia crown (8); crown root (8); root rot (8); sugar beet (8); plants plots (8); convolutional neural (8); score individual (8); scoring rcrr (8)
