In:
PLOS ONE, Public Library of Science (PLoS), Vol. 18, No. 2 (2023-02-15), p. e0273057
Abstract:
The accurate and rapid detection of cotton seed quality is crucial for safeguarding cotton cultivation. To increase the accuracy and efficiency of cotton seed detection, a deep learning model called improved ResNet50 (Impro-ResNet50) was used to detect cotton seed quality. First, the convolutional block attention module (CBAM) was embedded into the ResNet50 model to allow the model to learn both vital channel information and spatial location information of the image, thereby enhancing the model’s feature extraction capability and robustness. The model’s fully connected layer was then modified to accommodate the cotton seed quality detection task. An improved LRelu-Softplus activation function was implemented to make model training fast and straightforward. Transfer learning and the Adam optimization algorithm were used to train the model, reducing the number of parameters and accelerating convergence. Finally, 4419 images of cotton seeds were collected under controlled conditions for model training. Experimental results demonstrated that the Impro-ResNet50 model achieved an average detection accuracy of 97.23% and processed a single image in 0.11 s. Its feature extraction capability was superior to that of Squeeze-and-Excitation (SE) and Coordinate Attention (CA) modules. At the same time, compared with classical models such as AlexNet, VGG16, GoogLeNet, EfficientNet, and ResNet18, this model achieved a superior balance between detection accuracy and model complexity. The results indicate that the Impro-ResNet50 model has high detection accuracy and a short recognition time, meeting the requirements for accurate and rapid detection of cotton seed quality.
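A minimal PyTorch sketch of the pipeline the abstract describes may be useful: CBAM attention embedded in a pretrained ResNet50, a replaced fully connected head, and Adam-based transfer learning. This is not the authors’ code. In particular, the exact LRelu-Softplus formula, the CBAM insertion points, the head width (256), the class count, and the learning rate are not given in the abstract and are assumptions here.

import math

import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision.models as models


class LReluSoftplus(nn.Module):
    # Assumed form: LeakyReLU slope for x < 0, Softplus shifted by ln 2 for
    # x >= 0 so that f(0) = 0 and the two pieces join continuously.
    def __init__(self, negative_slope: float = 0.01):
        super().__init__()
        self.negative_slope = negative_slope

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.where(x >= 0,
                           F.softplus(x) - math.log(2.0),
                           self.negative_slope * x)


class CBAM(nn.Module):
    # Convolutional Block Attention Module (Woo et al., 2018): channel
    # attention followed by spatial attention.
    def __init__(self, channels: int, reduction: int = 16, kernel_size: int = 7):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        self.spatial = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        # Channel attention: shared MLP over average- and max-pooled descriptors.
        attn = torch.sigmoid(self.mlp(x.mean(dim=(2, 3))) + self.mlp(x.amax(dim=(2, 3))))
        x = x * attn.view(b, c, 1, 1)
        # Spatial attention: a conv over channel-wise average and max maps.
        s = torch.cat([x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(s))


def build_impro_resnet50(num_classes: int) -> nn.Module:
    # Pretrained ResNet50 (transfer learning) with a CBAM block appended after
    # each residual stage; the insertion points are an assumption. Requires
    # torchvision >= 0.13 for the weights argument.
    net = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
    for name in ("layer1", "layer2", "layer3", "layer4"):
        stage = getattr(net, name)
        setattr(net, name, nn.Sequential(stage, CBAM(stage[-1].conv3.out_channels)))
    # Replace the fully connected head for the cotton seed quality classes.
    net.fc = nn.Sequential(
        nn.Linear(net.fc.in_features, 256),
        LReluSoftplus(),
        nn.Linear(256, num_classes),
    )
    return net


if __name__ == "__main__":
    model = build_impro_resnet50(num_classes=2)  # class count is a placeholder
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # typical transfer-learning rate
    print(model(torch.randn(1, 3, 224, 224)).shape)  # sanity check: (1, num_classes)

The sketch only builds the network and the Adam optimizer the abstract mentions; a full training loop (loss, data loader, epochs) is omitted for brevity.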
Type of Medium:
Online Resource
ISSN:
1932-6203
DOI:
10.1371/journal.pone.0273057
DOI (components):
10.1371/journal.pone.0273057.g001–.g011 (figures); .t001–.t004 (tables); .s001–.s004 (supporting information); .r001–.r006 (peer review history)
Language:
English
Publisher:
Public Library of Science (PLoS)
Publication Date:
2023
ZDB-ID:
2267670-3