In:
PLOS ONE, Public Library of Science (PLoS), Vol. 16, No. 10 (2021-10-27), p. e0259036
Abstract:
The color of particular parts of a flower is often used as one of the features to differentiate between flower types, and color is therefore also used in flower-image classification. Color labels, such as ‘green’, ‘red’, and ‘yellow’, are used by taxonomists and lay people alike to describe the color of plants. Flower-image datasets usually consist only of images and do not contain flower descriptions. In this research, we have built a flower-image dataset, focused on orchid species, which consists of human-friendly textual descriptions of the features of specific flowers on the one hand and digital photographs showing what each flower looks like on the other. Using this dataset, a new automated color detection model was developed; it is the first study of its kind to use color labels and deep learning for color detection in flower recognition. As deep learning often excels at pattern recognition in digital images, we applied transfer learning with various degrees of layer unfreezing to five neural network architectures (VGG16, Inception, ResNet50, Xception, NASNet) to determine which architecture and which transfer-learning scheme performs best. In addition, various color-scheme scenarios were tested, including the use of primary and secondary colors together, and the effectiveness of handling multi-class classification with multi-class, combined binary, and ensemble classifiers was studied. The best overall performance was achieved by the ensemble classifier. The results show that the proposed method can detect the color of the flower and the labellum very well without having to perform image segmentation. The results of this study can serve as a foundation for the development of an image-based plant recognition system that is able to explain the classification it provides.
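The transfer-learning setup described in the abstract (a pretrained backbone with a chosen number of top layers unfrozen, retrained to predict color labels) can be illustrated with a short sketch. The Keras snippet below is a minimal, hypothetical example assuming VGG16, 224×224 inputs, 8 color classes, and 4 unfrozen layers; the paper varies the architecture, the amount of unfreezing, and the classifier scheme, so none of these values reflect the authors' exact configuration.

```python
# Minimal sketch of transfer learning with partial layer unfreezing for color-label
# classification. Backbone choice, layer counts, image size, class count, and
# optimizer settings are illustrative assumptions, not the authors' configuration.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

NUM_COLOR_CLASSES = 8   # assumed number of color labels
UNFROZEN_LAYERS = 4     # number of top backbone layers to fine-tune (varied in the study)

# ImageNet-pretrained backbone without its original classifier head.
backbone = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))

# Freeze the whole backbone, then unfreeze only the last UNFROZEN_LAYERS layers.
backbone.trainable = True
for layer in backbone.layers[:-UNFROZEN_LAYERS]:
    layer.trainable = False

# New classification head mapping image features to color labels.
model = models.Sequential([
    backbone,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_COLOR_CLASSES, activation="softmax"),  # multi-class color output
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```

For the combined-binary and ensemble schemes mentioned in the abstract, one such model per color label (with a sigmoid output) would be trained and their predictions merged, but the exact combination rule is not specified in the abstract.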
Type of Medium:
Online Resource
ISSN:
1932-6203
DOI:
10.1371/journal.pone.0259036
10.1371/journal.pone.0259036.g001
10.1371/journal.pone.0259036.g002
10.1371/journal.pone.0259036.g003
10.1371/journal.pone.0259036.g004
10.1371/journal.pone.0259036.g005
10.1371/journal.pone.0259036.g006
10.1371/journal.pone.0259036.g007
10.1371/journal.pone.0259036.g008
10.1371/journal.pone.0259036.g009
10.1371/journal.pone.0259036.g010
10.1371/journal.pone.0259036.g011
10.1371/journal.pone.0259036.g012
10.1371/journal.pone.0259036.g013
10.1371/journal.pone.0259036.g014
10.1371/journal.pone.0259036.g015
10.1371/journal.pone.0259036.t001
10.1371/journal.pone.0259036.t002
10.1371/journal.pone.0259036.t003
10.1371/journal.pone.0259036.t004
10.1371/journal.pone.0259036.t005
10.1371/journal.pone.0259036.t006
10.1371/journal.pone.0259036.t007
10.1371/journal.pone.0259036.t008
10.1371/journal.pone.0259036.t009
10.1371/journal.pone.0259036.t010
10.1371/journal.pone.0259036.s001
Language:
English
Publisher:
Public Library of Science (PLoS)
Publication Date:
2021
ZDB ID:
2267670-3