GLORIA

GEOMAR Library Ocean Research Information Access

Filter
  • Publisher: MDPI AG  (3)
  • Person/Organisation: Dai, Xiufeng  (3)
  • 1
    Online Resource
    In: Sensors, MDPI AG, Vol. 22, No. 13 (2022-06-25), p. 4819-
    Abstract: Conventional mobile robots rely on LIDAR for indoor global positioning and navigation, and therefore impose strict requirements on the ground environment. Under the complicated ground conditions of a greenhouse, accumulated odometry (ODOM) error caused by wheel slip builds up during long-term operation, degrading the accuracy of robot positioning and mapping. To solve this problem, an integrated positioning system based on UWB (ultra-wideband)/IMU (inertial measurement unit)/ODOM/LIDAR is proposed. First, UWB/IMU/ODOM measurements are fused by the Extended Kalman Filter (EKF) algorithm to obtain an estimated position. Second, LIDAR is matched against an established two-dimensional (2D) map by the Adaptive Monte Carlo Localization (AMCL) algorithm to achieve global positioning of the robot. Experiments indicate that the integrated UWB/IMU/ODOM/LIDAR positioning system effectively reduces the accumulated positioning error of the robot in the greenhouse environment. At three moving speeds (0.3 m/s, 0.5 m/s, and 0.7 m/s), the maximum lateral error stays below 0.1 m, and the maximum lateral root mean square error (RMSE) is 0.04 m. For global positioning, the RMSEs in the x-axis direction, the y-axis direction, and overall are 0.092, 0.069, and 0.079 m, respectively, and the average positioning time of the system is 72.1 ms. This is sufficient for robot operation in greenhouse scenarios that require precise positioning and navigation.
    Type of Medium: Online Resource
    ISSN: 1424-8220
    Language: English
    Publisher: MDPI AG
    Publication Date: 2022
    detail.hit.zdb_id: 2052857-7
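The UWB/IMU/ODOM fusion described in the abstract above can be sketched with a minimal EKF over a 2D position state. This is an illustration only, not the authors' implementation: the linear motion/measurement models and all noise matrices are assumptions.

```python
import numpy as np

def ekf_predict(x, P, u, Q):
    """Predict step: apply an odometry displacement u to the 2D position x.
    Process noise Q models the uncertainty added by wheel slip."""
    x = x + u            # linear motion model, so the Jacobian F is the identity
    P = P + Q            # covariance grows by the process noise
    return x, P

def ekf_update(x, P, z, R):
    """Update step: correct the prediction with a UWB position fix z.
    The measurement observes position directly (H = identity), noise R."""
    S = P + R                        # innovation covariance
    K = P @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ (z - x)              # corrected state
    P = (np.eye(len(x)) - K) @ P     # corrected covariance
    return x, P

# One predict/update cycle: odometry says we moved 1 m in x,
# a UWB fix at (1.2, 0.1) pulls the estimate toward the measurement.
x, P = np.zeros(2), np.eye(2)
x, P = ekf_predict(x, P, np.array([1.0, 0.0]), 0.1 * np.eye(2))
x, P = ekf_update(x, P, np.array([1.2, 0.1]), 0.05 * np.eye(2))
```

After the update, the fused estimate lies between the odometry prediction and the UWB fix, and the covariance drops below the measurement noise, which is how the fusion suppresses slip-induced drift.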
  • 2
    Online Resource
    In: Agriculture, MDPI AG, Vol. 12, No. 12 (2022-11-28), p. 2039-
    Abstract: Pineapple processing is currently a largely manual task with high labor costs and low operational efficiency. Precisely detecting and locating pineapple eyes is critical to automating their removal. In this paper, machine vision and automatic control technology are used to build a pineapple eye recognition and positioning test platform: the YOLOv5l target detection algorithm rapidly identifies pineapple eyes in images, a 3D localization algorithm based on multi-angle image matching obtains their 3D positions, and a CNC precision motion system drives a probe into each pineapple eye to verify the recognition and positioning algorithms. The recognition experiments demonstrate that the mAP reached 98% and that the average time to detect one pineapple eye image was 0.015 s. In the probe tests, the average deviation between the actual center of a pineapple eye and the probe penetration point was 1.01 mm, the maximum was 2.17 mm, and the root mean square value was 1.09 mm, which meets the positioning accuracy requirements of actual pineapple eye-removal operations.
    Type of Medium: Online Resource
    ISSN: 2077-0472
    Language: English
    Publisher: MDPI AG
    Publication Date: 2022
    detail.hit.zdb_id: 2651678-0
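The probe-test metrics reported in the abstract above (average, maximum, and root-mean-square deviation between eye centers and penetration points) can be computed as follows. This is a generic sketch; the function name and sample coordinates are illustrative, not from the paper.

```python
import numpy as np

def deviation_stats(targets, hits):
    """Mean, maximum, and root-mean-square of the Euclidean deviation
    between detected target centres and probe penetration points."""
    d = np.linalg.norm(np.asarray(hits, float) - np.asarray(targets, float), axis=1)
    return d.mean(), d.max(), float(np.sqrt(np.mean(d ** 2)))

# Two probe trials: one hit is off by a 3-4-5 offset (0.5 mm), one is exact.
mean_dev, max_dev, rms_dev = deviation_stats([[0, 0], [1, 1]],
                                             [[0.3, 0.4], [1, 1]])
```

Note that the RMS value weights large misses more heavily than the mean, which is why the paper reports both alongside the maximum.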
  • 3
    In: Applied Sciences, MDPI AG, Vol. 12, No. 8 ( 2022-04-10), p. 3810-
    Abstract: Accurate identification of field pests is crucial to decision-making in integrated pest control. Most current research focuses on identifying pests on sticky cards or in scenes where the target differs sharply from the background; little work addresses field pests with protective coloration. To overcome the difficulty of identifying such pests in complex field environments, a field pest identification method based on near-infrared imaging technology and YOLOv5 is proposed in this paper. First, an appropriate infrared filter and ring light source are selected to build an image acquisition system, using the wavelength at which the spectral reflectance curves of the pest (Pieris rapae) and its host plant (cabbage) differ the most. Then, field pest images are collected to construct a data set, which is trained and tested with YOLOv5. Experimental results demonstrate that the average time required to detect one pest image is 0.56 s, and the mAP reaches 99.7%.
    Type of Medium: Online Resource
    ISSN: 2076-3417
    Language: English
    Publisher: MDPI AG
    Publication Date: 2022
    detail.hit.zdb_id: 2704225-X
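The filter-selection step in the last abstract, picking the wavelength with the largest reflectance difference between pest and host plant, amounts to an argmax over the two spectral curves. A minimal sketch, with made-up wavelengths and reflectance values for illustration:

```python
import numpy as np

def best_wavelength(wavelengths, pest_reflectance, plant_reflectance):
    """Return the wavelength at which the spectral reflectance curves of the
    pest and its host plant differ the most; this wavelength would guide the
    choice of an infrared band-pass filter."""
    diff = np.abs(np.asarray(pest_reflectance, float)
                  - np.asarray(plant_reflectance, float))
    return wavelengths[int(np.argmax(diff))]

# Toy curves sampled at three wavelengths (nm): the gap peaks at 800 nm.
chosen = best_wavelength([700, 800, 900],
                         [0.20, 0.50, 0.40],   # pest reflectance
                         [0.25, 0.10, 0.35])   # plant reflectance
```

Maximizing the reflectance gap maximizes the pest/background contrast in the captured images, which is what makes the downstream YOLOv5 detection of protectively colored pests feasible.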