Plant leaf position estimation with computer vision

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Plant leaf position estimation with computer vision. / Beadle, James; Taylor, C. James; Ashworth, Kirsti; Cheneler, David.

In: Sensors, Vol. 20, No. 20, 5933, 20.10.2020.

Bibtex

@article{14265436d2124509b230556dd8f2d437,
title = "Plant leaf position estimation with computer vision",
abstract = "Autonomous analysis of plants, such as for phenotyping and health monitoring etc., often requires the reliable identification and localization of single leaves, a task complicated by their complex and variable shape. Robotic sensor platforms commonly use depth sensors that rely on either infrared light or ultrasound, in addition to imaging. However, infrared methods have the disadvantage of being affected by the presence of ambient light, and ultrasound methods generally have too wide a field of view, making them ineffective for measuring complex and intricate structures. Alternatives may include stereoscopic or structured light scanners, but these can be costly and overly complex to implement. This article presents a fully computer-vision based solution capable of estimating the three-dimensional location of all leaves of a subject plant with the use of a single digital camera autonomously positioned by a three-axis linear robot. A custom trained neural network was used to classify leaves captured in multiple images taken of a subject plant. Parallax calculations were applied to predict leaf depth, and from this, the three-dimensional position. This article demonstrates proof of concept of the method, and initial tests with positioned leaves suggest an expected error of 20 mm. Future modifications are identified to further improve accuracy and utility across different plant canopies.",
keywords = "neural network, computer vision, depth estimation, position estimation, parallax",
author = "James Beadle and Taylor, {C. James} and Kirsti Ashworth and David Cheneler",
year = "2020",
month = oct,
day = "20",
doi = "10.3390/s20205933",
language = "English",
volume = "20",
journal = "Sensors",
issn = "1424-8220",
publisher = "Multidisciplinary Digital Publishing Institute (MDPI)",
number = "20",
pages = "5933",
}
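The depth step described in the abstract — predicting leaf distance from the apparent shift of a leaf between images taken at known camera positions — corresponds to the standard pinhole-camera parallax relation Z = f·b/d, where f is the focal length in pixels, b the camera translation (baseline), and d the pixel disparity. A minimal sketch of that relation and the back-projection to a 3D position is below; the function names and the back-projection helper are illustrative, not taken from the paper.

```python
def parallax_depth(focal_px, baseline_mm, disparity_px):
    """Depth from camera translation (pinhole model): Z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

def leaf_position(cx, cy, u, v, focal_px, depth_mm):
    """Back-project pixel (u, v) to camera-frame coordinates at a known depth.

    (cx, cy) is the principal point; returns (x, y, z) in the same
    units as depth_mm.
    """
    x = (u - cx) * depth_mm / focal_px
    y = (v - cy) * depth_mm / focal_px
    return (x, y, depth_mm)

# Example: a 50 mm camera translation shifting a leaf by 25 px
# (focal length 1000 px) implies the leaf is 2000 mm away.
z = parallax_depth(1000.0, 50.0, 25.0)
pos = leaf_position(320.0, 240.0, 420.0, 240.0, 1000.0, z)
```

In practice the disparity would come from matching the same neural-network-detected leaf across image pairs, and the three-axis linear robot provides the known baseline between shots.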

RIS

TY - JOUR
T1 - Plant leaf position estimation with computer vision
AU - Beadle, James
AU - Taylor, C. James
AU - Ashworth, Kirsti
AU - Cheneler, David
PY - 2020/10/20
Y1 - 2020/10/20
N2 - Autonomous analysis of plants, such as for phenotyping and health monitoring etc., often requires the reliable identification and localization of single leaves, a task complicated by their complex and variable shape. Robotic sensor platforms commonly use depth sensors that rely on either infrared light or ultrasound, in addition to imaging. However, infrared methods have the disadvantage of being affected by the presence of ambient light, and ultrasound methods generally have too wide a field of view, making them ineffective for measuring complex and intricate structures. Alternatives may include stereoscopic or structured light scanners, but these can be costly and overly complex to implement. This article presents a fully computer-vision based solution capable of estimating the three-dimensional location of all leaves of a subject plant with the use of a single digital camera autonomously positioned by a three-axis linear robot. A custom trained neural network was used to classify leaves captured in multiple images taken of a subject plant. Parallax calculations were applied to predict leaf depth, and from this, the three-dimensional position. This article demonstrates proof of concept of the method, and initial tests with positioned leaves suggest an expected error of 20 mm. Future modifications are identified to further improve accuracy and utility across different plant canopies.
AB - Autonomous analysis of plants, such as for phenotyping and health monitoring etc., often requires the reliable identification and localization of single leaves, a task complicated by their complex and variable shape. Robotic sensor platforms commonly use depth sensors that rely on either infrared light or ultrasound, in addition to imaging. However, infrared methods have the disadvantage of being affected by the presence of ambient light, and ultrasound methods generally have too wide a field of view, making them ineffective for measuring complex and intricate structures. Alternatives may include stereoscopic or structured light scanners, but these can be costly and overly complex to implement. This article presents a fully computer-vision based solution capable of estimating the three-dimensional location of all leaves of a subject plant with the use of a single digital camera autonomously positioned by a three-axis linear robot. A custom trained neural network was used to classify leaves captured in multiple images taken of a subject plant. Parallax calculations were applied to predict leaf depth, and from this, the three-dimensional position. This article demonstrates proof of concept of the method, and initial tests with positioned leaves suggest an expected error of 20 mm. Future modifications are identified to further improve accuracy and utility across different plant canopies.
KW - neural network
KW - computer vision
KW - depth estimation
KW - position estimation
KW - parallax
U2 - 10.3390/s20205933
DO - 10.3390/s20205933
M3 - Journal article
VL - 20
JO - Sensors
JF - Sensors
SN - 1424-8220
IS - 20
M1 - 5933
ER -