
Plant leaf position estimation with computer vision

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Article number: 5933
Journal publication date: 20/10/2020
Journal: Sensors
Issue number: 20
Volume: 20
Number of pages: 16
Publication status: Published
Original language: English

Abstract

Autonomous analysis of plants, for example for phenotyping and health monitoring, often requires the reliable identification and localization of single leaves, a task complicated by their complex and variable shape. Robotic sensor platforms commonly use depth sensors that rely on either infrared light or ultrasound, in addition to imaging. However, infrared methods have the disadvantage of being affected by the presence of ambient light, and ultrasound methods generally have too wide a field of view, making them ineffective for measuring complex and intricate structures. Alternatives include stereoscopic or structured light scanners, but these can be costly and overly complex to implement. This article presents a fully computer-vision-based solution capable of estimating the three-dimensional location of all leaves of a subject plant using a single digital camera autonomously positioned by a three-axis linear robot. A custom-trained neural network was used to classify leaves captured in multiple images taken of a subject plant. Parallax calculations were applied to predict leaf depth and, from this, the three-dimensional position. This article demonstrates a proof of concept of the method, and initial tests with positioned leaves suggest an expected error of 20 mm. Future modifications are identified to further improve accuracy and utility across different plant canopies.
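
The parallax-based depth estimation summarized in the abstract can be illustrated with a short sketch. The article does not publish its exact formulation here, so the function names, camera parameters, and the simple pinhole/stereo relation (depth = focal length × baseline / disparity) below are assumptions used purely for illustration, not the authors' implementation.

```python
import numpy as np

def leaf_depth_from_parallax(x_first_px, x_second_px, baseline_mm, focal_px):
    """Estimate the depth of a leaf centroid from its horizontal parallax.

    x_first_px / x_second_px : horizontal pixel coordinate of the same leaf
        centroid in two images taken before and after a known camera
        translation along one axis of the linear robot.
    baseline_mm              : distance the camera was moved between shots.
    focal_px                 : camera focal length expressed in pixels.

    Returns the estimated depth in millimetres (distance from the camera plane).
    """
    disparity_px = x_first_px - x_second_px
    if abs(disparity_px) < 1e-6:
        raise ValueError("Zero disparity: leaf is effectively at infinity")
    return focal_px * baseline_mm / disparity_px

def leaf_position_3d(u_px, v_px, depth_mm, cx_px, cy_px, focal_px):
    """Back-project a leaf centroid (u, v) at a known depth into 3-D
    camera coordinates using the pinhole camera model."""
    x_mm = (u_px - cx_px) * depth_mm / focal_px
    y_mm = (v_px - cy_px) * depth_mm / focal_px
    return np.array([x_mm, y_mm, depth_mm])

# Hypothetical example: a leaf centroid detected at x = 812 px and x = 760 px
# in two images taken 50 mm apart, with a focal length of 1400 px.
depth = leaf_depth_from_parallax(812.0, 760.0, baseline_mm=50.0, focal_px=1400.0)
print(f"Estimated leaf depth: {depth:.0f} mm")  # ~1346 mm
```

In this sketch the neural-network detection step is assumed to have already supplied matching leaf centroids in the two images; the parallax step then converts the pixel shift into a metric depth, which the back-projection turns into a three-dimensional position.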