New autonomous robotic technology developed by Monash University researchers has the potential to become the ‘apple of the eye’ of Australia’s food industry as it deals with labour shortages and increased demand for fresh produce.
A research team, led by Dr Chao Chen in Monash University’s Department of Mechanical and Aerospace Engineering, has developed an autonomous harvesting robot capable of identifying, picking and depositing apples in as little as seven seconds at full capacity.
Following extensive trials in February and March at Fankhauser Apples in Drouin, Victoria, the robot was able to harvest more than 85 per cent of all reachable apples in the canopy as identified by its vision system.
Of all apples harvested, fewer than 6 per cent were damaged due to stem removal. Apples without stems can still be sold, but may not meet the cosmetic guidelines of some retailers.
With the robot limited to half its maximum speed, the median harvest rate was 12.6 seconds per apple. In streamlined pick-and-drop scenarios, the cycle time reduced to roughly nine seconds.
At the robot’s full speed, individual apple harvesting time can drop to as little as seven seconds.
“Our developed vision system can not only positively identify apples in a tree within its range in an outdoors orchard environment by means of deep learning, but also identify and categorise obstacles, such as leaves and branches, to calculate the optimum trajectory for apple extraction,” Dr Chen, the Director of Laboratory of Motion Generation and Analysis (LMGA), said.
Automated harvesting robots, while a promising technology for the agricultural industry, pose challenges for fruit and vegetable growers.
Robotic harvesting of fruit and vegetables requires the vision system to detect and localise the produce. To increase the success rate and reduce damage to the produce during harvesting, information on the fruit’s shape and on the location and orientation of the stem-branch joint is also required.
To address this problem, the researchers created a state-of-the-art motion-planning algorithm that rapidly generates collision-free trajectories, minimising processing and travel times between apples, reducing overall harvesting time and maximising the number of apples that can be harvested at a single location.
The robot’s vision system can identify more than 90 per cent of apples visible within the camera’s view from a distance of approximately 1.2 metres. The system works in all lighting and weather conditions, including intense sunlight and rain, and takes less than 200 milliseconds to process the image of an apple.
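The article does not publish the team’s code, but the two figures quoted above — a confidence cut-off on detections and a sub-200-millisecond processing budget — can be illustrated with a minimal sketch. Everything here (the function names, the 0.5 confidence threshold, the detection tuple format) is an assumption for illustration, not the Monash implementation:

```python
import time

LATENCY_BUDGET_S = 0.200      # per-image budget quoted in the article
CONFIDENCE_THRESHOLD = 0.5    # assumed cut-off, not from the article

def filter_detections(detections, threshold=CONFIDENCE_THRESHOLD):
    """Keep only detections whose confidence clears the threshold.

    Each detection is a (label, confidence, bbox) tuple produced by
    some upstream deep-learning model (not shown here).
    """
    return [d for d in detections if d[1] >= threshold]

def detect_apples(image, model, budget_s=LATENCY_BUDGET_S):
    """Run the model on one image and report whether it met the budget."""
    start = time.perf_counter()
    detections = filter_detections(model(image))
    elapsed = time.perf_counter() - start
    return detections, elapsed <= budget_s
```

In this framing, the “more than 90 per cent” figure would be a property of the trained model itself; the wrapper only enforces the confidence and latency constraints around it.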
“We also implemented a ‘path-planning’ algorithm that was able to generate collision-free trajectories for more than 95 per cent of all reachable apples in the canopy. It takes just eight seconds to plan the entire trajectory for the robot to grasp and deposit an apple,” Dr Chen said.
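As a rough intuition for what “collision-free trajectory planning” means here — and only as a toy stand-in, since the team’s actual planner works in continuous 3-D space with far richer obstacle models — a breadth-first search over a grid where obstacle cells stand in for leaves and branches captures the basic idea:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search for a collision-free path on a 2-D grid.

    grid[r][c] == 1 marks an obstacle cell (e.g. a branch); 0 is free.
    Returns the list of cells from start to goal, or None if the goal
    is unreachable. A toy stand-in for the robot's real planner.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}   # also serves as the visited set
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:   # walk back to the start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None
```

The “more than 95 per cent of all reachable apples” figure corresponds to the fraction of targets for which such a collision-free route can actually be found through the canopy.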
“The robot grasps apples with a specially designed, pneumatically powered, soft gripper with four independently actuated fingers and a suction system that grasps and extracts apples efficiently, while minimising damage to the fruit and the tree itself.
“In addition, the suction system draws the apple from the canopy into the gripper, reducing the need for the gripper to reach into the canopy and potentially damage its surroundings. The gripper can extract more than 85 per cent of all apples from the canopy that were planned for harvesting.”
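The pick described above is a fixed ordering of steps: suction draws the apple in, the fingers close, then the arm extracts and deposits the fruit. A hypothetical sketch of such a sequence, with state names and ordering assumed for illustration (the real controller is not described in this detail), might look like:

```python
# Assumed step names; the article only describes the behaviour, not the code.
PICK_SEQUENCE = [
    "approach",       # move the gripper near the apple
    "suction_on",     # draw the apple from the canopy into the gripper
    "fingers_close",  # four independently actuated fingers secure the fruit
    "extract",        # pull the apple free of the branch
    "deposit",        # release the apple into the collection bin
]

class GripperController:
    """Steps through the pick sequence, refusing out-of-order commands."""

    def __init__(self):
        self.step = 0
        self.log = []

    def advance(self, action):
        expected = PICK_SEQUENCE[self.step]
        if action != expected:
            raise ValueError(f"expected {expected!r}, got {action!r}")
        self.log.append(action)
        # Wrap around so the controller is ready for the next apple.
        self.step = (self.step + 1) % len(PICK_SEQUENCE)
```

Enforcing the order matters because suction before finger closure is what keeps the gripper from reaching deep into the canopy, per Dr Chen’s description.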
Dr Chen said the system could help address the current labour shortage in Australia’s agricultural sector, as well as future food-supply pressures from population growth and shrinking arable land. He said technological advances could also increase fruit productivity and attract younger people to work on farms.
The research team comprises Dr Chao Chen, Dr Wesley Au, Mr Xing Wang, Mr Hugh Zhou, and Dr Hanwen Kang in LMGA at Monash. The project is funded by the Australian Research Council Industrial Transformation Research Hubs scheme (ARC Nanocomm Hub – IH150100006).