Agricultural sensor detects crops by their vibrations, offering an alternative approach for farm robots

New robotic agricultural sensor could revolutionize farming
Credit: Carnegie Mellon University

Farmers might be able to get help tending and harvesting crops using a new sensing technology from Carnegie Mellon University's Robotics Institute (RI). Researchers have invented a tool called SonicBoom that can locate crops like apples from the vibrations produced when the device touches them. The novel technology, still in the early stages of development, may someday be used by farm robots for tasks like pruning vines or locating ripe apples hidden among the leaves.

"Even without a camera, this sensing technology could determine the 3D shape of things just by touching," said Moonyoung (Mark) Lee, a fifth-year Ph.D. student in robotics.

A paper describing this technology appears in IEEE Robotics and Automation Letters.

The device might be the answer to a manipulation problem that has long befuddled agricultural robotics researchers. Farm workers can simply thrust their hands through the leaves toward what looks like an apple and use their sense of touch to grasp the fruit. But robots depend solely on cameras to guide their arms and manipulators, said Lee.

"One of the reasons manipulation in an agricultural setting is so difficult is because you have so much clutter—leaves hanging everywhere—and that blocks a lot of visual inputs," Lee said. In an orchard, "the fruit itself can be partially occluded and the path the arm must take to reach it can be very occluded."


More durable, cheaper technology

SonicBoom solves a problem that existing farming robots face—delicate and cumbersome sensors. Tiny, camera-based tactile sensors, encased in protective gel, can quickly wear out or suffer damage when in frequent contact with plants. Pressure sensors, another current option, have to be applied to large areas of the robot arm, making the approach impractically expensive.

By contrast, SonicBoom relies on contact microphones, which sense audio vibrations through direct contact with an object rather than through the air like a conventional microphone.

Contact microphones aren't top-of-mind for most robotics researchers, Lee said, but his advisor, RI Associate Professor Oliver Kroemer, used the devices to perform classification tasks, such as identifying the properties of materials.


How it works

The research team used an array of six contact microphones placed inside a piece of PVC pipe. When the pipe touches an object, such as a tree branch, the microphones detect the resulting vibration. By analyzing the differences in the signals picked up by each microphone, the researchers were able to triangulate where the contact took place. SonicBoom can localize contacts with a precision between 0.43 and 2.2 centimeters.
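To make the triangulation idea concrete, here is a minimal sketch that assumes six microphones at known positions along the pipe and estimates the contact point from an energy-weighted average: microphones closer to the contact pick up stronger vibrations. The positions, the weighting scheme, and the synthetic signals are illustrative assumptions, not the paper's actual method, which learns the mapping from data as described below.

```python
import numpy as np

# Hypothetical positions (in cm) of six contact microphones along the pipe axis.
MIC_POSITIONS_CM = np.array([5.0, 15.0, 25.0, 35.0, 45.0, 55.0])

def localize_contact(signals: np.ndarray) -> float:
    """Estimate the contact position along the pipe from microphone signals.

    signals: array of shape (6, n_samples), one vibration waveform per microphone.
    Returns an estimated contact position in cm, computed as the energy-weighted
    average of the microphone positions (louder channels pull the estimate closer).
    """
    energy = np.sqrt(np.mean(signals ** 2, axis=1))  # RMS energy per channel
    weights = energy / energy.sum()
    return float(np.dot(weights, MIC_POSITIONS_CM))

# Example with synthetic signals where the microphone at 25 cm is loudest.
rng = np.random.default_rng(0)
fake = rng.normal(0, 0.01, size=(6, 4096))
fake[2] += rng.normal(0, 0.2, size=4096)
print(f"Estimated contact at ~{localize_contact(fake):.1f} cm")
```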

The PVC pipe protects the contact microphones from damage. It also gives the device the appearance of a microphone boom, inspiring the name SonicBoom. Ultimately, the microphones could be installed inside a robotic arm.

The researchers trained a data-driven machine learning model to map the signals from the microphones to contact locations. To do so, they collected audio data from 18,000 contacts between the sensor and a wooden rod.
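As an illustration of that kind of learned mapping, the sketch below extracts spectral features from six-channel audio clips and fits a small regressor that predicts contact coordinates. The feature extraction, the MLPRegressor stand-in, and the synthetic data are assumptions made for this sketch; the paper's actual model, dataset, and accuracy are described in the publication.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

def spectral_features(clips: np.ndarray, n_bins: int = 32) -> np.ndarray:
    """Flatten each 6-channel clip into its low-frequency magnitude-spectrum bins."""
    spectra = np.abs(np.fft.rfft(clips, axis=-1))[..., :n_bins]
    return spectra.reshape(clips.shape[0], -1)

# Synthetic stand-in data; the real dataset contained roughly 18,000 contacts.
rng = np.random.default_rng(1)
n_contacts, n_samples = 2000, 2048
clips = rng.normal(size=(n_contacts, 6, n_samples)).astype(np.float32)   # audio per contact
positions = rng.uniform(0, 60, size=(n_contacts, 2)).astype(np.float32)  # contact (x, y) in cm

X_train, X_test, y_train, y_test = train_test_split(
    spectral_features(clips), positions, test_size=0.2, random_state=0
)

# A small multilayer perceptron regressor standing in for the paper's learned model.
model = MLPRegressor(hidden_layer_sizes=(256, 128), max_iter=200, random_state=0)
model.fit(X_train, y_train)

err_cm = np.linalg.norm(model.predict(X_test) - y_test, axis=1).mean()
print(f"Mean localization error on held-out contacts: {err_cm:.2f} cm")
```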

With its current setup, SonicBoom determines the location of hard or rigid objects. Changing its configuration should enable it to also sense less rigid objects, such as soft fruits and vegetables, Lee said. He has also led subsequent research, posted to the arXiv preprint server, that explores the array's ability to identify the object, not just its location.

Though SonicBoom was developed for agricultural use, Lee can imagine it in other applications, such as a safety device for robots that operate near people or robots explicitly designed to interact with humans. It could also be used in dark environments where cameras are of little help.

In addition to Lee and Kroemer, the research team included Ph.D. student Uksang Yoo and RI faculty members Jean Oh, Jeffrey Ichnowski and George Kantor.

More information: Moonyoung Lee et al, SonicBoom: Contact Localization Using Array of Microphones, IEEE Robotics and Automation Letters (2025). DOI: 10.1109/LRA.2025.3576067

Ryan Spears et al, Audio-Visual Contact Classification for Tree Structures in Agriculture, arXiv (2025). DOI: 10.48550/arXiv.2505.12665

Citation: Agricultural sensor detects crops by their vibrations, offering an alternative approach for farm robots (2025, August 14) retrieved 14 August 2025 from https://phys.org/news/2025-08-agricultural-sensor-crops-vibrations-alternative.html
