Optimizing plane detection in point clouds through line sampling

Bibliographic Details
Main Authors: José María Martínez-Otzeta, Jon Azpiazu, Iñigo Mendialdua, Basilio Sierra
Format: Article
Language: English
Published: Nature Portfolio, 2025-08-01
Series: Scientific Reports
Online Access: https://doi.org/10.1038/s41598-025-12660-w
Description
Summary: Plane detection in point clouds is a common step in interpreting environments within robotics. Mobile robotic platforms must interact efficiently and safely with their surroundings, which requires capabilities such as detecting walls to avoid collisions and recognizing workbenches for object manipulation. Since these environmental elements typically appear as planar surfaces, a fast and accurate plane detector is an essential tool for robotics practitioners. RANSAC (Random Sample Consensus) is a widely used technique for plane detection that iteratively evaluates candidate planes by sampling three points at a time from the point cloud. In this work, we present an approach that, rather than seeking planes directly, finds lines by sampling only two points at a time. This leverages the observation that a sampled pair of points is more likely to yield a line lying within the plane than a sampled triple is to define the plane itself. To estimate planes from these lines, an additional step fits a plane to each pair of lines. Experiments conducted on three datasets, two of which are public, demonstrate that our approach outperforms the traditional RANSAC method, achieving better results while requiring fewer iterations. A public repository containing the developed code is also provided.
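The two-point line-sampling idea described in the abstract can be sketched as follows. This is a minimal illustration of the general technique, not the authors' released implementation: sample lines by drawing two points at a time, then form a candidate plane from each pair of non-parallel lines (its normal is the cross product of the two line directions) and keep the plane with the most inliers. Function names, iteration counts, and thresholds here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_line(points):
    # Draw two distinct points; represent the line as (origin, unit direction).
    i, j = rng.choice(len(points), size=2, replace=False)
    d = points[j] - points[i]
    return points[i], d / np.linalg.norm(d)

def plane_from_lines(l1, l2, eps=1e-6):
    # Two non-parallel lines suggest a plane whose normal is the
    # cross product of their directions; pass the plane through p1.
    (p1, d1), (p2, d2) = l1, l2
    n = np.cross(d1, d2)
    norm = np.linalg.norm(n)
    if norm < eps:          # near-parallel lines: no stable normal
        return None
    n = n / norm
    return n, -np.dot(n, p1)    # plane as n·x + d = 0

def count_inliers(points, plane, tol=0.01):
    # Inliers are points within `tol` of the plane.
    n, d = plane
    return int(np.sum(np.abs(points @ n + d) < tol))

def detect_plane(points, n_lines=50, tol=0.01):
    # Line-sampling variant of RANSAC: sample lines with two points each,
    # then evaluate a candidate plane for every pair of lines.
    lines = [sample_line(points) for _ in range(n_lines)]
    best, best_score = None, -1
    for a in range(len(lines)):
        for b in range(a + 1, len(lines)):
            plane = plane_from_lines(lines[a], lines[b])
            if plane is None:
                continue
            score = count_inliers(points, plane, tol)
            if score > best_score:
                best, best_score = plane, score
    return best, best_score
```

On a cloud of 200 points lying on the plane z = 0 mixed with 50 random outliers, the sketch recovers a plane whose normal is close to (0, 0, ±1) with most of the planar points as inliers.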
ISSN:2045-2322