A Visual Measurement Method for Large-Sized Parts

Bibliographic Details
Main Author: Junkai Yang
Format: Article
Language: English
Published: IEEE 2024-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10767705/
Description
Summary: To address the inconsistent standards and low efficiency of manual measurement of oversized parts, a machine vision-based measurement method is proposed for battery box parts in new energy vehicles. The method uses the edge-detection pixel data of the part image to search for and localize the regions of line and circle features with a modified Hough transform; a geometric calculation then computes the coordinates of the point cloud within each feature region. Drawing on the idea of graph neural networks, prior machining knowledge is used to establish correlations among the related dimensional features in the part drawing, and on this basis a model is proposed that validates and corrects dimensions by combining different features. An automatic acquisition platform for battery box part images is then built, part image samples are collected, and test experiments and algorithm comparisons are conducted. The experimental results show that the proposed machine vision-based dimensional measurement method improves measurement efficiency by more than 50 times over current manual practice, keeps the measurement error within engineering requirements, achieves significantly higher measurement accuracy than other methods, and shows good robustness.
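
As a rough illustration of the feature-localization step described in the summary, the sketch below detects line and circle features from edge pixels using OpenCV's standard Hough transforms. It is a minimal example under assumed conditions: the input file name and all thresholds are hypothetical, and cv2.HoughLinesP / cv2.HoughCircles stand in for, but are not, the modified Hough transform proposed in the paper.

```python
import cv2
import numpy as np

# Minimal sketch of Hough-based line/circle feature localization on a part image.
# Uses OpenCV's standard Hough transforms, not the paper's modified variant;
# file name and parameter values below are illustrative assumptions.

img = cv2.imread("battery_box_part.png", cv2.IMREAD_GRAYSCALE)  # hypothetical image
if img is None:
    raise SystemExit("image not found")

blurred = cv2.GaussianBlur(img, (5, 5), 0)

# Edge detection provides the pixel-point data fed to the Hough search.
edges = cv2.Canny(blurred, 50, 150)

# Line features (e.g. straight part edges) via the probabilistic Hough transform.
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=10)

# Circle features (e.g. mounting holes) via the Hough gradient method.
circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=30,
                           param1=150, param2=40, minRadius=5, maxRadius=60)

if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        print(f"line from ({x1}, {y1}) to ({x2}, {y2})")
if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"circle at ({x}, {y}) with radius {r} px")
```

The detected line segments and circle centers would then define the feature regions whose point-cloud coordinates are computed geometrically, as outlined in the summary.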
ISSN: 2169-3536