The hand-eye calibration of laser profilers and industrial robots is a critical component of the laser vision system in welding applications. To improve calibration accuracy and efficiency, this study proposes a position-constrained calibration compensation algorithm aimed at optimizing the hand-eye transformation matrix. Initially, the laser profiler is mounted on the robot and used to scan a standard sphere from various poses to obtain the theoretical center coordinates of the sphere, which are then utilized to compute the hand-eye transformation matrix. Subsequently, the positional data of the standard sphere's surface are collected at different poses using the welding gun tip mounted on the robot, allowing for the fitting of the sphere's center coordinates as calibration values. Finally, by minimizing the error between the theoretical and calibrated sphere center coordinates, the optimal hand-eye transformation matrix is derived. Experimental results demonstrate that, following error compensation, the average distance error in hand-eye calibration decreased from 4.5731 mm to 0.7069 mm, indicating that the proposed calibration method is both reliable and effective.
DOI: http://dx.doi.org/10.3390/s24237554
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11644738
Sensors (Basel)
November 2024
Department of Mechanical Engineering, Tsinghua University, Beijing 100084, China.
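The abstract above fits the standard sphere's center from surface points probed at different poses and then minimizes the error between theoretical and fitted centers. The paper's exact formulation is not reproduced here; as a rough illustration of the sphere-center fitting step only, the following is a minimal sketch of the standard algebraic least-squares sphere fit (the function name and setup are illustrative, not from the paper):

```python
import numpy as np

def fit_sphere_center(points):
    """Algebraic least-squares sphere fit from surface points.

    Each surface point p satisfies |p - c|^2 = r^2, which rearranges to
    the linear equation 2*p.c + (r^2 - |c|^2) = |p|^2. Stacking one row
    per point gives an overdetermined linear system solved with lstsq.
    Returns the fitted center c and radius r.
    """
    P = np.asarray(points, dtype=float)
    # Unknowns: x = [cx, cy, cz, d] with d = r^2 - |c|^2
    A = np.hstack([2.0 * P, np.ones((P.shape[0], 1))])
    b = np.sum(P**2, axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = np.sqrt(x[3] + center @ center)
    return center, radius
```

Centers fitted this way from the probed surface data would then serve as the calibration values against which the profiler-derived centers are compared.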
Comput Methods Programs Biomed
February 2025
Surgical Performance Enhancement and Robotics (SuPER) Centre, Department of Surgery, McGill University, 1650 Cedar Avenue, Montreal QC H3G 1A4, Canada.
Background: Mixed-reality-assisted surgery has become increasingly prominent, offering real-time 3D visualization of target anatomy such as tumors. These systems facilitate translating preoperative 3D surgical plans to the patient's body intraoperatively and allow for interactive modifications based on the patient's real-time conditions. However, achieving sub-millimetre accuracy in mixed-reality (MR) visualization and interaction is crucial to mitigate device-related risks and enhance surgical precision.
J Dent
November 2024
Department of Oral Surgery, School of Medicine, Dentistry and Nursing, College of Medical, Veterinary and Life Sciences, University of Glasgow, Glasgow, United Kingdom.
Objectives: To assess the feasibility and accuracy of a new prototype robotic implant system for the placement of zygomatic implants in edentulous maxillary models.
Methods: The study was carried out on eight plastic models. Cone beam computed tomographs were captured for each model to plan the positions of zygomatic implants.
Sensors (Basel)
September 2024
Shenyang Institute of Automation, Chinese Academy of Sciences, Nanta Street 114, Shenyang 110016, China.
Int J Comput Assist Radiol Surg
September 2024
School of Biomedical Engineering, Western University, London, Ontario, Canada.
Purpose: Optical-see-through head-mounted displays have the ability to seamlessly integrate virtual content with the real world through a transparent lens and an optical combiner. Although their potential for use in surgical settings has been explored, their clinical translation is sparse in the current literature, largely due to their limited tracking capabilities and the need for manual alignment of virtual representations of objects with their real-world counterparts.
Methods: We propose a simple and robust hand-eye calibration process for the depth camera of the Microsoft HoloLens 2, utilizing a tracked surgical stylus fitted with infrared reflective spheres as the calibration tool.
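The calibration described above aligns stylus-tip positions observed by the HoloLens 2 depth camera with the same points reported by the external tracker. The authors' specific procedure is not detailed in this excerpt; a common building block for such paired-point alignment is rigid registration via the Kabsch/Umeyama SVD method, sketched below (names and setup are illustrative assumptions, not the paper's code):

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid registration (Kabsch/Umeyama).

    Finds rotation R and translation t minimizing sum |R @ src_i + t - dst_i|^2
    over paired 3D points. src and dst are (N, 3) arrays of corresponding points.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t
```

Given corresponding stylus-tip points in the depth-camera and tracker frames, this yields the rigid transform between the two coordinate systems; the reflection guard keeps the solution a proper rotation.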