RGB-D sensors can collect postural data in an automated way. However, applying these devices in real work environments requires overcoming problems such as limited accuracy and occlusion of body parts. This work presents the use of RGB-D sensors and genetic algorithms to optimize workstation layouts. RGB-D sensors capture workers' movements as they reach for objects on workbenches, and the collected data are then used to optimize the workstation layout by means of a genetic algorithm that considers multiple ergonomic criteria. Results show that the typical drawbacks of using RGB-D sensors for body tracking are not a problem for this application, and that combining them with intelligent algorithms can automate the layout design process. The described procedure can be used to automatically suggest new layouts when workers or production processes change, to adapt layouts to specific workers based on how they perform their tasks, or to obtain layouts simultaneously optimized for several production processes.
DOI: http://dx.doi.org/10.1016/j.apergo.2017.01.012
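The abstract does not specify the layout encoding or the fitness terms used. As a rough illustration, the sketch below shows how a genetic algorithm could assign objects to workbench slots while minimizing a frequency-weighted ergonomic reach cost of the kind that could be derived from captured postural data; the object counts, reach costs, and weights are all made up for the example.

```python
import random

# Hypothetical reach-cost table: reach_cost[object][slot] could be derived from
# RGB-D capture (e.g., mean reach distance or shoulder flexion when a worker
# reaches each candidate slot). Values here are illustrative only.
N_OBJECTS, N_SLOTS = 5, 8
random.seed(0)
reach_cost = [[random.uniform(0.0, 1.0) for _ in range(N_SLOTS)]
              for _ in range(N_OBJECTS)]
use_freq = [random.randint(1, 10) for _ in range(N_OBJECTS)]  # picks per work cycle

def fitness(layout):
    """Lower is better: frequency-weighted ergonomic cost of a layout.
    layout[i] is the slot assigned to object i; slots are unique."""
    return sum(use_freq[i] * reach_cost[i][slot] for i, slot in enumerate(layout))

def random_layout():
    return random.sample(range(N_SLOTS), N_OBJECTS)

def crossover(a, b):
    """Order-preserving crossover that keeps slot assignments unique."""
    cut = random.randint(1, N_OBJECTS - 1)
    child = a[:cut]
    child += [s for s in b if s not in child][:N_OBJECTS - cut]
    return child

def mutate(layout, rate=0.2):
    layout = layout[:]
    if random.random() < rate:
        i = random.randrange(N_OBJECTS)
        free = [s for s in range(N_SLOTS) if s not in layout]
        if free:
            layout[i] = random.choice(free)  # move one object to an empty slot
    return layout

def optimize(pop_size=40, generations=200):
    pop = [random_layout() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[:pop_size // 4]                      # keep the best quarter
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=fitness)

best = optimize()
print("best layout (object -> slot):", best, "cost:", round(fitness(best), 3))
```

Multiple ergonomic criteria (posture scores, frequency weighting, per-worker differences) would simply enter as additional weighted terms in `fitness`.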
Annu Int Conf IEEE Eng Med Biol Soc
July 2024
We introduce a novel integration of vision sensors in the form of a dual sensor system. The dual sensor system combines three distinct units: an optical localizer and two RGB-D sensors. This system is uniquely capable of concurrently providing real-time 3D shape information for deformable objects and accurate pose tracking data for medical instruments.
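As a minimal sketch of how such a system could express both data streams in one coordinate frame, the snippet below back-projects a depth image into 3D points and maps them into the optical localizer's frame via an assumed extrinsic calibration. All intrinsics, transforms, and frame names are illustrative assumptions, not values from the paper.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) into camera-frame 3D points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]                     # drop invalid (zero-depth) pixels

def to_localizer_frame(points_cam, T_loc_cam):
    """Apply a 4x4 rigid transform (from a hypothetical extrinsic calibration
    between an RGB-D camera and the optical localizer) to camera-frame points."""
    homog = np.hstack([points_cam, np.ones((len(points_cam), 1))])
    return (T_loc_cam @ homog.T).T[:, :3]

# Illustrative data: one RGB-D camera with assumed intrinsics/extrinsics, plus
# an instrument pose reported directly by the optical localizer.
depth_a = np.random.uniform(0.4, 1.2, (480, 640))
T_loc_camA = np.eye(4); T_loc_camA[:3, 3] = [0.10, 0.00, 0.50]
cloud_a = to_localizer_frame(depth_to_points(depth_a, 600, 600, 320, 240), T_loc_camA)
T_loc_tool = np.eye(4)                             # instrument pose from the localizer
print(cloud_a.shape, T_loc_tool[:3, 3])            # fused scene: surface points + tool pose
```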
Sensors (Basel)
February 2025
Department of Control and Information Systems, FEIT, University of Zilina, 010 26 Zilina, Slovakia.
This paper presents the implementation of ORB-SLAM3 for visual odometry on a low-power ARM-based system, specifically the Jetson Nano, to track a robot's movement using RGB-D cameras. Key challenges addressed include the selection of compatible software libraries, camera calibration, and system optimization. The ORB-SLAM3 algorithm was adapted for the ARM architecture and tested using both the EuRoC dataset and real-world scenarios involving a mobile robot.
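Camera calibration is one of the challenges named above. A minimal OpenCV sketch of obtaining the intrinsics that an ORB-SLAM3 RGB-D settings file expects is shown below; the checkerboard geometry, image directory, and the parameter names mentioned in the final comment are assumptions for illustration, not details from the paper.

```python
import glob
import numpy as np
import cv2

# Checkerboard inner-corner count and square size in metres (assumed values).
PATTERN = (9, 6)
SQUARE = 0.025

objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):              # hypothetical capture directory
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points,
                                         gray.shape[::-1], None, None)
# K and dist would then be copied into the ORB-SLAM3 YAML settings
# (Camera.fx, Camera.fy, Camera.cx, Camera.cy, distortion coefficients).
print("RMS reprojection error:", rms)
print("fx, fy, cx, cy:", K[0, 0], K[1, 1], K[0, 2], K[1, 2])
```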
Sensors (Basel)
February 2025
Data61, Commonwealth Scientific and Industrial Research Organisation (CSIRO), Canberra, ACT 2601, Australia.
High-resolution RGB-D sensors are widely used in computer vision, manufacturing, and robotics. The depth maps from these sensors have inherently high measurement uncertainty that includes both systematic and non-systematic noise. These noisy depth estimates degrade the quality of scans, resulting in less accurate 3D reconstruction, making them unsuitable for some high-precision applications.
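The paper's own noise model is not described in this abstract. The sketch below shows a generic baseline, not the authors' method: temporal median filtering over a short window followed by an edge-preserving bilateral filter, which targets non-systematic noise only; systematic bias would still require a per-sensor calibration model. Window size and filter parameters are arbitrary.

```python
import numpy as np
import cv2

def denoise_depth(frames, sigma_color=0.03, sigma_space=5):
    """Two-stage depth clean-up (a common baseline, not the paper's method).

    1. Temporal median over a short window suppresses random noise at static pixels.
    2. A bilateral filter smooths residual speckle without blurring depth edges.
    """
    stack = np.stack(frames).astype(np.float32)    # (N, H, W) depth in metres
    stack[stack <= 0] = np.nan                     # treat zero depth as missing
    median = np.nanmedian(stack, axis=0)
    median = np.nan_to_num(median, nan=0.0).astype(np.float32)
    return cv2.bilateralFilter(median, d=-1,
                               sigmaColor=sigma_color, sigmaSpace=sigma_space)

# Usage with synthetic noisy frames of a flat surface at 1 m:
frames = [1.0 + 0.01 * np.random.randn(480, 640) for _ in range(5)]
clean = denoise_depth(frames)
print("residual std:", float(clean[clean > 0].std()))
```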
Sensors (Basel)
January 2025
College of Electrical Engineering, Zhejiang University, Hangzhou 310027, China.
Navigating crowded environments poses significant challenges for mobile robots, particularly as traditional Simultaneous Localization and Mapping (SLAM)-based methods often struggle with dynamic and unpredictable settings. This paper proposes a visual target-driven navigation method using self-attention enhanced deep reinforcement learning (DRL) to overcome these limitations. The navigation policy is developed based on the Twin-Delayed Deep Deterministic Policy Gradient (TD3) algorithm, enabling efficient obstacle avoidance and target pursuit.
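The abstract does not detail the network, so the sketch below shows only the actor of a TD3-style deterministic policy with a self-attention layer over per-entity observation tokens; the token layout, layer sizes, and action space are assumptions. A full TD3 agent would add twin critics, target networks, exploration noise, and delayed policy updates.

```python
import torch
import torch.nn as nn

class AttentionNavPolicy(nn.Module):
    """Minimal sketch of a self-attention-enhanced TD3-style actor."""
    def __init__(self, token_dim=32, n_heads=4, action_dim=2):
        super().__init__()
        self.attn = nn.MultiheadAttention(token_dim, n_heads, batch_first=True)
        self.actor = nn.Sequential(
            nn.Linear(token_dim, 128), nn.ReLU(),
            nn.Linear(128, action_dim), nn.Tanh(),   # bounded velocity commands
        )

    def forward(self, tokens):
        # tokens: (batch, n_entities, token_dim) -- e.g., embeddings of the
        # visual target and nearby pedestrians/obstacles from the observation.
        ctx, _ = self.attn(tokens, tokens, tokens)   # self-attention over entities
        pooled = ctx.mean(dim=1)                     # aggregate scene context
        return self.actor(pooled)                    # deterministic action

policy = AttentionNavPolicy()
obs_tokens = torch.randn(1, 6, 32)                   # 1 scene, 6 entity tokens
action = policy(obs_tokens)                          # e.g., (linear, angular) velocity
print(action.shape)                                  # torch.Size([1, 2])
```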
Data Brief
April 2025
Brubotics, Vrije Universiteit Brussel, Brussels, Belgium.
3D pose estimation and gesture command recognition are crucial for ensuring safety and improving human-robot interaction. While RGB-D cameras are commonly used for these tasks, they often raise privacy concerns due to their ability to capture detailed visual data of human operators. In contrast, radar sensors offer a privacy-preserving alternative, as they output point-cloud data rather than images.
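As an illustration of working with point clouds instead of images, the sketch below pads one radar frame of detections to a fixed-size array suitable as input to a gesture classifier; the field layout (x, y, z, Doppler, intensity), point budget, and window length are assumptions, not the dataset's actual format.

```python
import numpy as np

def frame_to_fixed_input(points, max_points=64):
    """Pad/crop one radar frame to a fixed-size array for a gesture classifier.

    `points` is assumed to be an (N, 5) array of [x, y, z, doppler, intensity]
    detections, as an mmWave radar might report; fields are illustrative only.
    """
    out = np.zeros((max_points, 5), dtype=np.float32)
    if len(points):
        # keep the strongest returns if the frame has too many detections
        order = np.argsort(-points[:, 4])[:max_points]
        out[:len(order)] = points[order]
    return out

# Usage: a sliding window of frames forms the classifier input, never any image.
frames = [np.random.randn(np.random.randint(5, 40), 5) for _ in range(10)]
window = np.stack([frame_to_fixed_input(f) for f in frames])   # (10, 64, 5)
print(window.shape)
```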