Human 3D Pose Estimation with a Tilting Camera for Social Mobile Robot Interaction.

Sensors (Basel)

Machine Perception and Intelligent Robotics Group (MAPIR), Dept. of System Engineering and Automation Biomedical Research Institute of Malaga (IBIMA), University of Malaga, 29071 Málaga, Spain.

Published: November 2019

Human-Robot interaction represents a cornerstone of mobile robotics, especially within the field of social robots. In this context, user localization becomes of crucial importance for the interaction. This work investigates the capabilities of wide field-of-view RGB cameras to estimate the 3D position and orientation (i.e., the pose) of a user in the environment. For that, we employ a social robot endowed with a fish-eye camera hosted in a tilting head and develop two complementary approaches: (1) a fast method relying on a single image that estimates the user pose from the detection of their feet and does not require either the robot or the user to remain static during the reconstruction; and (2) a method that takes several views of the scene while the camera is being tilted and does not need the feet to be visible. Due to the particular setup of the tilting camera, special equations for 3D reconstruction have been developed. In both approaches, a CNN-based skeleton detector (OpenPose) is employed to identify humans within the image. A set of experiments with real data validates our two proposed methods, yielding results comparable to those of commercial RGB-D cameras while surpassing them in terms of coverage of the scene (wider FoV and longer range) and robustness to lighting conditions.
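The first approach (estimating user position from the detected feet) amounts to back-projecting the feet pixel into a 3D ray and intersecting it with the ground plane, given the known camera height and tilt angle. The sketch below illustrates this geometry with a simple pinhole model; the paper itself derives dedicated equations for the fish-eye lens and tilting head, so the intrinsics (`fx`, `fy`, `cx`, `cy`) and the frame conventions here are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def feet_to_ground_position(u, v, fx, fy, cx, cy, cam_height, tilt_rad):
    """Back-project a detected feet pixel to a 3D point on the ground plane.

    Pinhole approximation for illustration only. World frame: z up,
    y pointing away from the robot, camera at (0, 0, cam_height),
    tilted down by tilt_rad.
    """
    # Viewing ray in camera coordinates (x right, y down, z forward)
    d_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])

    # Map camera axes to world axes at zero tilt: x -> x, y -> -z, z -> y
    M = np.array([[1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0],
                  [0.0, -1.0, 0.0]])

    # Tilt the camera downwards by tilt_rad (rotation about the x-axis)
    c, s = np.cos(tilt_rad), np.sin(tilt_rad)
    R = np.array([[1.0, 0.0, 0.0],
                  [0.0, c,   s],
                  [0.0, -s,  c]])

    d_world = R @ (M @ d_cam)
    if d_world[2] >= 0:
        raise ValueError("Ray does not intersect the ground plane")

    # Intersect the ray from the camera center with the plane z = 0
    t = cam_height / -d_world[2]
    cam_center = np.array([0.0, 0.0, cam_height])
    return cam_center + t * d_world

# A camera 1 m high, tilted 45 degrees down, sees the ground 1 m
# ahead through its principal point.
p = feet_to_ground_position(320, 240, fx=500, fy=500, cx=320, cy=240,
                            cam_height=1.0, tilt_rad=np.pi / 4)
```

Note that this single-image method only needs one ray-plane intersection per detection, which is why it stays fast enough to work while both robot and user move.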

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6891307
DOI: http://dx.doi.org/10.3390/s19224943


Similar Publications

The failure of locked-segment landslides is associated with the destruction of locked segments that exhibit an energy accumulation effect. Understanding their failure mode and instability mechanism is therefore critical for landslide hazard prevention and control. In this paper, multiple instruments, such as tilt sensors, pore water pressure gauges, moisture sensors, matrix suction sensors, resistance strain gauges, miniature earth pressure sensors, a three-dimensional (3D) laser scanner, and a camera, were used to conduct physical model tests on a rainfall-induced arch locked-segment landslide and to analyze the resulting tilting deformation and evolution mechanism.

View Article and Find Full Text PDF

To achieve high-precision 3D reconstruction, a comprehensive improvement has been made to the binocular structured light calibration method. During the calibration process, the calibration object's imaging quality and the nonlinear optimization of the camera parameters directly affect the calibration accuracy. Firstly, to address the issue of poor imaging quality of the calibration object under tilted conditions, a pixel-level adaptive fill light method was designed using the programmable light intensity feature of the structured light projector, allowing the calibration object to receive uniform lighting and thus improving the quality of the captured images.


Oblique plane microscopy (OPM), a variant of light-sheet fluorescence microscopy (LSFM), enables rapid volumetric imaging without mechanically scanning the sample or an objective. In an OPM, the sample space is mapped to a distortion-free image space via remote focusing, and the oblique light-sheet plane is mapped via a tilted tertiary imaging system onto a camera. As a result, the 3D point-spread function and optical transfer function are tilted with respect to the optical axis of the tertiary imaging system.


YUTO MMS: A comprehensive SLAM dataset for urban mobile mapping with tilted LiDAR and panoramic camera integration.

Int J Rob Res

January 2025

Department of Earth and Space Science and Engineering, Lassonde School of Engineering, York University, Toronto, ON, Canada.

The York University Teledyne Optech (YUTO) Mobile Mapping System (MMS) Dataset, encompassing four sequences totaling 20.1 km, was thoroughly assembled through two data collection expeditions on August 12, 2020, and June 21, 2019. Acquisitions were performed using a uniquely equipped vehicle, fortified with a panoramic camera, a tilted LiDAR, a Global Positioning System (GPS), and an Inertial Measurement Unit (IMU), journeying through two strategic locations: the York University Keele Campus in Toronto and the Teledyne Optech headquarters in City of Vaughan, Canada.


Embedded CPU-GPU pupil tracking.

Biomed Opt Express

December 2024

Department of Ophthalmology, Stanford University, Palo Alto, CA 94303, USA.

We explore camera-based pupil tracking using high-level programming in computing platforms with end-user discrete and integrated central processing units (CPUs) and graphics processing units (GPUs), seeking low calculation latencies previously achieved with specialized hardware and programming (Kowalski et al., [Biomed. Opt.

