Background: Robotic-assisted surgery allows surgeons to perform many types of complex operations with greater precision than is possible with conventional surgery. Despite these advantages, in current systems the surgeon must communicate with the device directly and manually. For the robot to adjust parameters such as camera position on its own, the system needs to recognize automatically which task the surgeon is performing.

Methods: A distance-based time-series classification framework was developed that measures the dynamic time warping (DTW) distance between the temporal trajectories of the robot arms and classifies surgical tasks and gestures with a k-nearest-neighbor algorithm.
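For illustration only, the sketch below shows how such a distance-based classifier could be assembled in Python: a standard DTW distance between multivariate trajectories, followed by a k-nearest-neighbor vote over those distances. The synthetic trajectories, task labels, and parameter values are placeholders, not the data or implementation used in the study.

    import numpy as np

    def dtw_distance(a, b):
        """Dynamic time warping distance between two multivariate
        trajectories a (n x d) and b (m x d)."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = np.linalg.norm(a[i - 1] - b[j - 1])   # local Euclidean cost
                cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                     cost[i, j - 1],      # deletion
                                     cost[i - 1, j - 1])  # match
        return cost[n, m]

    def knn_classify(query, train_trajs, train_labels, k=1):
        """Label a query trajectory by majority vote among the k
        training trajectories with the smallest DTW distance."""
        dists = [dtw_distance(query, t) for t in train_trajs]
        nearest = np.argsort(dists)[:k]
        votes = [train_labels[i] for i in nearest]
        return max(set(votes), key=votes.count)

    # Toy usage with synthetic 3-D tool-tip trajectories and illustrative
    # task labels (placeholders, not the robotic-surgery kinematic data
    # analysed in the paper).
    rng = np.random.default_rng(0)
    train = [rng.standard_normal((50, 3)).cumsum(axis=0) for _ in range(6)]
    labels = ["suturing", "suturing", "knot_tying",
              "knot_tying", "needle_passing", "needle_passing"]
    query = rng.standard_normal((60, 3)).cumsum(axis=0)
    print(knn_classify(query, train, labels, k=3))

Because DTW aligns trajectories of different lengths and speeds, the nearest-neighbor vote can compare task executions that unfold at different paces, which is the property the framework relies on.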

Results: Results on real robotic surgery data show that the proposed framework outperformed state-of-the-art methods by up to 9% across three tasks and by 8% across gestures.

Conclusion: The proposed framework is robust and accurate. It can therefore be used to develop adaptive control systems that are more responsive to surgeons' needs by identifying the surgeon's next movements. Copyright © 2016 John Wiley & Sons, Ltd.

DOI: http://dx.doi.org/10.1002/rcs.1766
