Research on Skeleton Data Compensation of Gymnastics based on Dynamic and Static Two-dimensional Regression using Kinect
DOI: https://doi.org/10.2478/msr-2022-0036
Keywords: Azure Kinect, motion capture, motion tracking, motion compensation
Abstract
The intelligent training and assessment of gymnastics movements require studying motion trajectories and reconstructing character animation. Microsoft Kinect is widely used for this purpose because of its low price and high frame rate, but its optical characteristics make it inevitably susceptible to illumination and occlusion, so data noise must be reduced by dedicated algorithms. Most existing research focuses on local motion and gives little consideration to the whole human skeleton. Based on an analysis of the spatial characteristics of gymnastics and the movement principles of the human body, this paper proposes a dynamic and static two-dimensional regression compensation algorithm. First, the constraint characteristics of human skeleton motion were analyzed, and a maximum constraint table and a Mesh Collider were established. Then, the dynamic acceleration of skeleton motion and the spatial characteristics of static limb motion were calculated from the adjacent effective skeleton frames before and after the collision. Finally, least-squares polynomial fitting was used to compensate and correct the lost skeleton coordinate data, yielding smooth and physically plausible human skeleton animation. Two experiments showed that the reconstructed skeleton points resolve the data loss caused by Kinect optical occlusion. The data compensation time for a blocked skeleton point can reach 180 ms, with an average error of about 0.1 mm, demonstrating good compensation performance for motion data acquisition and animation reconstruction.
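To give a concrete picture of the final compensation step, the sketch below (Python with NumPy, not taken from the paper) fills an occlusion gap in a single joint trajectory by least-squares polynomial fitting over the valid frames on either side of the gap, roughly mirroring the paper's idea of compensating lost skeleton coordinates from adjacent effective frames. The window size "k", the polynomial degree, and the convention of marking lost frames with NaN are illustrative assumptions, not details from the paper.

import numpy as np

def fill_gap(times, values, gap, k=5, degree=2):
    """Fit a least-squares polynomial to up to k valid samples on each side
    of one occlusion gap and evaluate it at the missing timestamps."""
    before = np.arange(max(0, gap[0] - k), gap[0])
    after = np.arange(gap[-1] + 1, min(len(times), gap[-1] + 1 + k))
    support = np.concatenate([before, after])
    support = support[~np.isnan(values[support])]   # keep only effective (measured) frames
    if support.size == 0:
        return values[gap]                          # nothing to fit against, leave gap as-is
    deg = min(degree, support.size - 1)             # avoid over-parameterized fits
    coeffs = np.polyfit(times[support], values[support], deg)
    return np.polyval(coeffs, times[gap])

def compensate_joint_track(times, coords, k=5, degree=2):
    """times: (N,) frame timestamps; coords: (N, 3) x/y/z positions of one joint,
    with NaN rows where the sensor lost the joint to occlusion (assumed convention)."""
    original = coords.astype(float)
    filled = original.copy()
    valid = ~np.isnan(original).any(axis=1)
    missing = np.where(~valid)[0]
    if missing.size == 0:
        return filled
    # group consecutive missing frames into contiguous gaps
    gaps = np.split(missing, np.where(np.diff(missing) > 1)[0] + 1)
    for gap in gaps:
        for axis in range(3):                       # fit each coordinate axis separately
            filled[gap, axis] = fill_gap(times, original[:, axis], gap, k, degree)
    return filled

Keeping the polynomial degree low avoids oscillations in the fitted segment, which is consistent with the paper's goal of smooth skeleton animation under bounded joint motion; the actual algorithm additionally applies the maximum constraint table and Mesh Collider checks described above.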
License
Copyright (c) 2022 Slovak Academy of Sciences - Institute of Measurement Science
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.