This paper presents a deep learning-based approach for quantitatively analyzing gait characteristics, particularly rotational motion, in patients with Parkinson's disease. To overcome the limitations of existing clinical assessment tools, we continuously monitor patients' daily rotational motion using video captured in a home-like environment. Using FastPose and the Strided Transformer model, we extract 3D skeletal information and automatically quantify the rotational angles of the hip and knee joints. We validate the method on the Turn-REMAP and Turn-H3.6M datasets, achieving an accuracy of 41.6%, a mean absolute error (MAE) of 34.7 degrees, and a weighted precision of 68.3%. To our knowledge, this is the first study to quantify the rotational motion of patients with Parkinson's disease in a home environment using data from a single monocular camera.
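To illustrate the kind of quantification described above, the following is a minimal sketch of computing a joint angle from 3D skeletal keypoints. The helper name, the specific joint triplet, and the angle definition (angle at the middle joint between the two adjoining limb segments) are assumptions for illustration only; the paper's exact rotation-angle computation is not specified here.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (in degrees) at joint b formed by 3D points a-b-c,
    e.g. hip-knee-ankle to characterize the knee.
    Hypothetical helper; not the paper's exact definition."""
    v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    # Cosine of the angle between the two limb-segment vectors,
    # clipped to [-1, 1] to guard against floating-point drift.
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

# A fully extended leg (hip above knee above ankle) gives ~180 degrees.
print(joint_angle([0, 2, 0], [0, 1, 0], [0, 0, 0]))  # 180.0
```

Applied per frame to the 3D skeleton output of the pose-estimation pipeline, such an angle series would yield the continuous rotational measurements the method validates against the datasets.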