As autonomous virtual avatars and robots are increasingly deployed in human collective activities (e.g., rehabilitation therapy, sports, and manufacturing), realistic human motion models are essential for designing the cognitive architectures and control strategies that drive these agents. In this study, we propose a novel data-driven approach to capturing individual human motor characteristics. First, we demonstrate that motion amplitude effectively characterizes individual motor behavior. We then propose a fully data-driven architecture based on long short-term memory (LSTM) neural networks that generates motions capturing the distinctive characteristics of specific individuals. We validate the architecture on real human data and show that, while state-of-the-art Kuramoto-like models fail to reproduce individual motor characteristics, the proposed model accurately reproduces the velocity distribution and amplitude envelope of the individuals it was trained on and distinguishes them from others.
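
To make the core idea concrete, the following is a minimal, purely illustrative sketch (not the paper's actual architecture, whose dimensions and training details are not given here) of how an LSTM can be rolled out autoregressively to generate a one-dimensional motion trajectory; all layer sizes, initializations, and the linear readout are hypothetical placeholders.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell with NumPy; dimensions are illustrative only."""
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        z = input_size + hidden_size
        # One weight matrix per gate: input, forget, candidate, output.
        self.Wi = rng.normal(0.0, 0.1, (z, hidden_size))
        self.Wf = rng.normal(0.0, 0.1, (z, hidden_size))
        self.Wc = rng.normal(0.0, 0.1, (z, hidden_size))
        self.Wo = rng.normal(0.0, 0.1, (z, hidden_size))
        self.hidden_size = hidden_size

    def step(self, x, h, c):
        z = np.concatenate([x, h])
        i = sigmoid(z @ self.Wi)   # input gate
        f = sigmoid(z @ self.Wf)   # forget gate
        g = np.tanh(z @ self.Wc)   # candidate cell state
        o = sigmoid(z @ self.Wo)   # output gate
        c = f * c + i * g          # updated cell state
        h = o * np.tanh(c)         # updated hidden state
        return h, c

def generate_motion(cell, readout, seed_pos, n_steps):
    """Autoregressive rollout: each predicted position is fed back as input."""
    h = np.zeros(cell.hidden_size)
    c = np.zeros(cell.hidden_size)
    x = np.array([seed_pos])
    traj = [seed_pos]
    for _ in range(n_steps):
        h, c = cell.step(x, h, c)
        x = np.array([h @ readout])  # hypothetical linear readout to position
        traj.append(float(x[0]))
    return np.array(traj)

cell = LSTMCell(input_size=1, hidden_size=8)
readout = np.random.default_rng(1).normal(0.0, 0.5, 8)
traj = generate_motion(cell, readout, seed_pos=0.1, n_steps=50)
print(traj.shape)
```

In a trained version of such a model, the weights would be fit to an individual's recorded trajectories so that rollouts reproduce that person's velocity distribution and amplitude envelope, which is the property the abstract evaluates.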