This paper presents a systematic review of the integration of biosensors and multimodal learning analytics (MmLA) for analyzing and predicting learner behavior during computer-based learning sessions. Drawing on 54 primary studies, we examine how physiological signals, such as heart rate, brain activity, and eye movements, can be combined with traditional interaction data and self-reports to gain deeper insight into learners' cognitive states and engagement levels. We analyze commonly used methodologies, including machine learning algorithms and multimodal data preprocessing techniques, identify current research trends, limitations, and emerging directions, and discuss the transformative potential of biosensor-based adaptive learning systems. We suggest that multimodal data integration can enable real-time feedback and intelligent educational interventions, leading to more personalized and adaptive online learning experiences.