In this paper, we propose DailyLLM, the first activity log generation and summarization system that comprehensively integrates four dimensions of contextual information (location, motion, environment, and physiological signals) using only the common sensors available on smartphones and smartwatches. DailyLLM enables high-dimensional activity understanding by combining a lightweight LLM-based framework with structured prompting and efficient feature extraction, overcoming the limitations of existing methods in accuracy, efficiency, and semantic richness. Using only a 1.5B-parameter LLM, DailyLLM achieves a 17% improvement in BERTScore precision and nearly 10x faster inference than the state-of-the-art method built on a 70B-parameter model.