This paper presents a gender classification model that uses color images of the periocular region, which is largely unaffected by factors such as makeup or disguise. The proposed CNN was evaluated on two eye datasets, CVBL and Female and Male, achieving high accuracies of 99% and 96%, respectively, while using a comparatively small number of learnable parameters (7,235,089). The model's performance was assessed with several evaluation metrics and compared against existing state-of-the-art techniques, demonstrating its effectiveness and suggesting practical applications in areas such as security and surveillance.