Real-Time Landmark-Based Face Analysis for Expression and Gender Classification
Abstract
This research developed a web-based, real-time facial analysis system to address the challenge of maintaining detection accuracy on dynamic video streams. Built on the face-api.js library with the Tiny Face Detector algorithm and a 68-point landmark model, the system simultaneously detects faces, classifies gender, and recognizes seven basic emotional expressions. Its main innovations are the automatic extraction of five Regions of Interest (eyes, eyebrows, nose, mouth, and jaw) and the presentation of confidence scores as a time-series graph. All analysis results are stored as a structured JSON dataset for further research. The implementation achieves high performance, with an average confidence above 90% for frontal faces under optimal lighting. The system maintains stable detection up to a 30-degree head tilt and processes frames without significant latency. Although low light intensity can reduce confidence by 15-20%, the architecture demonstrates that complex facial analysis can run effectively on minimal hardware resources.
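To make the pipeline concrete, the following TypeScript sketch shows how such a system can be wired together with face-api.js: loading the Tiny Face Detector and 68-point landmark weights, extracting the five ROIs named above, and accumulating per-frame records for JSON export. The model path `/models`, the `FrameRecord` shape, and the helper names (`analyzeFrame`, `exportDataset`) are illustrative assumptions, not the authors' exact implementation; only the face-api.js calls themselves come from the library's public API.

```typescript
import * as faceapi from 'face-api.js';

// Illustrative record shape; the paper's actual JSON schema is not specified.
interface FrameRecord {
  timestamp: number;
  gender: string;
  genderConfidence: number;
  expressions: Record<string, number>;
  rois: Record<string, faceapi.Point[]>;
}

const records: FrameRecord[] = [];

async function loadModels(): Promise<void> {
  // '/models' is an assumed hosting path for the pretrained weights.
  await faceapi.nets.tinyFaceDetector.loadFromUri('/models');
  await faceapi.nets.faceLandmark68Net.loadFromUri('/models');
  await faceapi.nets.faceExpressionNet.loadFromUri('/models');
  await faceapi.nets.ageGenderNet.loadFromUri('/models');
}

async function analyzeFrame(video: HTMLVideoElement): Promise<void> {
  const result = await faceapi
    .detectSingleFace(video, new faceapi.TinyFaceDetectorOptions())
    .withFaceLandmarks()
    .withFaceExpressions()
    .withAgeAndGender();
  if (!result) return; // no face detected in this frame

  // The five ROIs named in the abstract, read from the 68-point model.
  const lm = result.landmarks;
  const rois = {
    eyes: [...lm.getLeftEye(), ...lm.getRightEye()],
    eyebrows: [...lm.getLeftEyeBrow(), ...lm.getRightEyeBrow()],
    nose: lm.getNose(),
    mouth: lm.getMouth(),
    jaw: lm.getJawOutline(),
  };

  const e = result.expressions;
  records.push({
    timestamp: Date.now(),
    gender: result.gender,
    genderConfidence: result.genderProbability,
    // The seven basic expressions scored by the expression network.
    expressions: {
      neutral: e.neutral, happy: e.happy, sad: e.sad, angry: e.angry,
      fearful: e.fearful, disgusted: e.disgusted, surprised: e.surprised,
    },
    rois,
  });
}

// Serialize the accumulated per-frame records as a structured JSON dataset.
function exportDataset(): string {
  return JSON.stringify(records, null, 2);
}
```

Calling `analyzeFrame` on a fixed interval (for example, several times per second) yields the per-frame confidence series that can be plotted as the time-series graph described in the abstract.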