Biometric painting: Integrating biosensor data into the creative process
Abstract
Art has long been a medium of self-expression, evolving alongside technological advances. Biometric painting feeds physiological signals directly into the artistic process; by bridging the gap between the artist's internal emotional state and the painting's visual depiction, this fusion offers an innovative approach to examining and expressing human emotions. The objective of this work is to investigate biometric painting by integrating biosensor data into the creative process and using that data to enable emotion recognition within it. A biometric painting system was created that uses users' real-time biosensor data to generate visual components representing their emotional and physical states. The sensor data are preprocessed with a median filter to remove noise, and features are then extracted using the wavelet transform (WT). The research introduces an Intelligent Remora Optimized Flexible Deep Belief Network (IRO-FDBN) to recognize emotion in biometric painting from biosensor data. The results indicate that the proposed model outperforms existing emotion recognition models. The approach emphasizes the seamless combination of visual and affective feedback, allowing audiences to engage with the artwork at a deeper level. This provides a foundation for incorporating biosensor data into the creative process, advancing artistic exploration and effective content development.
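To make the preprocessing and feature-extraction stages concrete, the following is a minimal Python sketch using SciPy and PyWavelets. The sampling rate, median-filter kernel size, wavelet family (db4), decomposition level, and per-band statistics are illustrative assumptions, not parameters reported by the study; the resulting feature vector would be passed to the emotion classifier (the IRO-FDBN), which is not reproduced here.

```python
# Illustrative sketch only: kernel size, wavelet family, decomposition level,
# and the statistics used as features are assumptions for demonstration.
import numpy as np
import pywt
from scipy.signal import medfilt


def extract_features(raw_signal: np.ndarray, kernel_size: int = 5,
                     wavelet: str = "db4", level: int = 4) -> np.ndarray:
    """Denoise a 1-D biosensor trace with a median filter, then build a
    feature vector from wavelet-transform (WT) coefficient statistics."""
    # Step 1: median filter to suppress impulsive sensor noise (preprocessing).
    denoised = medfilt(raw_signal, kernel_size=kernel_size)

    # Step 2: multi-level discrete wavelet decomposition (feature extraction).
    coeffs = pywt.wavedec(denoised, wavelet=wavelet, level=level)

    # Step 3: summarize each sub-band with simple statistics (assumed features).
    features = []
    for band in coeffs:
        features.extend([band.mean(), band.std(), np.abs(band).max()])
    return np.asarray(features)


if __name__ == "__main__":
    # Simulated 4-second physiological trace sampled at an assumed 128 Hz.
    rng = np.random.default_rng(0)
    signal = np.sin(np.linspace(0, 8 * np.pi, 512)) + 0.2 * rng.standard_normal(512)
    feats = extract_features(signal)
    print(feats.shape)  # feature vector that would feed the emotion classifier
```

Summarizing each wavelet sub-band with a few statistics is one common way to turn variable-length coefficient arrays into a fixed-length input for a classifier; the actual feature design used in the study may differ.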