Academic Journal of Humanities & Social Sciences, 2025, 8(9); doi: 10.25236/AJHSS.2025.080902.

Designing Intimacy: Applying Norman’s Emotional Design Theory to Human-AI Companionship

Author: Yinyuan Peng (corresponding author)

Affiliation: Institute of Artificial Intelligence, Guangzhou Huashang College, Guangzhou, China

Abstract

AI companions, powered by advances in large language models and conversational AI, are increasingly designed not only for functional utility but also for emotional engagement. As these systems evolve into entities capable of simulating intimacy, understanding the design mechanisms behind human-AI emotional relationships becomes critical. This study applies Norman's three levels of emotional design to analyse how affective bonds with AI companions are formed, sustained, and deepened. By mapping each design level to stages of emotional interaction, the framework reveals the psychological processes and ethical challenges involved, including authenticity, anthropomorphism, and projected intimacy. The analysis highlights the need for interdisciplinary approaches that integrate design theory, psychology, and ethics, offering a structured basis for creating emotionally intelligent AI companions with both engagement and responsibility in mind.

Keywords

AI Companion; Emotional Design; Human-AI Interaction; Artificial Intelligence

Cite This Paper

Yinyuan Peng. Designing Intimacy: Applying Norman’s Emotional Design Theory to Human-AI Companionship. Academic Journal of Humanities & Social Sciences (2025), Vol. 8, Issue 9: 7-13. https://doi.org/10.25236/AJHSS.2025.080902.
