      Open Access

      Ethical Aspects of Faking Emotions in Chatbots and Social Robots

      Preprint


          Abstract

          Telling lies and faking emotions is quite common in human-human interactions: though there are risks, in many situations such behaviours provide social benefits. In recent years, many social robots and chatbots have been built that fake emotions or behave deceptively with their users. In this paper, I present a few examples of such robots and chatbots, and analyze their ethical aspects. Three scenarios are presented where some kind of lying or deceptive behaviour might be justified. Then five approaches to deceptive behaviours - no deception, blatant deception, tactful deception, nudging, and self-deception - are discussed and their implications are analyzed. I conclude by arguing that we need to develop localized and culture-specific solutions for incorporating deception in social robots and chatbots.


          Author and article information

          Preprint posted: 19 October 2023
          arXiv: 2310.12775
          License: http://creativecommons.org/licenses/by-nc-nd/4.0/

          Conference: Proceedings of Ro-MAN 2023, Busan, South Korea, Aug. 28-31, 2023
          Pages: 6
          Subject: cs.RO (Robotics)
