Can bots possess empathy?

By Nathan Lasnoski

I was perusing my Facebook feed the other day and came across an advertisement for Replika, an “Artificial Intelligence Friend” that offers a virtual personal relationship, providing a vehicle for self-expression, understanding, and feedback. We’ve talked about empathy in the context of bots or applications, but this is a whole new level. This is an artificial intelligence that offers to replicate a human-to-human relationship in some capacity. Absent the existence of Artificial General Intelligence, is this a good thing?

In my mind… “no”, it isn’t a good thing. The most important thing to remember about empathy in product development is that empathy is ONLY meaningful when it represents understanding from one person to another, not a complete substitute for that relationship. In other words, bots can convey empathy but do not possess empathy. Empathy is created by a human person receiving and conveying understanding to another human person. Let’s examine why.

Let’s imagine you have a friend who is going through a hard time. You don’t really have time for the friend, so you make only a half-hearted attempt at understanding their circumstance. The friend notices this and is hurt, because they needed to be seen and heard, but instead they felt ignored. Ask… why are they hurt? Is it because you didn’t say the right words, in the right order, or with the right duration? No. They were hurt because the human-to-human connection was broken.

Why didn’t they simply fulfill this relationship by talking to their pet? Studies show that people with pets live more fulfilled lives, so why not just replace the human relationship with the pet? Again… the pet is not a substitute for the empathy missing between you and your friend. It may temporarily ease the problem, but it doesn’t replace what is missing. Even in this scenario you have a living interaction that can partially convey connection.

There is a unique characteristic to the relationship between people because it represents a meeting of equals. Consider the closeness that exists between two people who have gone through a similar cancer diagnosis. There is a powerful closeness in the sharing of experiences that each person can convey. They not only experienced similar circumstances, but experienced them in a way bound up with their physical being. This is what we experience as people and why empathy between us is important.

So… why can’t bots have empathy? The reason a bot itself cannot have empathy is that its empathy is like saying the words without having the meaning. A bot’s empathy is only meaningful when it represents empathy from one person to another, or in aggregate, such as a company’s customer service organization to its customers. That empathy is only meaningfully received if it is conveyed in a way that is understood as genuine, coming from another person with honesty and integrity. Here is what I mean… without that, you’re missing the source (not to say that every bot engagement has a human puppet master, but that the bot doesn’t replace actually caring about the employee or customer).

So… should you not build your bot to include empathy? Far from it! Empathy is a necessary and required part of any bot development project. You should not, however, build empathy into a bot without care, honesty, and integrity in the company or person backing it up. The entire Customer Experience must look, feel, and smell truthful to the relationship you are trying to convey. Here is a working model, where the human informs the bot and can be part of its escalation process, consistent with the language and engagement in the process:
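As a rough illustration of that working model, here is a minimal sketch of a bot that answers what it can with language that respects the receiver, and warmly hands off to a human when it can’t. Everything here is hypothetical: the intents, replies, and the HR contact name are made up for illustration, not taken from any real product.

```python
# Hypothetical sketch of the escalation model described above: the bot
# handles known intents with empathetic language, and warmly hands off
# to a named human for anything beyond its scope.
# All intents, replies, and the HR contact are illustrative assumptions.

KNOWN_INTENTS = {
    "return_to_office": "We're glad you're back! Here's what to expect on your first day in.",
    "parking": "Happy to help with that. Visitor parking is on Level 2.",
}

def handle_message(intent: str, employee_name: str) -> dict:
    """Route a message: answer known intents, escalate the rest to a person."""
    if intent in KNOWN_INTENTS:
        return {
            "handled_by": "bot",
            "reply": f"Hi {employee_name}. {KNOWN_INTENTS[intent]}",
        }
    # Warm hand-off: acknowledge the person, name the human taking over,
    # and pass the conversation context along so they don't start over.
    return {
        "handled_by": "human",
        "reply": (
            f"That's a great question, {employee_name}, and I want to make sure "
            "you get a real answer. I'm connecting you with Jordan from HR, "
            "who will pick up right where we left off."
        ),
        "escalation": {"team": "HR", "context": {"intent": intent}},
    }
```

The design point is the hand-off: the bot never pretends to care about something it can’t actually help with; it routes to the people who do, carrying the context with it.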

For example, here is empathy conveyed in a Smart Restart bot, which tells the employee “we’re glad you’re back” and complements the HR team in welcoming employees:

Here also is the interaction, where you can see the richness of the conversation: not just “yes” and “no”, but language that respects the receiver. This also supports a warm hand-off to an HR representative to assist with a more complex question the bot can’t answer.

You can also see here an example of a Temperature Testing App that includes empathy as an element of the experience, acknowledging that this is an intrusive, annoying, icky experience.

Finally… does that exist in a virtual friend? At least for now, I think something is missing there, not because it cannot approximate a human conversation, or because the inventors are not well intentioned. Instead, it’s that interaction with the bot will not replace a human relationship, and may actually make seeking a human relationship less likely rather than more likely. Sorry Minuet, you aren’t quite a Data yet.

Nathan Lasnoski

Chief Technology Officer