“Be Right Back” may be the most poignant episode of Black Mirror. The opening scenes quickly establish that Ash and Martha are a couple in love, thanks in no small part to the performances of Hayley Atwell and Domhnall Gleeson. We meet the two in a moving van. They sing along to “If I Can’t Have You” (Yvonne Elliman), and then have a playful argument over the idea that Ash enjoys the Bee Gees. He ends up singing “How Deep Is Your Love,” and it is all very charming.
They arrive at their new home – which, it would seem, was Ash’s childhood home – and there is a moment when he gets lost in social media, for which she chastises him. He posts a picture of a framed photo, claiming it to be funny, but when he tells Martha the story behind it, it is actually rather sad.
The next day, they need to return the van they had rented. They were supposed to go together, but Martha has work to do, and so Ash goes alone. And never comes back.
Martha’s experience is something almost everyone can relate to, not just in terms of the sudden death of a loved one, but the waiting and worry that precede her learning what has happened. Often everything turns out fine, and there is some explanation other than the confirmation of one’s worst fears – but not in this case. When the police arrive, Martha answers the door and then shuts it and walks away. She hardly needs them to tell her that the conclusion she has already drawn is correct.
Death is a strange thing. In a sense it is always sudden, or unexpected. The person is simply gone. The mind does not quite know how to react, particularly when the person was a large part of one’s daily life. You’ll almost forget, thinking that you want to tell them something, and then remember. Grief is a difficult thing. Books have been written about it, but it’s not clear how much they help.
And, so, the appeal of the technology at the heart of “Be Right Back” is clear. Who wouldn’t want to be able to talk to their deceased loved one again?
The technology in question works by drawing on the deceased’s social media and public internet presence. If you like the result, you can then give it access to private information. Martha resists at first, but when she discovers she is pregnant and can’t get her sister on the phone, she decides to give it a go, and quickly becomes hooked. It begins with text messages and emails, and later she is talking to him on the phone. Part of the charm is that this AI Ash acknowledges the potential creepiness involved in the whole thing, in particular when it comes to the final step, which he says is still in beta testing, but which Martha ultimately signs up for: the placement of this version of her late husband in a very realistic robot.
She is clearly creeped out by him at first, but then things improve somewhat, before getting worse and worse. What is of interest in thinking about the episode is why.
He doesn’t respond sexually because he has no record of Ash in this regard. What he does have is programming derived from pornography, so she gets some enjoyment from that. He doesn’t need to sleep, and she is irked that he does not even pretend to do so convincingly. She tells him to get out of the house, and awakes to find him standing in the yard.
And, ultimately, it is this obsequiousness of the AI Ash that causes the biggest issue. He basically does what she says. He won’t fight with her like Ash would have, and so on. When she finally takes him to a cliff and tells him to jump off, he seems prepared to do so, until she points out that this is precisely the problem: the real Ash would have begged and pleaded. And, so, he begs and pleads.
Martha’s problem is distilled in her line that he is just a shell, with no history or depth. On the way to the cliff, he called the Bee Gees’ song on the radio cheesy (the very song the real Ash playfully sang in the opening scene), and earlier he held the framed photo in the house and called it funny (drawing, no doubt, on the social media post mentioned previously). He is only working from Ash’s online presence, and it’s not good enough. And, so, we have to ask: what if it were better? The AI here is limited by only having access to Ash’s internet history; what if it had access to more, perhaps enabled by something like the grain technology in “The Entire History of You”?
1) If we could have, and interact with, replicas of our deceased loved ones that were virtually indistinguishable from the people themselves, would that be a good idea?
Of course, to follow the premise of “Be Right Back,” we wouldn’t be talking about AI that possessed actual self-awareness… or would we, if they truly passed the Turing test?
2) What constitutes personal identity? Is it memory? Habit? A soul?
Philosophically speaking, the soul would be the thing that makes one substantially the same over the course of time. René Descartes may be an interesting reference point in this regard: I think, therefore I am. In other words, Descartes claimed that the fact of my thinking proves my being. Many who came after would insist that he moved far too quickly from that to the conclusion that there is a “thinking thing” that is me in a substantial sense, but the idea is worth considering with regard to the question put on the table by “Be Right Back.” If there is a soul in this sense, and if, further, having one requires being able to carry out the method of doubt that leads to Descartes’ cogito, then it would be clear that something like the AI Ash falls short of the mark.
On the other hand, David Hume contended that there is no such thing as the soul, because no experience of it is available. If I look into myself, I always only find my present mental state: I am tired, or sad, or angry, or thinking about Black Mirror right now, but this is always in flux. If the soul would be the thing that remains the same underlying all of these changes, an experience of it seems to be simply unavailable. Thus, Hume defined personal identity precisely in terms of habit. I am me, consistently, to the extent that I go along in the same way. I have habits of action, but also habits of thought, and so on. As human beings, we expect the future to resemble the past, and it is that same force of habit that gives rise to what is, at the end of the day, only the semblance of substantial self-identity over the course of time. Or, if there is any substance to the self, it is but this bundle of habits. On this view, the AI Ash, or a better version of the same, would seem to be on the same spectrum as the rest of us – particularly if it could learn, or exhibit novel behavior.
3) Is there real novelty? Can one do something absolutely new? If so, could it be possible for an artificial intelligence to do this?
After the events on the cliff, we cut to a few years down the line. Martha’s daughter is several years old, and the AI Ash is relegated to the attic. It becomes apparent that this is why Martha is keeping him around – so that their daughter can have some experience of her father. And, again, the appeal is visceral. As someone who lost a parent at a young age, I can hardly tell you how much I would want this. But would it be good? Perhaps there would be some positive impact on the psychology of the child, or it would mitigate the negative impact of the loss, but is there some limit here? Is the truth important? Is the AI Ash fundamentally a lie?
4) Is there an important lesson to learn from the death of others, and the acceptance thereof? If so, what might that be?
The most futuristic aspect of “Be Right Back” might be the realism of the body of the physical automaton that Martha orders, though even this does not seem too far off. The kind of social media presence used to enable the process already exists. One can already visit the Facebook pages of the dead. Is it creepy to keep them going? To tag them, or post on them? What if they could respond?