At 20fifty, our catalyst is the drive to Build What Matters. The line, however, must be held: build what matters responsibly. As we peer into the not-so-distant future, the conversation about Artificial Intelligence (AI) and ethics becomes unavoidable.
In this thought experiment, Catherine Lückhoff explores the diverse (and often unexpected) effects of the collection of personal data by AI-powered wearables and in-home devices – technology that already exists in 2022. Inspired by discussions with the Human Embodiment community on Exponential View, Catherine was asked by the community leader, Robbie Stamp – CEO of BIOSS International, Chairman of h2g2.com (The Earth Edition of the Hitchhiker’s Guide to the Galaxy) and a member of the BSI’s National Standing Committee on AI (SC 42, for the Hitchhiker fans!) – to consider the role AI could, and likely will, play in our personal interactions in the near future. We asked Robbie for a comment, and here’s what he had to say:
Life in 2026
It’s Friday afternoon. David and Sara walk into the kitchen, arms laden with groceries – plant-based, of course – with a sneaky piece of overpriced smoked salmon for David, who still craves sustainably caught animal protein. On the kitchen table, yesterday’s paper, dated Thursday, 27 May 2026, lies open on the travel section, the azure seas of the Maldives beckoning the duo towards a break from their routine. The two haven’t spoken much on the autonomous car ride home, both lost in their own thoughts of the week that was and the promise the weekend brings.
As they unpack their groceries, Sara asks: “Babe, did you remember to confirm the holiday yet? I know you must be getting the reminders as I asked Alexa to send you one a day until the task was complete. I know you; you need at least 10 reminders to get anything done.”
Recording Your Life, Live
That’s when David’s heartbeat increases by 20 beats a minute, his adrenaline spikes and, though he doesn’t know it, his pupils dilate and his sweat glands release what will soon become a fine sheen across his forehead and top lip. His LiFE tracker has already sent this data to Johari, and the built-in, state-of-the-art smart home system – complete with the Bose speakers he ripped a hole in his credit rating for – is quietly monitoring his tone of voice, intent and cadence. Every word is mapped against his private and open conversations recorded over the past 6 months, ever since they signed up to the AI counselling service.
David: “I did not.”
Sara: “Ugh, you always do this.”
David: “Do what?”
Sara: “Forget the stuff that actually matters. I mean, if this was a motorbike trip with Graham, you would have confirmed the whole thing months ago.”
David: “That’s such BS and you know it.”
AI and Accountability
Sara had been complaining about this very situation to her sister just two nights ago. Her sense that David was becoming less and less interested in their joint happiness, or in making a success of their marriage, was only compounded by her increasing need to “be seen and heard”. She knew turning 45 would be hard, but she didn’t realise quite how invisible she would feel. As an introvert, she was grateful for her weekly Johari sessions, and for the emotional support trigger that alerted Ruth when Sara, who is not one to reach out at the best of times, really needed a shoulder to cry on. Ruth had pinged her at a time their linked Johari programmes had flagged as convenient. Although still a little freaked out by how well the system worked, Sara was always grateful for the well-timed calls.
Back to the Future
Back in the kitchen, the conversation had devolved into a full-blown argument, and Sara uttered the dreaded word – “ARBITRATE” – before David could put a stop to it. He hated the damned thing and resented how Sara had insisted they sign up to Johari to “help their marriage”. Who gives a damn that it worked for Ruth and Bob!? They were becoming more Stepford by the day.
With both parties alerted by their wearables – a watch for David, an in-ear device for Sara – that an arbitration was underway, Johari only stopped pinging once they both kept quiet.
Enter Johari
Johari: “As you both probably know from our previous session, the issue you are fighting about is not actually the problem. David, on 20 May, you promised Sara that you would indeed review the trip that Alexa customised based on both your holiday preferences. This trip would ensure you are both accommodated in terms of likes and dislikes. Yes, this trip may be slightly skewed towards Sara this time round, but let’s face it, last year’s skiing trip was more to your liking. Your vitals show that something else is nagging at you. Before I tell you, can you think of what this might be?”
David: “Uhm… no…”
Johari: “Well, from what I can glean from your vitals, irregular sleep patterns, REM data and conversations over the past 6 weeks, you are feeling very disempowered both at work and at home. Your exercise activity has dropped by 30%, your sex drive is the lowest it has been in 3 years, and your IBS is flaring up. You laugh, on average, 50% less than before. You hardly pay anyone compliments, and your screen time has more than doubled. From what I can tell, the trigger can be traced back to 1 May. Do you want to tell Sara what you are feeling and what you think that trigger may be?”
David: “Not exactly. I mean, is nothing private anymore? Bloody hell, I can barely pull my zipper down without anyone listening.”
Sara: “Babe, don’t be so passive aggressive.”
Johari: “Sara, it is not your turn to speak. Please allow David the space to come to grips with his own feelings. We will tackle your issues in a minute.”
Johari: “David?”
Although reluctant at first, David gives in and admits that his demotion on 1 May – communicated via email, of all things! – has left him feeling worthless. At the very least, he never thought a robot would one day replace him as a physiotherapist. Nonetheless, here he was at this point in his career: “human-in-the-looping” for a Boston Robotics physiobot while it manipulates painful intercostal muscle spasms.
Sara, in turn, finally admits that, after hearing Johari’s overwhelming mountain of evidence, her ever-increasing contempt for, and disapproval of, everything David does could (and would!) wear anyone down. They both recommit to weekly sessions and increased tracking for the next 3 weeks.
In accordance with their non-secular preference settings, Johari ends the session with a quote from Alain de Botton’s book, The Course of Love: “I will never be able to do or be everything you want, nor vice versa, but I’d like to think we can be the sort of people who will dare to tell each other who we really are. The alternative is silence and lies, which are the real enemies of love.”
Johari: “It is a good thing I am here to tell you who you really are…”
About the Thought Experiment
When writing this piece, I thought about just how far we are willing to go in allowing AI to direct our lives and relationships. As we forge the future, these decisions truly are up to us. I hear too many people say that they simply don’t care what personal data is being captured; “I have nothing to hide” is one of many flippant responses. We can barely begin to imagine how much agency we are willing to give up, or how far the incremental erosion of our privacy has already carried us past the tipping point.
“The Web as I envisaged it, we have not seen it yet. The future is still so much bigger than the past.”
Tim Berners-Lee, Inventor of the World Wide Web
The expanding ethical debates and ever-growing use cases for AI simply cannot be ignored. But we must stop and think before we build, and before we relinquish our privacy. The future may not be as dystopian as we fear, but only we can direct it.