How Does Technology Address Our Emotional Needs in Modern Society?

How can we interpret the rise of conversational systems simulating the dead? How should we understand this phenomenon, assess its potential risks, and determine its legal boundaries?
During my doctoral studies, I interrogated the emergence and adoption of postmortem avatars designed to replicate deceased relatives. In the process, I introduced the term ghostbots to describe these systems, a term that, I believe, sits comfortably alongside the many others already circulating in the literature (deadbots, griefbots, thanabots, and so on).
My doctoral research makes three key contributions. First, it explains the emergence of ghostbots from a technocultural perspective. Second, it identifies and categorizes the harms this type of technology poses to its users. Third, it offers a critical examination of current legal frameworks. I argue that, although existing data protection and AI regulatory models might offer limited oversight, they fall short of effectively addressing the substantial challenges associated with ghostbots. In response, I propose an expanded legal assessment and the development of alternative mitigation strategies. Ultimately, my thesis demonstrates that the digital has material consequences for end users, and that even the most ambitious legal paradigms may prove too porous to govern the simpler, yet equally intriguing, forms of AI adoption and engagement.
Throughout this scholarly journey, I have been profoundly fortunate to receive the thoughtful guidance of Edina Harbinja (University of Birmingham) and Benjamin Farrand (Newcastle University), whose insights continually pushed and tested my ideas. Ben’s critical framing of the interplay between law, politics, and the regulation of emerging technologies, along with Edina’s pioneering, world-leading research in postmortem privacy, has been instrumental in shaping my work.
Reflecting on my doctoral experience, I never intended my work to evolve into a narrowly defined research path. This is evident in the way I now discuss the ‘ghostbot’ topic: as more of a case study than a lifelong vocation. I don’t really see myself doing ghostbot research in the future, and I view this not as a limitation but as a natural evolution of academic interests. I’ve always argued that a PhD isn’t necessarily about becoming an expert on a static topic. After all, the intellectual landscape is ever-changing, and claiming absolute mastery is, frankly, pretentious. Instead, the doctoral experience is about acquiring new skills and refining existing ones to interrogate social realities and conduct meaningful research (in my case, the legal analysis of digital technologies). For me, the PhD represented this transformative phase: a period of growth that enabled my transition from a public policy officer and early judiciary practitioner into a full-time academic. Naturally, the brilliant minds I’ve met along the way and the conversations I’ve shared with them have been, by far, the most rewarding part of the journey. That goes without saying.
I remain deeply grateful to Lilian Edwards (my first supervisor until her retirement), as well as to Ben, Edina, Newcastle Law School, the Alan Turing Institute, and the AHRC, all of whom played vital roles in this process.