Using XR and wearable AI to help people in custody build the skills and identity needed for life after release, with the aim of reducing reoffending. The project draws on social identity theory and takes a participatory "nothing about us, without us" approach. We are currently interviewing frontline workers from prison services, charities, and third-sector organisations, and analysing the data using reflexive thematic analysis. A public engagement event to share findings, gather feedback, and map the problem space will follow shortly. In collaboration with Middlesex University and Bournemouth University.
As AI-generated content becomes hyper-personalised and immersive displays become ubiquitous, the potential for deception and manipulation in XR environments grows substantially. This project investigates how people can be safeguarded against these harms. Following an initial round of interviews, Sebastian Vowles was recruited and is now leading this line of inquiry as part of his PhD, funded through the John Anderson Research Studentship Scheme (JARSS) at Strathclyde. The project is actively collaborating with partners at Universitatea "Ștefan cel Mare" Suceava and KTH Royal Institute of Technology.
Can theatre help the public recognise and resist online manipulation? This project used devised performance as a research method, combining computer science, cybersecurity, AI, and psychology. I collaborated with professional theatre-makers to devise a play, then went on stage and performed it with them at the Science Gallery London in front of an audience of 66 members of the public. The performance was evaluated through quantitative and qualitative studies. Led by King's College London, in collaboration with the University of Nottingham, University of Liverpool, and Middlesex University.
Dark patterns are deceptive interface designs that manipulate users against their own interests. In XR environments — where perception, space, and sensing interact in novel ways — these patterns can become both more potent and harder to detect. This project, funded by a 2022 Meta Research Award, investigated how dark patterns manifest in augmented and virtual reality and developed a framework for identifying and mitigating them. Led by Mohamed Khamis at the University of Glasgow.
A two-year post-doctoral fellowship funded by the College of Science and Engineering at the University of Glasgow, investigating the application of generative AI in metaversal environments and its implications for users. The work produced multiple publications in top HCI venues including ACM CSCW, NordiCHI, IMWUT, and TOCHI, and directly seeded the subsequent JARSS-funded PhD project at Strathclyde.
A summer internship project at the University of Glasgow, in collaboration with BBC R&D, investigating how television viewers perceive and trust AI-generated companion content. Findings were presented at BBC R&D and at the University of Glasgow.
A summer internship project at the University of Glasgow investigating how extended reality technologies could be exploited to disinform television audiences — exploring the intersection of immersive media, misinformation, and viewer trust.