Cognify, the ‘prison of the future’
Courtesy of Hashem Al-Ghaili

Inside Cognify, the ‘prison of the future’ where AI rewires your brain

A new prison concept proposes bombarding criminals with artificial memories from the perspective of their victims – what could go wrong?


Earlier this week, a short film was shared on Instagram showing what the “prison of the future” will apparently look like, and most people didn’t like what they saw. In case you missed it: the digitally-rendered ‘dystopia’ involves sending offenders into Matrix-style pods in a gleaming science facility, where they’re fitted with a headset – the Cognify device – and fed a stream of AI-generated content. The aim? “[To] create and implant artificial memories directly into the prisoner’s brain.”

If a person was a violent offender, for example, they could be forced to watch their crime from the victim’s perspective. Drug-related crimes might be punished with fake memories that “simulate the struggles of addiction and recovery”. At the same time, emotional states like remorse and regret would be triggered by tweaking neurotransmitters and hormones in real time.

As explained in the video, prisoners would be able to choose Cognify as an alternative to a traditional prison sentence. Why would they ever agree to that, you ask? Well, treatment could potentially take place in just a few minutes, and then – like A Clockwork Orange droog Alex DeLarge – they would be free to reenter society, avoiding years or even decades behind bars. Yes, they might think they’ve spent years inside a carceral hellscape of ultraviolence tailored specifically to their personality and psychological state, but they won’t have aged a day!

All of this might sound like science fiction, and it is speculative, but it’s actually based on real science. Scientists have already successfully implanted false memories in mice, and have learned how to change their fearful memories into happy ones. In 2018, meanwhile, researchers transferred a memory from one marine snail to another, a year after they figured out how to code a film clip into the DNA of the gut bacteria E. coli. (What a world we’re living in.) This research, alongside new AI developments, like OpenAI’s text-to-video model Sora, forms the technical basis for Cognify.

The actual idea behind Cognify came from Hashem Al-Ghaili, a Berlin-based filmmaker and science communicator. In part, he tells Dazed, the system was inspired by the “limitations of the current criminal justice system” – which are plain to see. Specifically, Al-Ghaili cites problems such as false imprisonment, overcrowding, and ineffective rehabilitation. All of these would be addressed by his AI-powered, fast-track prison replacement, he says, which would aim to “provid[e] a more effective path to reformation and societal reintegration”.

Nevertheless, something tells us that Angela Davis won’t be rushing out to buy shares in a Cognify startup any time soon. The process itself would begin with a high-resolution brain scan that would be used to create a map of the prisoner’s brain, which would tell Cognify where to ‘inject’ the artificial memories and how to tailor them to that brain’s specific structure. These fake memories would be used to influence the prisoner’s future behaviour, which seems intrusive enough, but in the meantime the prisoner would also be handing over their brain data to a “central computer for scientific research” to help “understand the criminal mind” and “determine the best approach” to tackle future crimes. Does this sound like the beginnings of a fascist technocracy? A bit. But sorry, if you don’t like it you should have read the small print!

We’ve already talked about the emerging issue of neurorights – and how they’re threatened by novel technologies – in this article on brain uploading, but Cognify raises a whole new host of these ethical questions. “There are concerns about consent, privacy, and the potential for unintended psychological consequences from altering memories, even if they are artificial,” Al-Ghaili admits. “Implanting artificial memories also raises questions about the authenticity of self.”

“There are concerns about consent, privacy, and the potential for unintended psychological consequences from altering memories, even if they are artificial” – Hashem Al-Ghaili

Then, there’s the question of how to reintegrate someone into society with just a few minutes of treatment but years of ‘memories’ to show for it. “Family members of the subject could be provided with a comprehensive report on the new artificial memories,” suggests a Cognify press release, but the cognitive dissonance would be crazy. And this is if everything goes right. Needless to say, extensive research would also be needed to make sure the system is “error-free”.

All of these doubts, and more, are on display in Al-Ghaili’s comment sections on Instagram and X. Critics compare the system to dystopian fiction like Black Mirror, as well as historical practices like brainwashing or lobotomy. Al-Ghaili disagrees. “I fully endorse its existence,” he says of the concept, arguing that it would help reduce costs, get offenders back in society faster, reduce reoffending rates, and create “safer communities for all”.

“I don’t think it makes sense to base the outcomes of any technology on dystopian plots,” he adds. “Sure, we need to be cautious, but we also need to give technology a chance to make our lives better.” And he finds comparisons to the grim procedures of the past particularly “absurd”. “Brainwashing using Cognify is very unlikely to happen if such technology is fully protected from falling into the wrong hands.”

That’s a very big “if”, though. We’ve already seen how some of the most developed nations on Earth abuse their power and ignore human rights (thanks, WikiLeaks). Why should we expect them to act any differently when given the opportunity to fiddle with prisoners’ brains? Could security services really resist popping on a headset and violating a criminal’s neurorights to get some juicy information out of them in an interrogation room? Could they resist implanting a purpose-built memory to get a confession?

“If we are overly concerned about every technology we want to create, we will create nothing,” Al-Ghaili says. “Every technology carries risks, but with strict ethical standards, legal frameworks, and oversight, we can overcome these risks.” Unfortunately, it’s still unclear how we’d enforce the necessary oversight and restrictions when the tech is supposedly in the hands of the good guys. Do we really trust the people who make the rules with the keys to our memories and emotions? If you answered yes, you might already have been Cognified.
