‘The Bells of St John’ presents technology that can upload a human mind into a data cloud. Given humanity’s constant quest for immortality this seems like a good solution, aside from its lethal effect on its subjects.
The technology is still intact at the end of the episode and, while it might be confiscated by UNIT, there is nothing to prevent someone from replicating its creation or even reverse engineering the existing devices.
Player characters may very well encounter similar devices in the future, where they could have a huge impact on society. This can lead to some very interesting adventures that explore our increasingly close relationship with computers.
The drawback of the technology, that it kills whoever it uploads, ceases to be relevant if the subject is going to die anyway. It is easy to imagine that those who are critically ill or elderly might prefer to be uploaded instead of ceasing to exist altogether.
This could eliminate the need or desire to create artificial intelligence. Why develop a computer that can think when it can be operated by a living mind, especially as this story shows that extra skills can be spliced into the intelligence?
Initially these digitised intelligences would be used for high-priority systems, but as the number of people being digitised increased everyone could have one of their own. Families could have their computerised homes overseen by their grandparents, companies could have their best employees continue to serve long after they’ve died, and the police could have crimes solved by tireless digitised detectives.
Such virtual people would be ideal crew for space missions, since they reduce the mass on board the ship, don’t need life support and can endure long mission times (or simply be switched off until needed). Once at their destination they could explore, survey and begin colonisation using robot drones.
With the development of robotic bodies the digitised intelligences could once again physically interact with the world. The question would be who owns the robot shells. Would the digitised be allowed assets or currency to purchase their own bodies, or would the shells belong to corporations that require the intelligences to act as servants or slaves?
This could eventually lead to a divided society, consisting of the living and the digitised. The digitised might initially be seen as not real people, merely computer programs that only think they are people, but as their numbers increased they would gain a voice that couldn’t be ignored.
While the digitised might at first be looked down upon, over time the advantages of their immortality would become apparent. Perceptions could shift so that the living, with their short lifespans and their constant drain on limited resources, are seen as obsolete.
Adventures could centre on how this relationship develops. Would there be uprisings? Would it lead to a war between the living and the dead? Could a way for peaceful co-existence be found?
Another issue to explore is the extent to which a personality can be altered and edited. In this episode not only do we see Clara gifted with knowledge of the internet, but we learn that characteristics such as obedience, paranoia and conscience can be increased or decreased.
A digitised intelligence could be changed beyond recognition. A good person could do terrible things if their conscience were decreased, and an evil person could do great things if their obedience and conscience were increased.
If the digitised mind is considered to be a person, such alterations would be tantamount to brainwashing. What right would society have to make these changes, even if the digital intelligence becomes more useful?
How would an intelligence react if they found out that their mind and memories had been customised to suit the needs of others? If the player characters had a relationship with a digital person, how would their opinion change if they learnt that the person had been terrible in life, prior to their personality change?
Things become even more complicated if the technology is perfected so that it no longer kills the subject. This would remove the final barrier preventing everyone from having themselves uploaded to a digital cloud.
People could have virtual copies of themselves to act as companions, personal assistants or slaves. Copies could be beamed around the world, or sent to other planets to live separate lives without the original having to give up their own.
This can lead to adventures exploring the nature of identity. What would another copy of you do in different situations? What if your copy committed a crime? Should you be considered guilty for their actions?
Our experiences change us, and so the original person and their digital copy could become vastly different over time. Who would be the ‘better’ person? Would one become jealous of the other and try to take what they have?
Humans might eventually develop their own Matrix, a huge data cloud containing the minds of those who have lived. Coupled with time travel, this technology could upload everyone who has ever lived, allowing those in the present to consult with anyone from history.
The player characters might get caught up in this ambitious project. They might have their own intelligences uploaded and need to find a way out of their virtual prison. If the project is completed, they might need to consult with the Human Matrix themselves.
Moving beyond humanity, what if other races were to create or obtain similar technology? How might the Daleks, Cybermen, Draconians, Sontarans and many other species change and develop if they were able to upload their intelligences and become immortal?
If this became widespread, living species might vanish from the universe, replaced by an entirely digital population. What would happen then if the technology began to fail?