Humans vs Machines: How We Will Be Treated in the Future

Based on research by Cecilia Price and Leo Petersen-Khmelnitski

Machine intervention into the human body in the name of health is an inevitability. Robotic surgery, AI-assisted clinical decision support tools, and tissue engineering already exist. The question that remains is how this trend will progress and what the interaction between machines and humans will look like in the future.

How do humans interact with machines in healthcare?

Crucial to this conversation is the perception of machine intervention into the human body and mind. Though actor-network theory and other schools of thought have studied computer-human interaction for decades, popular perceptions of this interaction are still defined by Asimov’s Three Laws of Robotics, which seem particularly relevant to healthcare because they speak of harm.

In the broadest sense, human–machine interaction (HMI) refers to the communication and interaction between a human and a machine via a user interface. In healthcare, HMI is about how clinicians and patients interact with technology.

The use of machines can be found in every part of medical treatment. HMI technology – from a tiny pulse detector resting on your finger to sophisticated AI in clinical decision support – constitutes a critical component of patient assessment, monitoring, and treatment delivery. Both patients and clinicians benefit from HMI information, and HMI systems improve communication in medical settings: with their assistance, medical professionals can communicate faster and more efficiently to ensure patients receive the care they need.

Difference in mistakes

On one hand, some anticipate that machines will be better than – and even overtake or replace – human doctors. Machines are widely considered to be more precise and more efficient; they do not get tired and, most importantly, they do not make mistakes.

Much of this has proven to be true, for example in radiology and recognition-based diagnostics. Yet the well-known problem of AI bias shows that machines do make mistakes; those mistakes are simply different from the ones made by clinicians.

While human errors in clinicians’ judgements and operations can often be attributed to lapses of attention caused by various external and internal distractions, errors in health-related software are in most cases the result of incorrect or biased input data, or of bias unintentionally built into its analysis, as the sketch below illustrates.
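To make that contrast concrete, here is a minimal, hypothetical sketch in Python (not taken from the research above): a naive screening rule is calibrated on data dominated by one patient group, and the resulting threshold systematically over-flags an under-represented group. The groups, marker values, and threshold rule are all invented for illustration.

```python
# Illustrative sketch of bias from skewed training data (all numbers invented).
import random

random.seed(42)

def simulate_patients(n, mean_marker):
    """Generate synthetic 'disease marker' readings for a patient group."""
    return [random.gauss(mean_marker, 1.0) for _ in range(n)]

# Group A dominates the training data; group B has a higher baseline for the
# same marker (a hypothetical physiological difference) and is under-represented.
group_a_train = simulate_patients(1000, mean_marker=0.0)
group_b_train = simulate_patients(50, mean_marker=0.8)

# A naive 'model': flag anyone above the 95th percentile of the pooled training data.
training_data = sorted(group_a_train + group_b_train)
threshold = training_data[int(0.95 * len(training_data))]

def false_positive_rate(healthy_readings):
    """Share of healthy patients wrongly flagged by the fixed threshold."""
    flagged = sum(1 for x in healthy_readings if x > threshold)
    return flagged / len(healthy_readings)

# Evaluate on fresh, healthy patients from each group.
print("False-positive rate, group A:", false_positive_rate(simulate_patients(1000, 0.0)))
print("False-positive rate, group B:", false_positive_rate(simulate_patients(1000, 0.8)))
```

Unlike a tired clinician, the software repeats exactly the same skewed decision on every run: group B is flagged far more often, not because the machine lost attention, but because the bias was baked in by the data it was calibrated on.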

Furthermore, whilst medical robots excel at repetitive tasks, they are currently less suited to situations where it is necessary to deviate from the rules to achieve a unique result. Nowadays, most digital health tools are accessories to human doctors, be it clinical decision-making software or smart pills with built-in cameras.

Future human-machine interactions: what to expect?

Machines will evolve to be an integral part of every healthcare-related process, from early diagnostics to end-of-life care. This evolution demands that human-machine interactions also play an ever larger role in the delivery of healthcare, as well as in prevention and in keeping people well.

Currently, three main trends may be identified for HMI in healthcare:

  1. The growth in the number of smart devices and related apps
  2. New forms of interaction with those devices
  3. New forms of AI employed to perform very specific tasks.

Now, what can we expect in the future? Let’s have a look at what has already happened, what is taking place now, and what is expected in human-machine interactions.

1960s – Keyboard & Mouse: enabled text input, EHR database management, and manual processing of medical images on mainframes.

1980s – Desktop Computers and Graphical UI: launched the digitisation of work by healthcare professionals and medical research institutions.

2014 – Mobile Computing: enabled full-scale health networking, telehealth, and remote treatment.

2018 – Conversational Interfaces: enabled human-AI interactions in healthcare for diagnostics and analytics.

2020s – Virtual and Augmented Reality: may be used at all stages of healthcare HMI, from early prevention to palliative care.

2030s – Brain-Computer Interface: may allow sending thoughts and sensations, serve as a basis for neural healthcare, and enable full-scale testing on virtual copies of patients.

The last two HMI formats (VR/AR and the brain-computer interface) may lead to the implementation of technologies that today seem rather improbable, due to a number of significant uncertainties. Some are presented in a Gartner review and in a research paper, ‘Bridging Present and Future of Brain-Computer Interfaces: An Assessment of Impacts’, by Dr Gabriel Velloso. In listing the technologies below, we do not claim they are close to implementation; most are still at the stage of conceptual development.

Fully immersive technologies will allow information to be exchanged instantly with machines and other people. Mind control will allow us to share ideas, feelings, and memories with friends in ways that are unimaginable today.

Human perception could be improved through brain-machine interfaces. With our brains coupled to computers, artificially intelligent assistants, and the Internet, we may soon not only have instant access to the world's information, but also be able to download information into our brains or even merge them with super-intelligent AI systems. A wide range of ethical issues needs to be addressed in this area.

It is almost certain that invasive technologies will disrupt the healthcare industry, as they enable recovery from many conditions that are currently incurable: severe depression, PTSD and other mental illnesses, and Parkinson's disease. Additionally, drug release might change completely, becoming more neurology- and data-driven as brain-machine interfaces become available.

By connecting our minds to computers, we could store any type of data transmitted by neurons, such as thoughts, memories, or feelings. In the future it may be possible to create a digital copy of ourselves and store it, one day producing an immortal digital self. But a virtual twin is not only about digital immortality: doctors will be able to test treatments on your digital twin before testing them on your offline version... on the physical you, that is.

Further reading

Research Papers on Human Machine Interaction

Harmonising Man and Machine

A journey into human/machine interactions in healthcare