Purdue University researchers have come up with a system that enables you to virtually hack-proof many of the electronic medical devices you currently have, or may eventually have, embedded in your body.
The capability could not be any more welcome, given that pacemakers, insulin pumps, and other technologies used in-body currently have no protection from cyber intruders.
"Academic research, including some studies by our team, has shown that most of these devices can be hacked in hours," says Bart Preneel, a professor specializing in cybersecurity and privacy at KU Leuven University in Belgium, who is familiar with the Purdue research.
One of the greatest fears regarding the hacking of implanted medical devices is that a hacker might tamper with a device's settings to make it malfunction and ultimately cause death, according to Eduard Marin, a research fellow in the security and privacy research group at the University of Birmingham in the U.K.
Hackers have other incentives to manipulate in-body medical devices as well, Marin says: they can break into under-skin devices to collect highly sensitive patient data, track the movements of a device's wearer, or even conduct independent security tests that expose a device's flaws to the general public.
"For example, in 2016, a team of security researchers called MedSec allegedly investigated the security of several St. Jude Medical implanted heart devices, and concluded that these devices had serious security vulnerabilities," Marin says. "MedSec released a report jointly with the investment firm Muddy Waters, causing St. Jude shares to fall 10%."
Essentially, the Purdue system is designed to discourage these black arts by creating a 'covert body area network' that cannot be snooped on using conventional hacker technology, according to Shreyas Sen, an assistant professor of electrical and computer engineering at Purdue who was on the system's research team.
Key to the tech's innovation is its ability to reduce the reach of electromagnetic waves that computerized devices like pacemakers use to communicate in-body.
A conventional pacemaker, for example, typically generates an electromagnetic wave radius of approximately 10 meters, an easy hack for an accomplished snooper, according to Sen.
Indeed, some medical devices implanted under the skin have an even greater signal range. "Our research into implantable medical devices showed that it is possible for an adversary to wirelessly change the configuration of an implantable cardiac defibrillator from a distance of up to 20 meters," says Flavio D. Garcia, a professor of computer security at the University of Birmingham.
With Purdue's system, the reach of that electromagnetic wave is reduced to approximately one centimeter from the user's skin. To break in, a hacker would need to position signal-detection equipment immediately adjacent to, or in direct physical contact with, the user's skin.
The Purdue researchers were able to greatly reduce the range of the electromagnetic wave communications by using a special transmitter and receiver that can communicate on-body using 100 times less energy than a Bluetooth connection.
In commercial use, which the researchers say will require further development, the transmitter could be incorporated into a smartwatch or similar wearable, and the receiver embedded in a pacemaker or similar implant.
"The transmitter packetizes the data using an onboard microcontroller, encodes it on an electro-quasistatic (EQS) HBC (human body communication) frequency signal, and couples it to the human body with a copper coupler to transmit the EQS-HBC data," says Debayan Das, a graduate research assistant at Purdue who was on the system's research team. "The on-body receiver decodes the transmitted packets and performs error correction to reduce bit errors in the body channel."
In practical use, a doctor could use a smartwatch featuring the Purdue tech, for example, to configure settings of an insulin pump beneath the skin, or access usage data being generated by the insulin pump.
Other uses for the system include more secure wireless communications for doctors and researchers working with bioelectronic medical devices and brain-machine interfaces, according to Das.
While the system is not perfect (indeed, many security researchers and practitioners doubt any system will ever offer perfect security), it has impressed other researchers working in the same space.
"I think the research by Purdue University researchers is very powerful in that it drastically reduces the attack range to just a few millimeters," says the University of Birmingham's Garcia. "In this way, they mitigate most serious, realistic attack scenarios. Requiring direct physical contact to the skin has the additional benefit that it is hard for an attacker to inject messages into the body network without being noticed."
Preneel agrees. "This is fascinating research that shows that communication inside the human body is possible with much less energy and less radiation leakage than in traditional wireless communications and prior solutions for in-body communications. The electro-quasistatic communication method makes it much harder to pick up the signals outside the body. This technology clearly has the potential to improve security and privacy."
On the other hand, patients with pacemakers, insulin pumps, and other in-body devices could still be vulnerable if hackers use more powerful tech to sense signals even very close to the skin, according to Ben Ransford, CEO of Virta Labs, a maker of medical device security software. "Fixing security involves a lot more than changing the communication channel," Ransford says. "An attacker with resources doesn't care about a nominal reduction of communication range; they buy more powerful radios and better antennas. Most people will never run into such a hacker and shouldn't worry. Device makers should focus on basic security, such as patching hygiene and sensible, independently reviewed use of encryption."
Moreover, the Purdue tech does not protect against signal exchange if a hacker finds a way to make direct, physical contact with the skin of the user. "In crowded spaces, one cannot exclude that an attacker makes contact with a target," says Preneel.
Dave Singlee, industrial research manager at KU Leuven University, agrees, adding that a successful hack is relatively easy once physical contact is achieved. "One could expect that the time to perform an attack would be on the order of a few seconds, so the attacker would only need to touch the patient for a very short time."
There is also the problem of unintended signal amplification, which might occur when a user touches metal, a smartphone, or a similar conductive object that could extend the system's signal beyond the one-centimeter range.
As a result, many researchers familiar with Purdue's work stressed that the tech should be used in concert with other layers of security, such as encryption and authentication.
"The proposed solution has some merit and offers some interesting security properties, but still needs to be combined with other security solutions in most practical settings," says Singlee. Garcia agrees: "Security should be built in multiple layers, in case there is a flaw in one of them."
Purdue's Das wholeheartedly agrees the system should be used with multiple layers of security, including encryption. As for problems with signal amplification: "We plan to write a follow-up detailed theoretical paper showing many more similar interactions," along with ways to mitigate such scenarios, Das says.
Despite the limitations, Das says the Purdue research team is convinced the tech has real commercial value. They're currently working to shrink the size of the circuitry needed to run the system so that it can be easily incorporated into everyday electronics, he says.
Joe Dysart is an Internet speaker and business consultant based in Manhattan, NY, USA.