Connectivity Issues

The internet of bodies—and one alumna’s race to regulate it

Juliano Pinto, a man paralyzed from the waist down, became part of history when he performed a symbolic opening kick at the 2014 World Cup. Clad in a full-body robotic exoskeleton and a headset that monitored his brain’s electrical activity, the 29-year-old accomplished a feat once considered science fiction: he moved his leg simply by thinking about it.

Pinto is not alone—millions of human bodies now rely on software and the internet for some aspect of their functionality. As technology promises to change what it means to be human, there’s boundless hope that new breakthroughs can improve and enhance lives.

But for SESP alumna Andrea Matwyshyn (PhD05), a professor in the law and engineering schools at Pennsylvania State University, the body-tech merger—also called the internet of bodies (IoB)—raises important concerns about privacy, security, fraud, and human thriving.

The ability to damage the confidentiality, integrity, and availability of the software (and data) that connects devices to the human body, she says, creates both regulatory and ethical red flags that need to be addressed before people’s personal autonomy and safety are compromised. The legal and policy effort to safeguard our bodies has become a cornerstone of Matwyshyn’s work, which blends computer security, innovation, and the law. Though her four degrees from Northwestern give her expertise in multiple disciplines, including law, she routinely draws on her doctoral training in SESP’s Human Development and Social Policy program.

“Her work is truly interdisciplinary,” says Stephanie Pell, a fellow at the Brookings Institution who has collaborated on cybersecurity policy research with Matwyshyn. “Andrea is extraordinarily capable of taking theoretical concepts and creating public policy where an outside-the-box solution is necessary.”

In 2017, Matwyshyn first described the IoB as a “network of human bodies whose integrity and functionality rely at least in part on the internet and related technologies.” Referencing the “internet of things”—the practice of connecting household and industrial devices to the internet—she warned that the same security flaws that have plagued IoT products will affect IoB devices, causing physical harm to bodies, and that the law isn’t ready to grapple with these issues when they arise.

The potential problems go beyond legal challenges. “As bits and bodies meld and as human flesh becomes permanently entwined with hardware, software, and algorithms, IoB will test our norms and values,” Matwyshyn wrote.

Her latest work blends IoB issues with other strands of her scholarship, such as her work on internet “fakery” and disinformation. As the world becomes “technologically messier,” she’s looking to the past for cautionary tales of pseudoscience. The history of body-sensing technologies and predictive analytics gone awry, she says, warns us against overtrusting sensors and their data classifications. 

For example, during the Salem witch trials of the late 1600s, women were often subjected to various forms of sensory data gathering, such as involuntary physical exams to find moles on their bodies as “scientific proof” of sorcery. Such “data” was prioritized and believed over the personal testimony of the accused. In some cases, the accused themselves became convinced of their own guilt because of this “proof.”

Matwyshyn connects the present with the past by arguing that “the act of blindly trusting flawed sensors and manipulable data has implications for democracy, social trust, bodily safety, and personal privacy. It can also negatively affect mental health and self-expression and meaningfully limit a human’s economic opportunities. I’m worried we’ll put ourselves in a situation where the perceived legitimacy of the data streams from flawed devices will be believed over the word and experienced sensations of the human beings connected to those devices.”


In her landmark article, Matwyshyn divides IoB into three generations. First-generation devices are on the outside of the body and can be something as unassuming as the smartphone you keep in your back pocket or a fitness-tracking watch that monitors your steps. Second-generation devices are inside the body and include things like pacemakers, artificial pancreases, and digital pills that rely on software to operate, as well as nonmedical objects like chips with cryptocurrency wallets that people inject under their skin.

Third-generation IoB devices involve hardware embedded inside the brain, such as brain-computer interfaces that allow people to interact with external computers through their thoughts. Some of these devices are already in clinical trials in the private sector, and plans for their nonmedical uses worry Matwyshyn.

Imagine having a chip embedded in your brain and needing only to think of a search query for results to appear before your eyes. Maybe you catch yourself humming a familiar song and decide to stream it directly to your brain. It may seem convenient, but Matwyshyn says there’s a catch.

“As they’re envisioned, these third-generation IoB devices have the capability to both read and write to your brain. The information they will be generating is of very high value to everyone from future employers and marketers to insurers and, of course, malicious attackers. And because these devices may push personalized content into your brain, you may risk losing track of which information and ideas are really generated by you and which are someone else’s.”

But even with first- and second-generation devices, the data our bodies generate is unlikely to stay with us because of information-licensing business models in the technology ecosystem.

“It gets pushed out and merged with other information, and it gets repackaged and resold,” Matwyshyn says. “Suddenly you end up with a bundle of information that may or may not be accurate attached to you. You’re then forced to interact with the consequences, which could impact safety, employment, government interactions, or credit opportunities.”

In fact, insurers are already collecting data from medical devices. In some cases, people have been denied coverage for machines to treat sleep apnea if the device doesn’t consistently “phone home” to the insurer that it’s in use—even when patients and doctors explain that the problem is lack of reliable internet access, not disuse.


At Penn State, Matwyshyn also serves as the associate dean of innovation at the university’s law school and the founding faculty director of both the Penn State Policy Innovation Lab of Tomorrow, which focuses on interdisciplinary technology policy, and the Anuncia Donecia Songsong Manglona Lab for Gender and Economic Equity, a technology equity research lab and legal clinic. And she is a senior special adviser on law, technology, and the digital economy for the Federal Trade Commission’s Bureau of Consumer Protection.

As part of the computer-security community, Matwyshyn regularly speaks at conferences. The field moves at the speed of attackers—outpacing conference proceedings and journal articles—so she stays current by engaging with builders, breakers, regulators, and users on an ongoing basis.

What makes Matwyshyn stand out is her “direct, consistent involvement within the computer-security industry,” says Mark Stanislav, vice president of product security at FullStory, who has collaborated with Matwyshyn on federal public policy work. “It’s exceptionally different from most academics—let alone those in the legal profession.”

As she describes it, her work is ultimately about the arms race between two groups of “hackers”—those who are building new technologies and systems and those who attack and compromise them. Given the challenges to autonomy, thriving, and democratic processes that often arise from new technologies, the line between the two groups can easily become blurred.

“Every technology can be repurposed for problematic uses,” she says. “The goal is to create buffers of policy and law that discourage humans from behaving in ways that do harm while at the same time encouraging positive outcomes. The same knife that is used in a kitchen to make a brilliant salad can be used to hurt people, too.”

--By Maria Gardner