Prescribing Exploitation

Charlotte A. Tschider

Patients are increasingly reliant, temporarily if not indefinitely, on connected medical devices and wearables, many of which use artificial intelligence (“AI”) infrastructures and physical housing that directly interacts with the human body. The automated systems that drive the infrastructures of medical devices and wearables, especially those using complex AI, often rely on dynamically inscrutable algorithms that may produce discriminatory effects, altering paths of treatment and other aspects of patient welfare.

Previous contributions to the literature, however, have not explored how AI technologies enable the exploitation of medical technology users. Although all commercial relationships may exploit users to some degree, some forms of health data exploitation exceed the bounds of normative acceptability. The factors that signal excessive exploitation warranting legal intervention include: (1) existence of a fiduciary relationship or approximation of such a relationship, (2) a technology-user relationship that does not involve the expertise of the fiduciary, (3) existence of a critical health event or health status requiring use of a medical device, (4) ubiquitous sensitive data collection essential to AI functionality, (5) lack of reasonably similar analog technology alternatives, and (6) compulsory reliance on a medical device.

This Article makes three key contributions to the existing literature. First, this Article establishes the existence of a type of exploitation that is not only exacerbated by technology but creates additional risk through its ongoing use. Second, this Article illustrates the need for cross-disciplinary engagement between privacy scholarship and AI ethics scholarship, both of which could balance data collection for fairness and safety with other individual interests. Third, this Article illustrates how a modern information fiduciary model could neutralize patient exploitation risk when exploitation exceeds the normative bounds of community acceptability.

