ID professor Laura Forlano comments on the use of algorithms in medical devices
By Kristin Gecan
In “The Danger of Intimate Algorithms,” ID Associate Professor Laura Forlano recounts her own experience with sleep deprivation while using Medtronic’s MiniMed 670G, an automated insulin pump and sensor system. She argues that this situation is made possible by a lack of regulatory checks and balances.
“Unlike drug makers,” Forlano writes, “companies that make medical devices are not required to conduct clinical trials in order to evaluate the side effects of these devices prior to marketing and selling them.”
At Public Books, she calls for change:
As companies design the next generation of “smart” medical devices, the government must require them to more seriously consider the social, cultural, and psychological impacts of their inventions as potential risks. In this case, there is no point in fixing the body at the expense of degrading the mind.
And, metaphorically, if you cannot sleep, you cannot dream. If we are to reimagine our algorithmic systems as responsible innovations that serve to support liberatory and just societies, we must have the capacity to dream.
Read on to find three suggestions she makes to promote agency and equity as algorithmic systems become ever more embedded in our society.
“The relevance of documenting injustice and dehumanization at the intersection of artificial intelligence, design, disability, and health care,” Forlano later adds, “is ever more urgent if we are to craft more equitable futures.”