From Covid exposure notification apps to AI designed to identify early warning signs of infection, the technological response to the pandemic has been largely underwhelming.
The reasons include a failure to think through the public health implications of the apps, especially growing public concern about privacy and tech companies' abuse of personal information. In addition, there has been a focus on the potential financial windfall, particularly in AI, which produced a surfeit of vaporware rather than workable solutions. These mistakes have led to the suboptimal performance of ostensible tech 'solutions,' which has exacerbated growing distrust of the industry. Rethinking the marketing approach to new technologies in healthcare will be crucial to gaining trust, market share and profits. JL
Lindsay Muscato reports in MIT Technology Review:
Exposure notification apps were developed at the start of the pandemic, as technologists raced to help slow the spread of covid. The most common system was developed jointly by Google and Apple, and dozens of apps around the world were built using it—MIT Technology Review spent much of 2020 tracking them. The apps, which run on ordinary smartphones and rely on Bluetooth signals to operate, have weathered plenty of criticism over privacy worries and tech glitches. Many in the US have struggled with low numbers of downloads, while the UK recently had the opposite problem as people were deluged with alerts.
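For readers unfamiliar with how these apps detect proximity without tracking location: the Google/Apple system has phones broadcast short-lived random identifiers over Bluetooth and keep a local log of the identifiers they hear; only if someone tests positive are their keys published so other phones can check for matches on the device. The sketch below illustrates that general idea. It is a simplified illustration, not the actual protocol, which uses AES-based key derivation, Bluetooth LE advertising, and signal-strength thresholds that are omitted here.

```python
# Simplified sketch of a decentralized exposure notification scheme.
# Illustrative only; not the actual Google/Apple implementation.
import hashlib
import os

INTERVALS_PER_DAY = 96  # identifiers rotate roughly every 15 minutes


def daily_key() -> bytes:
    """Each phone generates a fresh random key per day; it never leaves the
    device unless the user later reports a positive test."""
    return os.urandom(16)


def rolling_id(day_key: bytes, interval: int) -> bytes:
    """Derive a short-lived identifier to broadcast over Bluetooth.
    Observers can't link identifiers to each other without the day key."""
    return hashlib.sha256(day_key + interval.to_bytes(2, "big")).digest()[:16]


# Phone A broadcasts rolling IDs; phone B stores every ID it hears nearby.
a_key = daily_key()
heard_by_b = {rolling_id(a_key, i) for i in range(INTERVALS_PER_DAY)}

# If A tests positive, A uploads only its day keys. B downloads those keys,
# re-derives the rolling IDs, and checks for overlap locally on the phone.
reported_keys = [a_key]
exposed = any(
    rolling_id(k, i) in heard_by_b
    for k in reported_keys
    for i in range(INTERVALS_PER_DAY)
)
print("Possible exposure detected:", exposed)  # True
```

The point of the design is that no central server ever learns who was near whom; the matching happens entirely on the user's phone, which is why the apps collect no location data by default.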
Now we’re looking back at how this technology rolled out, especially because it might offer lessons for the next phase of pandemic tech.
Susan Landau, a Tufts University professor in cybersecurity and computer science, is the author of People Count, a book on how and why contact tracing apps were built. She also published an essay in Science last week arguing that new technology to support public health should be thoroughly vetted for ways that it might add to unfairness and inequities already embedded in society.
“The pandemic will not be the last humans face,” Landau writes, calling for societies to “use and build tools and supporting health care policy” that will protect people’s rights, health, and safety and enable greater health-care equity.
This interview has been condensed and edited for clarity.
What have we learned since the rollout of covid apps, especially about how they could have worked differently or better?
The technologists who worked on the apps were really careful about making sure to talk to epidemiologists. What they probably didn’t think about enough was: These apps are going to change who gets notified about being potentially exposed to covid. They are going to change the delivery of [public health] services. That’s the conversation that didn’t happen.
For example, if I received an exposure notification last year, I would call my doctor, who’d say, “I want you to get tested for covid.” Maybe I would isolate myself in my bedroom, and my husband would bring me food. Maybe I wouldn’t go to the supermarket. But other than that, not much would change for me. I don’t drive a bus. I’m not a food service worker. For those people, getting an exposure notification is really different. You need to have social services to help support them, which is something public health knows about.
In Switzerland, if you get an exposure notification, and if the state says “Yeah, you need to quarantine,” they will ask, “What’s your job? Can you work from home?” And if you say no, the state will come in with some financial support to stay home. That’s putting in social infrastructure to support the exposure notification. Most places did not—the US, for example.
Epidemiologists study how disease spreads. Public health [experts] look at how we take care of people, and they have a different role.
Are there other ways that the apps could have been designed differently? What would have made them more useful?
I think there’s certainly an argument for having 10% of the apps actually collect location, to be used only for medical purposes to understand the spread of the disease. When I talked to epidemiologists back in May and June 2020, they would say, “But if I can’t tell where it’s spreading, I’m losing what I need to know.” That’s a governance issue by Google and Apple.
There’s also the issue of how efficacious this is. That ties back in with the equity issue. I live in a somewhat rural area, and the closest house to me is several hundred feet away. I’m not going to get a Bluetooth signal from somebody else’s phone that results in an exposure notification. If my bedroom happens to be right against the bedroom of the apartment next door, I could get a whole bunch of exposure notifications if the person next door is ill—the signal can go through wood walls.
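To see why a fixed Bluetooth signal-strength cutoff struggles with walls and open spaces, consider a standard log-distance path-loss model. The constants below are illustrative assumptions, not values calibrated from any real exposure notification app.

```python
# Illustrative path-loss model showing why Bluetooth signal strength is an
# imperfect proxy for meaningful contact. Constants are assumptions.
import math

TX_POWER_DBM = -59        # assumed received signal strength at 1 metre
PATH_LOSS_EXPONENT = 2.0  # free-space propagation; real environments vary


def rssi_at(distance_m: float, obstruction_db: float = 0.0) -> float:
    """Predicted signal strength at a given distance, minus any extra loss
    from obstructions such as an interior wall."""
    return (
        TX_POWER_DBM
        - 10 * PATH_LOSS_EXPONENT * math.log10(distance_m)
        - obstruction_db
    )


# A neighbour 2 m away through a wood wall can register a stronger signal
# than a person 4 m away across an open room, so a single threshold can
# produce both false alarms and missed contacts.
print(rssi_at(2.0, obstruction_db=3.0))  # about -68 dBm
print(rssi_at(4.0))                      # about -71 dBm
```

Under these assumed numbers, the neighbour on the other side of a bedroom wall looks "closer" to the app than someone sharing open air a few metres away, which is exactly the kind of misfire Landau describes.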
Why did privacy become so important to the designers of contact tracing apps?
Where you’ve been is really revelatory because it shows things like who you’ve been sleeping with, or whether you stop at the bar after work. It shows whether you go to the church on Thursdays at seven but you don’t ever go to the church any other time, and it turns out Alcoholics Anonymous meets at the church then. For human rights workers and journalists, it’s obvious that tracking who they’ve been with is very dangerous, because it exposes their sources. But even for the rest of us, who you spend time with—the proximity of people—is a very private thing.
Other countries use a protocol that includes more location tracking—Singapore, for example.
Singapore said, “We’re not going to use your data for other things.” Then they changed it, and they’re using it for law enforcement purposes. And the app, which started out as voluntary, is now needed to get into office buildings, schools, and so on. There is no choice but for the government to know who you’re spending time with.
I’m curious about your thoughts on some bigger lessons for building public technology in a crisis.
I work in cybersecurity, and in that field it took us a really long time to understand that there’s a user at the other end, and the user is not an engineer sitting at Sun Microsystems or Google in the security group. It’s your uncle. It’s your kid sister. And you want to have people who understand how people use things. But it’s not something that engineers are trained to do—it’s something that the public health people or the social scientists do, and those people have to be an integral part of the solution.
I want a public health person to say to me, “This population is going to react to the app this way.” For example, the Cambodian population that’s in the United States—many of them were traumatized by government. They’re going to respond one way. The immigrant population that comes from India may respond in a different way. In my book, I talk about the Apache reservation in eastern Arizona, which took into account the social factor. It’s a public health measure—not a contact tracing measure—to ask about someone’s other set of grandparents.
Digital vaccine apps and credentials are now rolling out in a wide array of states and countries, and being required by private entities. For those to work, who should be in the room as they’re designed?
You want the technologists who have thought about identity management and people who think about privacy. How do you reveal one piece of information without revealing everything else?
And you want to get people who really appreciate the privacy issues of disease. What jumps to mind is the epidemiologists and contact tracers who worked with AIDS, which was really an explosive issue back in the 1980s. You want them because they understand public health, and they really understand the importance of the privacy issue. They get it in their gut.
It’s getting smart people from both sides in the room. They have to be smart, because it’s hard to understand somebody else’s language. And both groups have to understand what the other is saying, but they also have to be confident enough that they’re willing to ask lots of questions. It’s the really understanding that’s hard.
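A concrete way to picture the question Landau raises about digital vaccine credentials, revealing one piece of information without revealing everything else, is selective disclosure: the issuer commits to each field separately, and the holder reveals only the field needed at the door. The sketch below is a toy illustration of that idea under simplified assumptions; real credential systems add digital signatures and stronger cryptographic blinding.

```python
# Toy sketch of selective disclosure for a credential: prove one field
# (vaccination status) without revealing the rest (name, birth date).
# Illustrative only; signing of the digests is omitted.
import hashlib
import os


def commit(field: str, value: str, salt: bytes) -> str:
    """Hash each field with a random salt so only digests need to be signed."""
    return hashlib.sha256(salt + f"{field}={value}".encode()).hexdigest()


# Issuer: salt and hash every field, then sign the list of digests.
fields = {"name": "A. Example", "birth_date": "1980-01-01", "vaccinated": "yes"}
salts = {k: os.urandom(16) for k in fields}
signed_digests = sorted(commit(k, v, salts[k]) for k, v in fields.items())

# Holder: to enter a venue, reveal only the 'vaccinated' field and its salt.
disclosure = {"field": "vaccinated", "value": "yes", "salt": salts["vaccinated"].hex()}

# Verifier: recompute the digest and check it appears among the signed ones.
recomputed = commit(
    disclosure["field"], disclosure["value"], bytes.fromhex(disclosure["salt"])
)
print("Disclosure verifies:", recomputed in signed_digests)  # True; other fields stay hidden
```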