Now that computerization runs every device, whoever controls the computers controls the machines, as well as their use. The question these control issues raise is whether consumers are being overcharged for what amounts to a partial purchase, or even a lease, rather than an actual purchase in the historical sense. JL
Cory Doctorow comments on the Electronic Frontier Foundation's blog:
As networked computers disappear into our bodies, working their way into hearing aids, pacemakers, and prostheses, information security has never been more urgent -- or personal. A networked body needs its computers to work well, and fail even better.
Graceful failure is the design goal of all critical systems. Nothing will ever work perfectly, so when things go wrong, you want to be sure that the damage is contained, and that the public has a chance to learn from past mistakes.
That's why EFF has just filed comments with the FDA in an open docket on cyber-security guidelines for medical systems, letting the agency know about the obstacles that a species of copyright law -- yes, copyright law! -- has put in the way of medical safety.
The problem is Section 1201 of the Digital Millennium Copyright Act, which prohibits tampering with "effective means of access control" that restrict copyrighted works. The law was a creature of the entertainment industry, which saw an opportunity to create new business models that transferred value from their customers to their shareholders. CDs didn't have digital locks, so it was easy to convert the music you bought on CD to play on your digital home stereo, phone, and car. DVDs have digital locks, so all you can legally do with the movies you buy on DVD is watch them. If you want to get at that latent value in your discs -- the value of watching a movie on a phone, or backing it up in case you scratch your disc, for example -- you have to buy the movie again.
To keep these business models intact, large content holders sued and threatened security researchers who disclosed flaws in systems with digital locks, arguing that sharing research that required circumvention violated the DMCA. As a result, systems with digital locks became a no-go zone for security research, meaning that their flaws fester for longer before being brought to light and fixed.
And then it got weird.
Increasingly, every machine and device has a computer inside it, from cars to thermostats to fancy new lightbulbs. Manufacturers realized that merely by shellacking the minimum plausible digital lock around these devices, they could use the DMCA to enforce the same high-profit restrictions that had been the purview of the entertainment industry until then.
First it was phones that would only run software from the manufacturer's app store. Then it was cars that could only be diagnosed and repaired by authorized service centers that only used the manufacturer's official, high-priced replacement parts. Then it was everywhere: thermostats and lightbulbs, yes, and tractors and voting machines, too.
And, of course, medical devices.
Manufacturers who use digital locks to restrict the configurations of their devices get a lot of commercial benefit. They can force doctors' offices to pay recurring license fees for the diagnostic software that works with these gadgets. They can restrict access to service and even consumables -- why allow just anyone's insulin to be installed on your pump when the inkjet printer people have demonstrated a way to charge vintage Champagne prices for something that costs pennies a gallon?
But a profit motive that might conflict with users' best interests isn't the worst problem. The great danger is safety. Medical implants are increasingly equipped with wireless interfaces, because:
a) they're cheap; and
b) it's hard to attach a USB cable to a device that's been implanted in your chest cavity.
That means that bugs in medical implants can be exploited over their wireless interfaces, too. For example: lethal shocks from implanted pacemakers and defibrillators. It was not for nothing that former VP Dick Cheney had the wireless interface on his pacemaker deactivated (future software updates for Mr Cheney's heart-monitor will thus involve general anaesthesia, a scalpel, and a rib-spreader).
However you feel about copyright law, everyone should be able to agree that copyright shouldn't get in the way of testing the software in your hearing aid, pacemaker, insulin pump, or prosthetic limb to look for safety risks (or privacy risks, for that matter). Implantees need to know the truth about the reliability of the technology they trust their lives to.
That's why today, EFF asked the FDA to require manufacturers to promise never to use the DMCA to attack security research, as a condition of certifying their devices. This would go a long way to protecting patients from manufacturers who might otherwise use copyright law to suppress the truth about their devices' shortcomings. What's more, it's an approach that other groups have signed up for, as part of the normal process of standardization.
We think Congress should modify the DMCA to make it clear that it doesn't apply to devices that have no nexus with copyright infringement, but patients can't wait for this long-overdue reform. In the meantime, agencies like the FDA have a role to play in keeping patients safe from devices that work well, but fail badly.