Paul Dourish reports in The Atlantic:
The world is more interconnected than at any point in history, and yet it appears that the only reliable way to get a file quickly from one computer to another is to use a flash drive—to rely on what computer enthusiasts used to jokingly refer to as “sneakernet.” Pretty much any laptop you buy has three or four different high-speed networking technologies built into it, and yet the flash drive beats them all as a way to share files. It's the lizard brain of your computer.
Once upon a time, my house was littered with cat 5 cables—those oversized-phone-cord-like things that connect computers to Ethernet networks. Challenged by family and visitors, I once argued that you could never have too many cat 5 cables. I came to regret that, of course, as wireless networks made the cables obsolete.

At different moments, different unremarkable technical objects seem to evoke that same feeling: that one can't have too many. These days, the things that seem to turn up all over the place—lurking in pockets of different bags, filling drawers and junk boxes, dropped down the back of desks—are USB flash drives.

They're everywhere. There is almost certainly one within ten feet of you right now. I seem to acquire them unceasingly—they're handed out as promotional tchotchkes, used to provide meeting minutes and conference proceedings, and presented in all sorts of shapes, sizes, and configurations. They have become inescapable elements of the contemporary technological landscape.
There are two deep ironies in this profusion. The first is that the world is more interconnected than at any point in history, and yet it appears that the only reliable way to get a file quickly from one computer to another is to use a flash drive—to rely on what computer enthusiasts used to jokingly refer to as "sneakernet." Pretty much any laptop you buy has three or four different high-speed networking technologies built into it, and yet the flash drive beats them all as a way to share files. If you don't believe me, just ask two people at your next meeting to connect their computers together and copy a file from one to the other. Really, it's hilarious. Transferring files over thousands of miles is easy; moving them two feet is almost impossibly difficult. (Even the "AirDrop" feature in newer versions of Apple's operating systems—perhaps the best job anyone's done so far of solving this problem—is strangely finicky and requires you to have just the right devices.)

The second irony, given how overwhelming the speedy pace of technological advancement can feel, is how primitive the technology on which USB flash drives rely actually is. The challenge posed by the flash drive is to find a way to work seamlessly and easily with every computer. Its solution is a technology known as the "FAT filesystem," named for its primary data structure, the File Allocation Table, which was developed as a means to manage early floppy disk storage. Pretty much the simplest imaginable mechanism for representing data on a disk, it was speedily developed and deployed in Microsoft's almost-ubiquitous BASIC programming system in 1977.

Although it has long since been displaced by more advanced technologies, those other technologies have frequently incorporated a version of FAT into their DNA.
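The File Allocation Table really is about as simple as on-disk bookkeeping gets: the disk is divided into fixed-size clusters, and the table is a linked list in disguise—entry *i* holds the number of the cluster that follows cluster *i* in a file. A minimal sketch of the idea (the table values, markers, and names here are illustrative, not the real FAT on-disk constants):

```python
EOC = -1  # stand-in for FAT's actual end-of-chain marker value

# A toy allocation table for a twelve-cluster "disk". A file whose
# directory entry points at cluster 2 occupies clusters 2 -> 5 -> 6 -> 9.
fat = [0, 0, 5, 0, 0, 6, 9, 0, 0, EOC, 0, 0]

def cluster_chain(fat, start):
    """Walk the table from a file's first cluster to its end-of-chain
    marker, returning the clusters that hold the file's data in order."""
    chain = []
    cluster = start
    while cluster != EOC:
        chain.append(cluster)
        cluster = fat[cluster]
    return chain

print(cluster_chain(fat, 2))  # [2, 5, 6, 9]
```

Every FAT variant since the floppy-disk era—FAT12, FAT16, FAT32—elaborates on exactly this scheme, mostly just widening the table entries to address bigger disks; that simplicity is precisely why every operating system can still read it.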
Some version of that same FAT filesystem has lived on, locked away inside the more advanced systems that allow for the use of today's much larger, speedier storage technologies. When people rely upon the FAT filesystem, they're plugging into an evolutionary throwback, like some kind of vestigial tail. It's the lizard brain of your computer. The flash drive exposes the great lie of technological progress, which is the idea that things are ever really left behind. It's not just that an obsolete technology from the year of Saturday Night Fever still lurks unseen in the dank corners of a shiny new MacBook; it's that it's something that is relied upon regularly. The technology historian Thomas Hughes calls these types of devices "reverse salients"—those things that interrupt and disturb the forward movement of technology. They reveal the ugly truth that lies behind each slick new presentation from Google, Apple, or Microsoft: Technical systems are cobbled together from left-over pieces, digital Frankenstein's monsters in which spare parts and leftovers are awkwardly sutured together and pressed into service. It turns out that the emblems of the technological future are much more awkwardly bound to the past than it's comfortable to admit.

But there is perhaps an even more pervasive and corrosive idea that the flash drive helps unsettle, which is that of the breakneck pace of technological change. Both popular debate and technical discussion are regularly premised on the notion that digital technology develops at a dizzying pace while we poor humans plod along in the evolutionary slow lane. This is the idea that echoes through every startup pitch about "disruption," and every exhortation to accept the "inevitability" of technology-induced change. In these stories, to be digital is to be moving ahead, ever-changing, always adapting and morphing into the Next Great Thing. To be human, by contrast, is to be loath to change, slow to adapt, and poised for irrelevance.
But the dizzying pace of technological change takes on a rather different character in light of a component like the FAT filesystem from 1977 that somehow can't be left behind.

And never mind the disco tech—computer systems are stuffed full of ideas, technologies, and practices that seem, on closer examination, to be well past their sell-by dates, from the Mac OS Terminal windows whose widths are designed to precisely accommodate the data on IBM punched cards, to the hierarchical logic of files and folders that carries us back to the invention of vertical filing in the opening years of the 20th century. As a culture, Americans, at least, seem so committed to the idea that technology is fast, gleaming, and new that we become deeply uncomfortable when presented with the alternative. When a student of mine professed an interest in studying how technologies age and obsolesce, a colleague took her aside to ask why someone with her whole career ahead of her would want to study something like that. His concern was so pressing, so urgent, and so visceral as to imply not just an anxiety about job prospects but a fear for her soul. Ironically enough, her study was actually of interstellar spacecraft. A story can be told about these as monuments of engineering achievement, and sites of cutting-edge science and technology. People like that story. But when the story is told of spacecraft launched in the 1980s based on 1970s designs that incorporate 1960s parts based on 1950s technologies, they start to squirm. It is somehow an unpleasant thought to dwell upon.

Perhaps it is because digital systems so perfectly seem to evoke the spirit of the new. Perhaps it is because digital technology is so firmly tied to the tired yet pervasive rhetoric of disruption and revolution. Or perhaps it is simply that people confuse the volume of technological change for its pace.
Whatever the cause, it may be time to supplement the talk of nanocycles and microseconds with talk of years and decades, because the ancient history of technology is alive in the latest tools and the newest apps. The developers of the first stored-program computers might be amazed at the capacities of today’s computers, but they would be utterly at home with the principles of their operation. Whatever new features appear in the next version of Windows, OS X, Android, or iOS, you can be sure that they will be living alongside the lizard brain of FAT and any number of anachronistic artifacts of digital years gone by.