May 24, 2018
In my family’s car you’ll find a booster seat, a toddler’s car seat, and a full-fledged, rear-facing infant car seat. That’s because it’s the law to strap in your little ones, and it’s also the law for any adult in California to wear a seat belt. That said, an AC Transit bus to San Francisco doesn’t have seat belts, and when my five-year-old heads to camp on the bus this summer, he also won’t have a seat belt.
This is an example of how, despite the safety precautions we've legislated, there are clearly still some holes: policy holes, if you will. And technology has lately been stumbling into these holes with alarming frequency. As a result, it's time to consider using what I call "technology seat belts": practices that may help us address technology's consequences.
But first: some context. To an extent, we all misunderstand what technology is. Technology isn't cryptocurrency, self-driving cars, artificial intelligence, or even iPhones, per se. It is, but it also isn't. By definition, technology is, as Dictionary.com puts it, the "creation and use of technical means" and their impact on "life, society, and the environment." Likewise, as Sesame Street has eloquently, and perhaps surprisingly, explained, technology, in its most basic form, is a tool that helps people accomplish something. (On the show, after showing viewers items ranging from e-readers to tablets, the actress grabs a backpack, which, she notes, is also technology: a tool to help you carry items.)
In short, technology is innovation, and innovation is generally good. But why do these usually useful tools go astray? And what should we do about it?
At least in part, it's due to a lack of civic planning: our neglect to consider new scenarios and innovations. But it's also because of a broader inability to be sufficiently proactive about the challenges technology can create, and its creators' reluctance to take full responsibility for what they build. Put another way, there's a tendency to put the tool on the table, step back, and see what happens, which allows us to wash our hands of any possible impact, apart from profits and losses.
For instance, after the Cambridge Analytica data breach, Mark Zuckerberg shared in an interview that he would never have expected, when creating "the facebook" in his dorm room, that this tool would be compromised, or that the breach would impact a presidential election. Now, after the recent hearings, some wonder whether he and other CEOs merely ask for forgiveness, rather than for permission.
The speed and scale of our innovations mean that we can no longer afford to take this type of wait-and-see approach. Regardless of whether our efforts are negligent or nefarious, we need policy before the next great fixers say that they hadn't anticipated [fill in your choice of disaster here].
Forward-thinking policy, counter-steps, and even protections bring to mind regulations, and regulations are generally thought to be where creativity goes to die. But seat belts don’t impact a vehicle’s speed; they aren’t brakes. Rather, they simply keep you from being hurled through the windshield. Today, more than ever, technological tinkering requires greater forethought about potential consequences—to prevent them in the first place, yes, but also to lessen the consequences of inevitable oversight.
But technology seat belts aren't only about mindfulness. They're also about crafting solutions for our solutions. We've seen the need for this before and have acted. Take dams that blocked salmon from migrating: we built fish ladders, which allow the fish literally to leap over the obstacle.
What else might technology seat belts look like? I’ll offer a few possibilities, with the hope that these ideas will eventually be dwarfed by other ideas they might spur.
Pre-patent or Licensing Requirements
Creators of new innovations should be required to indicate whom and what their projects might impact. This also ought to include a set of proactive steps they’ll take to address these impacts. Akin to a business plan, a new device or tool should be accompanied by a “keep it from hitting the fan” plan. Tethered to these should be safety requirements for mitigating potential impacts—think of it as a sort of bright yellow tape for the ideas themselves.
With some innovations, legislation and practices should be calibrated according to their effect on the Gini coefficient, a wonky term for a measure of income inequality. If the ability to access information or thrive economically is indelibly linked to a technological resource, then unequal access only intensifies disparity. (Think of how internet access has become such a critical tool.) As tools become more important for our everyday needs, it's key to address equitable access. Local government offices of innovation and other policy bodies should therefore scour existing policy for holes (similar to the example above of no seat belts on buses, even though it's the law to buckle up), with a focus on where the changes a solution brings might lead to greater inequity.
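For the curious, the Gini coefficient has a fairly simple definition: the mean absolute difference between every pair of incomes, divided by twice the mean income. A score of 0 means perfect equality; scores approaching 1 mean one person holds everything. Here is a minimal Python sketch (the function name and the sample figures are my own, purely for illustration):

```python
def gini(incomes):
    """Gini coefficient via the mean absolute difference.

    Returns 0.0 for perfect equality; values approach 1.0 as
    income becomes concentrated in fewer hands.
    """
    n = len(incomes)
    mean = sum(incomes) / n
    # Sum of absolute differences over all ordered pairs,
    # normalized by 2 * n^2 * mean.
    diff_sum = sum(abs(a - b) for a in incomes for b in incomes)
    return diff_sum / (2 * n * n * mean)

# Everyone earns the same: perfect equality.
print(gini([50, 50, 50, 50]))  # 0.0
# One person holds all the income: high inequality.
print(gini([0, 0, 0, 100]))    # 0.75
```

The point of the toy numbers is the direction of change: a policy whose adoption pushes this number upward is one that widens the gap it was never meant to widen.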
Pre-emptive Public Service Campaigns
Several compelling public service announcements ran after thousands of automobile accidents were caused by texting while driving. Ideally, such campaigns would get ahead of the issue. A body could take time before launching a new tool to think about, for example, what we shouldn't do at the same time, like writing emails while driving a car. We could then launch these ads before litigation requires them, as was the case for cigarette companies, or before the loss of life elicits behavior change.
In all these examples, you might notice that there’s still a general fuzziness around the notion of technology seat belts. Well, that’s due largely to a lack of true technology seat belts. There’s nothing (yet!) compelling this type of preventative action. Steps that look ahead are voluntary, and a real seat belt doesn’t ask you to think bracing thoughts as you drive.
Hence, it’s important, too, to prod the public to participate in what happens—and, more specifically, to create a little mistrust of the notion that, “It exists, so it’s good.” Remember that Steve Jobs was a low-tech parent, limiting the influence of technology on his kids. He and others had a healthy skepticism of the integration of technology—and so should we.
I’m not saying that there’s an easy, flawless solution. As humans, our tinkering will likely always generate errors, hiccups, and malfunctions. As the writer Evgeny Morozov suggests, imperfection is baked into our humanity. Even so, we must strive—though we’ll likely land far short of perfection.
A few years ago, when my husband and I were discussing our plans for household budgeting, we considered using an Excel spreadsheet, and even looked at a few online tools to track our budget as our inputs and outputs shifted. (Spoiler alert: More kids equal a lot more outputs!) One option was an online budgeting tool through which you log all the details of your bank accounts and bills, including your account numbers. My internal distrust alarm blared: There's no way that could be a good idea. Fast-forward to a few months ago, when there was a data breach at just such a company!
That alarm needs to sound for all of us as we consider what’s ahead—an alarm not necessarily for the innovation, but for how it’s incorporated into our day-to-day lives.