There is an adage, among several million adages that form the soft cliché base of human experience and editorial writing, that states that we learn best from our mistakes unless our mistakes result in death.
The July 4 fireworks debacle in San Diego would be a nice "for example" example: a tremendous build-up of hype for a show that would be "bigger and more intense than in years past," followed by something that was not.
While we can all watch the video online and chuckle at the giant plume of white light that lasted just 15 seconds, laugh a little harder at "premature ignition" (15 seconds is better than zero seconds, right?), take a moment to feel sad for the children who were expecting something spectacular, then shrug and hope that it doesn’t happen next year, we’d miss a key point of the story.
This story, while funny, has lurking beneath it a dark moral. Our "for example" example is for the wrong cliché. We should rather think of this in terms of a saying my father taught me: There was never a horse that couldn’t be rode; there was never a cowboy that couldn’t be thrown.
Or, even better: sh*t happens.
The homepage for the "Port of San Diego Big Bay Boom" currently has a post claiming that all preliminary tests passed without a single problem. August Santore, a spokesman for Garden State Fireworks, claimed on CNN that "there was nothing in the pyrotechnics that went wrong — it was the electronics."
There was no unheeded warning, no devious employee sabotage, no vast anti-American conspiracy to ruin the show. As far as we know (for now), it just didn’t work correctly.
Of course, we’re just talking about fireworks. Congress actually voted for independence on the second, and historically speaking, Independence Day celebrations tended to center on rioting rather than watching fireworks — so no big deal. However, this is indicative of an increasing belief in the infallibility of modern technology that is verging on hubris.
You may recall that not too long ago the University of Nebraska experienced a massive breach of its digital student records — grades, bank account numbers, Social Security numbers. Similar breaches hit Ohio State and the University of Hawaii in 2010, and UCLA in 2006. Systems thought to be safe and well-monitored were breached via complex and dedicated attacks.
This isn’t to say that all technology is bad, or that things would’ve been better in San Diego if guys were running around with lighters and better in Lincoln if they kept all their records in file cabinets. My birth is due to advances in medical science, and I make use of innumerable gadgets and devices that are simply marvels. However, the rhetoric of technophiles and futurists — in ads, trade shows, and online debates — betrays a conviction that these clouds and tablets and whatever else, by their abstraction and interconnectivity, can overcome any limitations.
We’re building a "smarter planet." Yet our devices are made of finite materials, bound to the Earth despite our wishes, and trapped in a geologic history of inevitable decay. The first hominid to tame fire believed it had saved humanity from the cold — until it rained. And, indeed, in our headlong rush to go digital, we proceed with the same certainty of invulnerability as did Franz Reichelt when he jumped from the Eiffel Tower wearing a canopy of silk sheets 100 years ago.
We can remove human error; we cannot remove error. Mineral deposits deplete, power grids fail, files simply vanish, hackers get bored, flash drives go missing, and sometimes fireworks all go off at once.
Jesse Marks