My generation, the children of West Indian immigrants, didn’t learn to look kindly on imperfection, even when it was all we had.
My mother’s mother trained perfectionism into her children by scrunching up inattentively ironed shirts and requiring the poor unfortunates to iron them again. My mother once punished me for not starting a summer project, an assignment from a teacher I’d never see again at a school I’d already left: many things were possible in this world, but her child missing an assignment was not among them, even if the assignment was obsolete.
Perfection, defined as a state of flawlessness, was a good.
“In the 1980s, sociologists like Charles Perrow pored over institutional documents and concluded that however uniquely horrifying, glitches were not exceptional events. They were ‘normal’ accidents: things that were bound to go wrong when people started doing and applying science and engineering on a complicated scale. In Perrow’s account, accidents were normal because they happened with enough regularity that any institution of a sufficiently large scale could practically count on one. They happened through tiny errors in routine activities that became magnified with tragic consequences. To ignore the probability of accidents, Perrow suggested, was to live in a fantasy world.” —Joanna Radin, December 12, 2016
Accounting for errors as you plan projects isn’t about being sloppy. It’s about being wise.
Perhaps you underestimate how long a task will take you. Maybe you overestimate the contributions other team members can offer. Something comes up, and whatever the error, you have to adjust.
As Radin explains, accidents and “glitches” allow us, if we’re willing, to learn and reshape our context: “The glitch provides us with an opportunity to make a bridge between what doesn’t make sense and what could turn into knowledge… an opportunity to begin considering how things could be otherwise.”
Read more at The New Inquiry.