"No force on earth can get everything to stay in balance all the time.
To insist on perfection is to shut the whole thing off."
James R. Chiles, from "Inviting Disaster"
Failures happen, and always will, so mistakes can be crucial lessons.
That's why the History of Petroleum Geology Forum at the AAPG Annual Meeting
in Dallas will explore "Lessons Learned from Failure."
The forum packs a one-two wallop, starting with a panel discussion of "Bypassed
Pays: Opportunities for Large Reserve Additions."
The panel includes Sidney Powers medalist and AAPG honorary member Robert
M. Sneider, with Larry D. Meckel of L.D. Meckel and Co. in Denver,
and David G. Smith from Burlington Resources in Calgary.
Next, James R. Chiles will describe "Human Factors and System Fractures: Lessons
from Oil & Gas Industrial Disasters."
Chiles is author of the book "Inviting Disaster: Lessons from the Edge
of Technology," published in 2001.
Robert Ginsburg and Marlan Downey will preside at the forum. Its failure
topic followed naturally from a previous session, said Ginsburg,
professor of marine geology at the University of Miami.
"In a previous one, we had done lessons from successes," he said. "We were casting
around for an idea and we thought, 'What about the mistakes?'"
Ginsburg sees company culture as an important part of the failure story.
"People might say, 'That kind of sand is always tight,' or 'Never look in
chalk,'" he noted.
Chiles can bring an abundance of detail and a unique perspective
to that aspect of lessons-from-mistakes.
"In 'Inviting Disaster,' he included a whole series of examples that were attributed
to judgment or culture, which I thought was the most interesting
part of the book," Ginsburg said.
Chiles investigated dozens and dozens of disasters, large and small, in
researching the book. All of them offer lessons, but with true disasters "the learning is deepest,"
he said. "That's one requirement -- visibility, memorability."
Chiles maintains a Web site of updates on disasters and subsequent investigations,
encouraging additions and comments at www.invitingdisaster.com.
The book opens on the morning of Feb. 14, 1982, on the offshore drilling
rig Ocean Ranger.
There's no surprise ending to the story. The $100 million Ocean Ranger sank
in a heavy storm in the North Atlantic on that Valentine's Day,
killing its crew.
"I've heard it described as equivalent in Canada to our loss of the (space shuttle)
Challenger," Chiles noted.
In the book, he also examines the fate of the Piper Alpha rig in the
North Sea, the world's worst offshore disaster.
"My reason for opening the book with (the Ocean Ranger) is that Americans
didn't get the full message," he explained -- mainly because the
United States never had an offshore rig collapse of similar scale.
"Fortunately for the workers in the offshore drilling industry, a lot of the
Canadian and UK directives did filter through the whole industry
worldwide," he added.
Chiles explores both the human and mechanical problems that resulted in
the failure of the Ocean Ranger.
One concern that came up again and again was the problem of split authority
on the rig and onshore.
"At a minimum,
you had three bosses," he said.
"At least twice, the Ocean Ranger had imbalance problems," the latest
only eight days before the rig's collapse, he observed.
Such disasters would be even worse if their lessons go unheeded, a point Chiles
makes in his book.
"You have to be willing to learn, and you have to have a willingness to change.
I think the deepsea drilling industry has been willing to change," he said.
What Went Wrong?
Sneider will draw on specific examples to illustrate possibilities from
bypassed pays. A well-known example might be the Elmworth Field in Canada's Alberta Basin,
where 61 dry or uneconomic wells were drilled into the Lower Cretaceous section.
As it turned
out, those attempts missed 10 principal reservoir units that now
have more than 2,000 producing wells.
Not a lack of data, according to Sneider. "When you
look at the data, all the data's there to make the right decision,"
he said.
Instead, the problem stems from the nature of department jurisdictions, he believes. "People
usually don't put all the pieces together," Sneider said.
"It happens in a large enough organization -- and it doesn't have to be very
large -- that they don't tie together all the pieces, which include
both engineering and petrophysics," he added.
"Another mistake comes from failure to examine the entire exploration picture,
dealing with rocks and fluids," he said. "We do a good job with
the rocks, but we don't take a very good look at the fluids."
Part of the solution comes from involving the right technical expertise.
More managers? Not a help,
according to Sneider.
"The more managers on an evaluation team, the more data doesn't get integrated
into the wildcat wells," he said.
In discussing overlooked pays, Sneider will talk less about the pays, and more
about the "overlooking."
"It's a complicated problem," he said. "It's an organizational problem."
When people think of disasters caused by internal organization problems, they
think of NASA.
Chiles has looked into several of the space agency's failures.
"NASA was safety conscious leading up to the Columbia disaster," he said,
"but it had put itself in a narrow view of what to worry about."
Because NASA engineers saw themselves as dealing with constrained resources,
they concentrated on the "big things" that might go wrong, according
to Chiles. "The result of that was that things that didn't fall into the red zone were
managed by downgrading the potential risk," he said.
In studying disasters, Chiles could usually find precursors that indicated the
trouble to come. The significance of those precursors, however, is open to debate, he admitted.
"Some say, 'It was just a precursor afterward.' The criticism is, it was
just a precursor in retrospect. If it was one of a thousand things
that happened, it might just be noise," Chiles said.
Chiles favors building up benchmarks to compare incidents. With
a concrete set of readings, tolerances and specific requirements,
a safety auditor can point out any deviations, he said.
Chiles can identify a number of barriers to safety and reliability, including
the attitude that "testing is such a bother."
Testing and training should reflect the real stresses of on-the-job performance,
he said. Often that doesn't happen, because it's harder to justify a failure that
occurs during testing, or to defend an accident that occurs during training.
"I tried to make the point in the book that life-and-death training can have
a terrible price to it, but it's necessary to do it," he said.
Many insurers don't know how to give credit for training programs and
prefer to reward companies for installing protective hardware, Chiles said.
But training is so important that insurers need to find a way to reward a company
for properly training its personnel, he said.
For Chiles, failure always involves, and often begins with, the human factor.
"Part of it is empowered and alert workers -- 'alert' in the sense that
they know when something needs to be done," Chiles said.
"An alert crew is a lot safer than one that is blissfully ignorant," he added.
"They know when they are pushing the envelope."
Chiles would like to see workers trained to stop disasters in their tracks, to become
what he calls "crack-stoppers."
"That's really the notion of 'system fracture.' I've been told that this
concept is a helpful addition to what's already a rich literature," he said.
Complex systems fail "in a step-by-step way analogous to how metal cracks under
stress," according to Chiles.
"If there's a crack-stopping barrier, it will stop short of that culminating
event," he said.
In any disaster, Chiles observed, "the question remains until the end:
Will somebody stop it in time?"
In some cases, preventive measures are ignored even when they're readily
available -- people don't take time to grab a hard hat, or they
don't want to look stupid wearing a pair of safety goggles when
they mow the lawn.
Chiles recalled touring an offshore platform, where he was instructed in
safety measures as soon as he arrived.
During the tour, he failed to hold a handrail on a steep flight of steps.
He was told to hold the railings, or leave the rig.
Chiles said he thought about the warning.
He decided the only thing that would look more foolish than clutching a handrail
would be falling downstairs and breaking his neck.
"That would be a stupid mistake to make," Chiles said.
"Since then, on a flight of stairs, I always hold the handrail."