Velocity NY 2013: Richard Cook, "Resilience In Complex Adaptive Systems"

By: O'Reilly


Uploaded on 10/15/2013

Resilience In Complex Adaptive Systems: Operating At The Edge Of Failure

Systems seem to run at the very edge of failure much of the time. The combination of high workload, limited resources, pressure for additional features and capability, and inherent software, hardware, and network fragility is a noxious kettle of stuff always about to boil over in the form of outages, degraded response, or functional breakdowns. For insiders the surprising thing about our systems is not that they fail so often but that they fail so rarely! This good performance in the face of adverse conditions is called resilience. An important conclusion from resilience studies is that it depends critically on human operators and their ability to anticipate and monitor the system, react to threats, and sacrifice some goals to protect others. This talk will introduce resilience and a model of system dynamics useful in analyzing failed and successful event management and offer an explanation for why our systems run at the edge of failure.

Comments (4):

By sunils34    2017-09-20

Richard Cook's writing and talks on systems have hugely shaped my thinking on systems.

For those interested in more, he gave a highly entertaining talk on system resiliency here and introduced a helpful model.

I also wrote a post summarizing how I think about Rasmussen’s model in startups and web applications.


By pdkl95    2017-09-20

> the staff had become “de-sensitized” to the risk of a serious accident.

Yet again, serious problems happen due to the Normalization Of Deviance. We really need to find a way to create working conditions that encourage correcting problematic situations immediately, before the behavior becomes normalized.

Regarding the management/other problems... I encourage everyone in every industry that is ever involved with safety (i.e. most industries) to see Richard Cook's short talk about "Resilience in Complex Adaptive Systems"[1].



By pdkl95    2017-09-20

I should have used more specific language; I'm not trying to argue that short selling is "bad". The benefits - such as dampening a crashing market - are large enough to accommodate some risk. I'm primarily addressing the claim that "There's absolutely nothing wrong with short selling". Ignoring small, acceptable risks is normalizing deviance[1].



By pdkl95    2017-12-05

> both individual and group ethics have to fail

However, they don't have to fail at the same time or in a single step. This type of problem can creep in slowly over years through the "normalization of deviance"[1].

Richard Cook presented[2] a very useful model for how this type of problem creeps into complex systems. The pressure from economic and workload concerns never goes away, so unless there is a proactive, explicit counter force, a way to push back against that pressure, the system will inevitably be pushed toward failure. Therefore it's important to stop problems early, while they are small: the magnitude of the counter force required grows rapidly as behavior becomes increasingly deviant.
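The drift dynamic described above (a constant pressure toward the failure boundary, opposed only by a deliberate counter force) can be sketched as a toy simulation. Everything here, including the constants, the `simulate` helper, and the boundary positions, is an illustrative assumption and not taken from the talk:

```python
# Toy sketch of the "drift toward failure" dynamic (illustrative only).
# The operating point x lives in an envelope between the economic-failure
# boundary (x = 0) and the accident boundary (x = 1). Economic/workload
# pressure nudges x toward the accident boundary every step; a safety
# counter force pushes back.

def simulate(steps, pressure, counter_force):
    x = 0.5  # start in the middle of the operating envelope
    for _ in range(steps):
        x += pressure - counter_force(x)
        x = max(0.0, min(1.0, x))  # stay inside the envelope
    return x

# With no push-back, the system drifts all the way to the accident boundary.
drifted = simulate(100, pressure=0.02, counter_force=lambda x: 0.0)

# A counter force that grows as the system nears the boundary (the point
# about its required magnitude increasing rapidly) holds the drift in check.
held = simulate(100, pressure=0.02, counter_force=lambda x: 0.05 * x**2)

print(drifted, held)
```

With the quadratic counter force the operating point settles where the two forces balance, well short of the boundary; without it, the constant pressure wins.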



