Matches in DBpedia 2014 for { <http://dbpedia.org/resource/System_accident> ?p ?o. }
Showing items 1 to 31 of 31, with 100 items per page.
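The listing below can be reproduced against the public DBpedia SPARQL endpoint (`https://dbpedia.org/sparql`). A minimal sketch of building the request URL with only the standard library — the `LIMIT 100` mirrors the "100 items per page" setting in the header, and the query is the one shown above:

```python
from urllib.parse import urlencode

# The triple pattern from the header above: all predicate/object pairs
# for the System_accident resource.
query = """
SELECT ?p ?o WHERE {
  <http://dbpedia.org/resource/System_accident> ?p ?o .
}
LIMIT 100
"""

# Ask the endpoint for SPARQL 1.1 JSON results.
params = urlencode({"query": query,
                    "format": "application/sparql-results+json"})
request_url = "https://dbpedia.org/sparql?" + params
```

Fetching `request_url` (e.g. with `urllib.request.urlopen`) returns the 31 bindings shown below as JSON.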
- System_accident abstract "A system accident is an "unanticipated interaction of multiple failures" in a complex system. This complexity can either be technological or organizational, and often has elements of both. A system accident can be very easy to see in hindsight, but very difficult to see in foresight. Ahead of time, there are simply too many possible action pathways. These accidents often resemble Rube Goldberg devices in the way that small errors of judgment, flaws in technology, and insignificant damages combine to form an emergent disaster. System accidents were described in 1984 by Charles Perrow, who termed them "normal accidents", as having two main characteristics: interactive complexity and tight coupling. James T. Reason extended this approach with human reliability and the Swiss cheese model, now widely accepted in aviation safety and healthcare. Once an enterprise passes a certain point in size, with many employees, specialization, backup systems, double-checking, detailed manuals, and formal communication, employees can all too easily resort to protocol, habit, and "being right." Rather like attempting to watch a complicated movie in a language one is unfamiliar with, the narrative thread of what is going on can be lost. Other phenomena, such as groupthink, can be occurring at the same time, for real-world accidents almost always have multiple causes, not just the single cause that could have prevented the accident at the very last minute. In particular, it is a mark of a dysfunctional organization to simply blame the last person who touched something. The processes of formalized organizations are often largely opaque. Perrow calls this "incomprehensibility." There is an aspect of an animal devouring its tail, in that more formality and effort to get it just right can actually make the situation worse.
For example, the more organizational rigmarole involved in adjusting to changing conditions, the more employees will delay reporting those conditions. And the more emphasis on formality, the less likely it is that employees and managers will engage in real communication. New rules can actually make things worse, both by adding an additional layer of complexity and by telling employees once again that they are not to think, but are instead simply to mechanically follow rules. Regarding the May 1996 crash of ValuJet (AirTran) in the Florida Everglades and the lack of interplay between theory and practice, William Langewiesche writes, "Such pretend realities extend even into the most self-consciously progressive large organizations, with their attempts to formalize informality, to deregulate the workplace, to share profits and responsibilities, to respect the integrity and initiative of the individual. The systems work in principle, and usually in practice as well, but the two may have little to do with each other. Paperwork floats free of the ground and obscures the murky workplaces where, in the confusion of real life, system accidents are born."".
- System_accident wikiPageExternalLink perrow.PDF.
- System_accident wikiPageID "6759067".
- System_accident wikiPageRevisionID "593119103".
- System_accident bot "H3llBot".
- System_accident date "October 2010".
- System_accident hasPhotoCollection System_accident.
- System_accident subject Category:Failure.
- System_accident subject Category:Safety_engineering.
- System_accident subject Category:Systems_engineering.
- System_accident type Abstraction100002137.
- System_accident type Accident107301336.
- System_accident type Accidents.
- System_accident type AviationAccidentsAndIncidents.
- System_accident type Event100029378.
- System_accident type Happening107283608.
- System_accident type Misfortune107304852.
- System_accident type Mishap107314427.
- System_accident type PsychologicalFeature100023100.
- System_accident type Trouble107289014.
- System_accident type YagoPermanentlyLocatedEntity.
- System_accident comment "A system accident is an "unanticipated interaction of multiple failures" in a complex system. This complexity can either be technological or organizational, and often has elements of both. A system accident can be very easy to see in hindsight, but very difficult to see in foresight.".
- System_accident label "Systeemongeval".
- System_accident label "System accident".
- System_accident sameAs Systeemongeval.
- System_accident sameAs m.0gmhpy.
- System_accident sameAs Q2328179.
- System_accident sameAs System_accident.
- System_accident wasDerivedFrom System_accident?oldid=593119103.
- System_accident isPrimaryTopicOf System_accident.
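The rows above arrive from the endpoint in the standard SPARQL 1.1 JSON results format, where each match is a binding object keyed by variable name. A minimal parsing sketch; the sample payload is hand-made here to mirror the shape of the first match (the real response carries all 31 bindings):

```python
import json

# Hand-made, abbreviated sample of a SPARQL JSON results payload,
# shaped like the response for the query above.
sample = json.loads("""
{
  "results": {
    "bindings": [
      {
        "p": {"type": "uri",
              "value": "http://dbpedia.org/ontology/abstract"},
        "o": {"type": "literal",
              "value": "A system accident is ..."}
      }
    ]
  }
}
""")

# Each binding maps variable names (?p, ?o) to typed terms; the RDF
# term itself is under the "value" key.
pairs = [(b["p"]["value"], b["o"]["value"])
         for b in sample["results"]["bindings"]]

print(pairs[0][0])  # → http://dbpedia.org/ontology/abstract
```

The `type` field distinguishes URIs from literals, which is how a consumer can tell rows like `wikiPageID "6759067"` (literal) apart from rows like `subject Category:Failure` (URI).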