
Wednesday, January 16, 2019

The Man-Made Disaster: Chernobyl

Gulin Langbroek 11.1

THE MAN-MADE DISASTER: CHERNOBYL

"It is one of history's ironies that the worst nuclear accident began as a test to improve safety," states Snell (1988). The Soviets wanted to find out how the Chernobyl power plant would cope with a sudden power loss, so the experiment tested how long a spinning turbine could continue to provide electric power to certain systems in the plant. Like many accidents, the Chernobyl accident resulted from a combination of human error and weaknesses in the design of the plant. The man-made disaster occurred at Unit 4 of the Chernobyl nuclear power plant in the former Ukrainian Republic of the Union of Soviet Socialist Republics, near the borders of Belarus and the Russian Federation. Following a short explanation of the health and social impacts of the accident, this essay will discuss the errors in judgment and biases that occurred while running the Chernobyl power plant.

As a result of the accident, tons of radioactive material were released into the air, posing a threat to living beings in the region. The radioactive doses caused long-term health effects ranging from thyroid cancer to leukemia. The Chernobyl site was also connected directly to the river systems of the Ukrainian Republic, destroying biological life in the rivers and causing the deaths of people who had consumed river water. Cleaning the area was just as dangerous for the people who had to do it, as they were exposed to higher doses of radiation. Agricultural regions near Chernobyl continued to produce foods such as milk and vegetables contaminated with radioactive material. Many people were forced to migrate from contaminated areas to clean areas, creating social problems such as loss of staff, lack of jobs and many other difficulties which made everyday life miserable. Overall, the Chernobyl accident caused great distress and casualties in the USSR and European countries.

There were some errors which should be mentioned before going into the details of the errors in judgment. One error which might have caused the accident was that it was a rushed experiment. The test was scheduled to be carried out just before a reactor shutdown which only occurred once a year, so the operators felt under pressure to finish it promptly so that they would not have to wait another year. This probably did not trigger the accident directly, but it was perhaps one of the factors which caused the necessary measures and precautions not to be taken. The test was thought to be an electrical test only, so it was observed by turbine manufacturers rather than reactor specialists. Thus, the effects on the reactor were not weighed fully. Finally, the Chernobyl plant was one of the most advanced and highly technological power plants ever constructed, so the operators running it felt as if they were an exclusive and elite crew and grew overconfident, not recognising possible disasters.

To be specific, some biases can be named and analyzed further. Perhaps the most crucial bias to look at in all man-made disasters is the neglect of probability, the tendency to disregard the probability of failure when making a decision. This also ties in with the overconfidence bias, since if the managers had doubted the reactor in the first place, more precautions would have been taken.
In this case, such a great disaster had never happened before in Russia, and since the power plant, as stated before, was considered highly reputable and exceptional, the managers of the plant had neglected any probability of the experiment going wrong. According to Kletz (2001), "The managers do not seem to have asked themselves what would occur if the experiment was unsuccessful. Before every experiment we should list all possible outcomes and their effects and decide how they will be handled."

The second biggest bias of the owners and constructors of the plant which contributed to the accident was functional fixedness. As stated in Wikipedia (List of Cognitive Biases, 2012), this bias limits a person to using an object only in the way it is traditionally used. The reactor was operated in a rule-based manner, meaning that the operators were told what tasks they should complete but not why it was so important to complete them. This caused them to run the plant with what Kletz (2001) describes as "process experience rather than theoretical knowledge." Before the Chernobyl accident, reactors were designed on the assumption that rules would be obeyed and instructions would be followed, so there was no need to set up extra protective facilities. This, of course, may have been the worst possible approach to building a nuclear plant, considering that the workers were not trained to the best of their abilities. Instead of relying on the traditional method of assuming operators would follow the rules, the reactor should have been built in a way that the rules could not be broken. That way, the workers would not have been limited by their poor knowledge of how to run a power plant, and technology would have done this job for them. In short, the traditional way of relying on human decisions should have been abandoned and reliance on automatic equipment should have been adopted.

Assuming operators would obey rules brings another issue to light: the projection bias. The projection bias is defined as unconsciously assuming that one's personal emotions, thoughts and values are shared by others. The lack of communication between the managers of the power plant and the operators about how seriously safety measures should be taken is among the biggest causes of the disaster. According to Kletz (2012), the managers of Chernobyl had talked about getting things done without any mention of safety, leaving the operators with the impression that safety was less important: "Managers should remember, when giving instructions, that what you don't say is as important as what you do say."

Last but not least, the biggest error in judgment the operators made was caused by the ostrich effect. This bias is the act of ignoring an obviously negative situation. The big question is: why would any operator ignore situations which could cause the deaths of many people, including their own? The answer lies in how the management system was established. Because the reactor relied on the decisions of higher authorities and not on protective safety equipment, every little detail of the power plant had to be referred to the managers. As Kletz (2012) states, "Everything had to be referred to the top so it was necessary to break the rules in order to get anything done."
Running a power plant should not have relied on this kind of system, since operators were more likely to take shortcuts, not inform the managers or simply ignore problems so that they could get things done quickly. Had these biases and errors in judgment not occurred, the accident might never have happened. In operating systems as intricate as a power plant, one must keep two crucial things in mind: always have protective equipment installed, and never let workers neglect safety rules. Unfortunately, as humans, only after this disaster have we begun to take these precautions, making us victims of the normalcy bias. In any case, we must always look out for human errors that might lead to irreversible damage.

RESOURCES

Marples, D. R., & Snell, V. G. (1988). The social impact of the Chernobyl disaster. London: The Macmillan Press.

Kletz, T. (2001). Learning from accidents (3rd ed.). Retrieved from ftp://193.218.136.74/pub/anon/ELSEVIER-Referex/1-Chemical%20Petrochemical%20and%20Process%20Collection/CD1/KLETZ,%20T.%20A.%20(2001).%20Learning%20from%20Accidents%20(3rd%20ed.)/Learning_from_Accidents_3E.pdf

European Commission, International Atomic Energy Agency & World Health Organization. (1996). One decade after Chernobyl: Summing up the consequences of the accident. Austria: IAEA.

List of Cognitive Biases. (2012). In Wikipedia. Retrieved November 16, 2012, from http://en.wikipedia.org/wiki/List_of_biases_in_judgment_and_decision_making
