Human Errors and the Abstract System

In Rewriting, Joseph Harris offers three ways to counter the text we are reading.  I personally find the method of “arguing the other side” the easiest way to counter a text.  Arguing the other side offers the chance to show the other side of what the author is saying.  Whether you agree or disagree with the author, you can still argue the other side of the text.  Harris defines arguing the other side as “showing the usefulness of a term or idea that a writer has criticized or noting problems with one that she or he has argued for” (57).

Giddens uses a lot of examples from other authors to form the base of his arguments.  Some of them he goes into in detail, while others he touches on only briefly without much explanation.  As we all know, Giddens wrote this book many years ago, and a lot has changed since he wrote it.  This makes it easier for us to argue the other side of his arguments, because things have changed and we could have a completely different view of what he is trying to prove or argue.  Giddens says, “neither design faults nor operator failure are the most important elements producing the erratic character of modernity” (152).  Giddens is saying that everything comes with design faults, and, as we all know, it is humans who operate the equipment, and humans are not perfect.  Will the abstract system ever completely fail?  We know parts of it can fail through human error or equipment design faults, but will it ever fail completely?

Human error also existed before the abstract system, in pre-modern times, yet Giddens does not go into any detail about that in his argument.  As I have said in this blog before, we all know humans are not perfect, and there will always, without a doubt, be some type of error when humans are involved in the operation of something.  People may not have noticed it as much back then, because the modern equipment used in operations today makes errors more visible than the equipment used in the past did.  This isn’t anything new to society; it is just noticed more today than it was before, or even than when Giddens wrote his book.  With the abstract system in place there are more highly qualified people working on operations, so when a mistake or error is made we notice it a lot sooner and it seems a lot bigger, but in reality there are most likely fewer errors occurring than before the system was in place.  This leads into the key term of trust.  We put a lot of trust in the people working within the system: trust that everything is done right, that no faults or errors happen, and that the system keeps running smoothly.  When we order things online, we trust companies that the right item we ordered will arrive at our home, that it won’t be something we didn’t order, and that it won’t simply never show up after they have taken our money.  I believe we all know the chance of an error is there, and it is a high chance, but we have built enough trust that we feel comfortable allowing the system to continue to operate.

JF

8 thoughts on “Human Errors and the Abstract System”

  1. Great post.  Giddens does fail to bring cases where abstract systems failed to the forefront; while he mentions nuclear disasters, he fails to dwell on them as a weakness of humanity and of operator error. You dig in where Giddens brushes over, revealing that modernity does not minimize risk. On the contrary, it makes risk larger than it ever was, with trust at levels where only apathy can reign. Great tragedies have started in these abstract systems, from plane crashes, negligible by comparison, to the First World War, which shaped our century. There the complex systems of statecraft failed in spectacular ways.

  2. Your entire blog post was extremely insightful. I especially liked your comment saying, “As we all know, Giddens wrote this book many years ago, and a lot has changed since he wrote it. This makes it easier for us to argue the other side of his arguments, because things have changed and we could have a completely different view of what he is trying to prove or argue.” I have noticed that many of Giddens’ arguments seem outdated, which makes it easier for us to disagree with him.

    Giddens discusses what he calls “operator failure” (Giddens 152), which is similar to human error. He seems to believe that this is always a negative thing, yet I think of it differently. Some of the best inventions have come about through accidents or “failures,” so I do not think there should be a negative connotation attached to it.

    The previously mentioned example relates back to your point about disagreeing with Giddens, due to the time period. Thank you for bringing up such a great point!

  3. You brought up a really valid point in your blog post that I personally had never even thought of: “Giddens wrote this book many years ago, and a lot has changed since he wrote it.” I found this to be extremely insightful, and it may be one of the reasons it is hard for some of us to relate to Giddens and his ideas. To add on to your idea of abstract systems: the way I think of them is that their emergence brings an increase in the division of labor. People become more specialized in a specific job, and therefore less error occurs. Like you stated, “…but in reality there are most likely fewer errors occurring than before the system was in place.”

  4. These are all great points. Some statements you made in your post were really insightful, such as “Will the abstract system ever completely fail?” and “I believe we all know the chance of an error is there, and it is a high chance, but we have built enough trust that we feel comfortable allowing the system to continue to operate.” Many human errors and operator failures still exist in our time, from little things like dropped calls to identity theft and security breaches. Giddens also says on page 153, “But even if it were conceivable–as in practice it is not–that the world could become a single design system, unintended consequences would persist.” Even as we become more technologically diverse and advanced in our systems, there are still unintended consequences; as we get more advanced, the markets for security, asset protection, and safety precautions have increased exponentially. I think this is our way of combating these potential errors in the system. Are there any examples of safety precautions you can include that show that trust in the system might be as high in 2014 as it was in the early 2000s?

  5. Your post had many parts I fully agreed with you on. When you said, “This isn’t anything new to society; it is just noticed more today than it was before, or even than when Giddens wrote his book,” I thought to myself that maybe it is because in today’s society we emphasize the importance of performing a task perfectly. If a pilot doesn’t perform his duties perfectly, many lives are at stake. Perfection is what is desired in today’s society, and one little slip-up can ruin your reputation. Technology today has become so advanced that it rarely malfunctions. As Giddens states, “Any abstract system, no matter how well designed it is, can fail to work as it is supposed to because those who operate it make mistakes” (Giddens 152). Operator failure has become the main cause of many tragedies. Can you name a few that were caused by operator failure?

  6. James,

    As you point out, operator error is not something unique to modernity: “This isn’t anything new to society; it is just noticed more today than it was before, or even than when Giddens wrote his book.” At the same time, Giddens argues that the stakes of certain mistakes are much higher than they were in the pre-modern world. What do you think of this? Were there some mistakes that would (or did) have catastrophic consequences on a global level?

    ~DM

  7. Your comment that “Giddens wrote this book many years ago, and a lot has changed since he wrote it” was insightful; I too have thought of it, but I hadn’t applied it the way you have. So much has changed since he wrote his book, but the one thing that has stayed true is human error. I liked your thoughts on human error: it will be around as long as humans inhabit the earth, so all we can do is limit it. Do you think human error, or any error for that matter, will ever be eliminated, or does error need to be accepted as an irremovable fact?

  8. I certainly agree with you that arguing the other side is an easier way of countering Giddens. His varying approach, which you noted well (“some of them he goes into in detail, while others he touches on only briefly without much explanation”), makes Giddens an easy target for countering. Even when Giddens goes into ‘great detail,’ his subject matter is so broad that his points are controversial at best.

    Concerning your view on abstract systems completely failing, I’m not sure I understand your position. Giddens writes, “but the element of operator failure cannot effectively be incorporated into such calculations” (152). I feel that human failure in abstract systems has always been a consideration, to the extent that the original design scope required it. Simply stated, the increasingly harmful consequences that arose out of failing abstract systems have always been met with improvements to those systems, within the confines of what technology could offer. If we were to stretch our view, would you agree that certain ‘unintended consequences’ have led to abstract system improvements and even to the creation of new abstract systems unimaginable at the time?
    -Chris
