Although I cannot claim to be in the mainstream of contemporary culture, even I have heard of the Oscar error. I should state immediately that I have no specialist knowledge of Oscar ceremonies, because I have never watched one, though I have seen many brief highlights of acceptance speeches (a maudlin art form in their own right), and I have no opinion about the actual movies, because I have not seen them. So, this is blank slate stuff.
However, procedural errors are a very interesting matter, in that they involve intelligence at two stages. First, a person carrying out a procedure has to think through what they are doing. Second, that person or, preferably, others have to look through the procedure and think about the errors which may arise, and how to avoid them.
The airline industry has done that very well. Plane crashes cause many deaths in one go, which certainly draws public attention, and the industry has worked hard to avoid errors because frightened customers are less likely to fly. Most improvements have been technological, in the sense of making instruments and controls easy and intuitive to use, but there is also careful screening of pilots (with some glaring exceptions), there are standard checklists to guard against the forgetting of important steps, and there is training in recovering from errors and in handling infrequently encountered hazards.
Try putting “Germanwings” in my search bar for some comments on the monitoring of pilots. For an attempt to make sense of the then-recent loss of a plane, see: http://www.unz.com/jthompson/regular-guys-pilots-on-flight-mh370
Surgical teams have been slower to learn, but have now cleaned up their act by following airline industry procedures. All major industries attempt to reduce errors: Motorola led the way in manufacturing, and Toyota and many others have systems of their own.
Just to show how everything links to intelligence and intelligence researchers, James Reason’s book Human Error (1990) begins with Spearman (1928), who complained: “crammed as psychological writings are, and must needs be, with allusions to errors in an incidental manner, they hardly ever arrive at considering these profoundly, or even systematically”.
James Reason has a good classificatory system, and a set of explanations which link up with well-known psychological findings, namely that, despite the capacity to take in many sensory messages, people have limited channel capacity. The multitude of inputs has to pass through the bottleneck of working memory, and some inputs never make it and are lost. Operators also have to remember what they intend to do and the steps they have to take, each step in the sequence being triggered by the correctly remembered completion of the previous step in the grand plan. In complex cases the grand plan has to be dropped in favour of a new plan, as when engines fail and a plane must find a place to land in the next few minutes.
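The bottleneck idea can be shown with a toy model: a fixed-capacity buffer through which every input must pass, with overflow simply lost. This is a minimal sketch only; the capacity of four items and the list of backstage tasks are my own illustrative assumptions, not figures from Reason or from the ceremony:

```python
from collections import deque

# Toy model of the working-memory bottleneck: a fixed-capacity
# buffer through which all inputs must pass. The capacity of 4 is
# an illustrative assumption, not a claim about the true span.
WM_CAPACITY = 4

def attend(inputs):
    """Push inputs through a bounded buffer; displaced items are lost."""
    working_memory = deque(maxlen=WM_CAPACITY)
    lost = []
    for item in inputs:
        if len(working_memory) == working_memory.maxlen:
            lost.append(working_memory[0])  # oldest item about to be displaced
        working_memory.append(item)
    return list(working_memory), lost

retained, lost = attend([
    "hand over envelope", "check category", "walk to podium",
    "tweet backstage photo", "discard unused envelope", "cue presenter",
])
print("retained:", retained)
print("lost:", lost)
# With six inputs and a capacity of four, the two earliest steps are
# displaced and lost: a crude picture of how extra inputs can push an
# intended step out of mind.
```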
The Oscar error was an unintentional error, a slip or lapse, probably caused by distraction, perhaps because the operator was tweeting pictures from backstage, which was not part of his key duties. The procedural error was apparently the lack of any systematic way of cancelling (discarding) an unused envelope. It is a traditional “place-losing” sequence error, with the added piquancy that the “double envelopes” routine is itself intended as a safety measure. Safety sometimes creates danger. Fixes are easy to suggest: giving the envelopes numbers in sequence, printing the name of the prize in very big letters on both sides of the envelope so that announcers know what they are announcing, keeping the latest discarded envelope visible at the top of a transparent trash can so that the person handing them out can see it, and so on. There are several thousand operators who could suggest improvements.
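Some of these fixes amount to an explicit hand-out protocol: envelopes numbered in sequence, with the unused duplicate cancelled before the next category is released. Here is a minimal sketch of such a scheme; the class and its method names are my own invention, not any actual Academy procedure:

```python
class EnvelopeError(Exception):
    pass

class EnvelopeTracker:
    """Hand out award envelopes strictly in sequence; the duplicate
    (the second copy of a two-wing pair) must be explicitly discarded
    before the next category can be released."""

    def __init__(self, categories):
        # Each category gets a sequence number printed on the envelope.
        self.queue = list(enumerate(categories, start=1))
        self.pending_discard = None

    def hand_out(self):
        if self.pending_discard is not None:
            raise EnvelopeError(
                f"discard duplicate of envelope #{self.pending_discard[0]} first")
        if not self.queue:
            raise EnvelopeError("no envelopes left")
        number, category = self.queue.pop(0)
        self.pending_discard = (number, category)
        print(f"handing out #{number}: {category}")
        return number, category

    def discard_duplicate(self, number):
        if self.pending_discard is None or self.pending_discard[0] != number:
            raise EnvelopeError(f"envelope #{number} is not pending discard")
        print(f"duplicate of #{number} goes on top of the transparent trash can")
        self.pending_discard = None

tracker = EnvelopeTracker(["Best Actress", "Best Picture"])
n, _ = tracker.hand_out()
# tracker.hand_out()  # would raise: the duplicate was never cancelled
tracker.discard_duplicate(n)
tracker.hand_out()
```

The point of the guard is that the system, rather than the operator’s memory, enforces the cancellation step.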
The fascination for researchers is that each step in an intentional sequence has several consequences, not all of which are easy to predict. As a general rule, failure to predict is an indicator of low ability, but that must be judged against the complexity of the operations being undertaken. One systematic problem is the inability to imagine improbable scenarios until they happen. Incomplete fault trees are legion, and often go undetected even on close inspection: if you present a fault tree with sections missing, even skilled operators rarely notice the omissions. In theory, safety systems should catch these errors. James Reason describes each set of safety systems as slices of Emmental cheese intended to stop errors having fatal consequences, until by chance all of the holes in the cheese line up. Bhopal had three such systems, none of which operated properly. His analysis of the Chernobyl explosion is fascinating, all the more so because the explosion was caused by a badly planned test of a safety system.
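The cheese metaphor has a simple arithmetic reading: if the layers fail independently, the probability that all the holes line up is the product of the individual failure probabilities, which is why even mediocre layers are powerful in combination. A rough illustration with invented numbers (the three layers echo the Bhopal example, but the probabilities are mine):

```python
from math import prod

# Hypothetical per-layer failure probabilities; the values are
# illustrative, not estimates for any real plant.
layer_failure = [0.05, 0.10, 0.02]

p_all_fail = prod(layer_failure)  # assumes the layers fail independently
print(f"all holes line up: {p_all_fail:.5f}")  # 0.00010

# The model breaks down when failures are correlated -- e.g. all three
# of Bhopal's systems out of service at once -- and the joint
# probability can then be far higher than the independent product.
```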
In the case of the Oscar ceremony, having the stars come onstage from two wings rather than one is an obvious procedural hazard, since it requires a duplicate set of envelopes. Cutting that out would be a prudent step: it halves the number of envelopes in circulation. The Rocket Sled engineers would have spotted anything comparable in the 1950s, and Motorola would have had the stats in their Six Sigma project in 1986.
You might be tempted to say that Chernobyl, Bhopal, plane crashes and industrial accidents are more important than a movie awards ceremony. I could not possibly comment, since I am not among the millions who regard the latter as interesting, but all of these cases illustrate an important limitation of human operators: all of us sometimes misperceive reality, and lose our place in a sequence.