Previously, the thinking was that most airline pilots weren't using the AOA information anyway - Boeing had included the feature because some airlines had ex-military staff who were used to having it, but most didn't. So the AOA reading being wrong wouldn't necessarily be that urgent a problem if pilots were the only ones relying on it.
Unfortunately, it wasn't only used by a small number of pilots, but also by MCAS. It seems that MCAS kicks in during scenarios that weren't originally intended - i.e. take-offs as well as stall scenarios - and Boeing got two other things wrong:
* MCAS was revised to be more aggressive toward the end of testing;
* MCAS had a bug / design flaw that allowed the system to command an increasingly steep nose-down trim each time a pilot overrode it (the sketch just below illustrates how that effect compounds).
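To make the second point concrete, here is a minimal, hypothetical sketch of that failure mode - not Boeing's actual MCAS logic. It assumes a loop in which the system re-arms after each pilot correction and applies a fresh nose-down increment with no cumulative limit; the constants `MCAS_INCREMENT`, `PILOT_CORRECTION`, and the function `run_cycles` are purely illustrative.

```python
# Illustrative sketch only: a repeating trim command with no cumulative limit
# walks the stabiliser further nose-down on every activate/override cycle.

MCAS_INCREMENT = 2.5      # illustrative degrees of nose-down trim per activation
PILOT_CORRECTION = 1.0    # illustrative degrees the pilot trims back each time

def run_cycles(cycles: int, aoa_reads_high: bool = True) -> float:
    """Return the net stabiliser offset after `cycles` activate/override rounds."""
    stabiliser = 0.0
    for _ in range(cycles):
        if aoa_reads_high:               # a single faulty AOA sensor keeps tripping the condition
            stabiliser -= MCAS_INCREMENT # system commands nose-down trim
        stabiliser += PILOT_CORRECTION   # pilot trims back, but the system re-arms and repeats
    return stabiliser

if __name__ == "__main__":
    for n in (1, 3, 5, 10):
        print(f"after {n:>2} cycles: net stabiliser {run_cycles(n):+.1f} deg")
    # The net offset grows steadily more negative: the per-cycle deficit never
    # resets, which is the "increasingly steep nose-down" effect described above.
```

The point of the sketch is simply that each individual command can look small and reasonable while the accumulated result is not - which mirrors the broader "dots not joined" pattern discussed below.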
There were lots of little signs all over the place that there was a problem. Unfortunately, they didn't seem to join the dots, and the seemingly-reasonable decisions they made, taken together, added up to a dangerous scenario.
> "they didn't seem to join the dots" is what it will be made to look like.
IMO, there would have been engineers, and even some of the management chain, who knew about these risks but, for whatever reason, were pressured to make the problem go away. It sounds like a conspiracist thing to say, but I've seen this in software engineering for years: there are heaps of perfectly legitimate pressures that result in reduced quality and corner-cutting... it's just that in my work, lives are not at risk.
I would be sceptical that anyone wouldn't want to join the dots if it meant their most popular product stayed popular and they didn't end up with blood on their hands after many, many people died because those dots remained unjoined.
At worst it might be plausible that, in a very specific scenario, some people knew about enough of those dots to realise it would be a PITA to go back and make everything right, but they almost certainly didn't know about enough of them to actually understand the scale of their fuck-up.
In which case I would say it's still a matter of someone not joining the dots; it just makes them moderately more culpable, and steeply more incompetent, the more of those dots they knew about but still didn't flag as problems.
I'd be willing to bet there are engineers who are experiencing a lot of guilt about this right now. Not many... but maybe a handful. These are people who might not have been able to do anything about the problem even had they realised the true scale of the risk. But I'd guarantee they exist, even if only in a number one can count on one hand.
It's a failing of the company culture that these engineers were not made safe to speak up; there was no channel for those who knew to get the message to those who would have done something about it had they known. That is likely a function of an unhealthy (toxic?) company culture which precluded this kind of psychological safety.
I bet these folks feel pretty bad... maybe even responsible. They won't speak up though, lest they be labelled or even targeted with the full blame. They will suffer in silence, in some ways like the many soldiers ordered to do things they didn't think were right, but did anyway.
Possibly. But the point is, there's practically no way they can scapegoat at lower levels over this: management's role is exactly to connect those dots (or, at least, to put in place the safety culture which ensures the dots are connected, and manage it appropriately).
A systemic failing like this, combined with the more relaxed attitude the FAA took with them, is probably about as clear a management failing as you could get. I'd put it as even worse than the VW scandal: at least in that instance there were specific cases of malpractice / collusion.