"In his letter, Jacobsen recommends that Boeing upgrade the MAX’s autothrottle logic to either disconnect or give the pilots a warning when the computer registers invalid data.
In the upgrade to the MAX that allowed it to return to service, the FAA did not require any such change but did add an explicit instruction that pilots in this kind of emergency should “disengage the autothrottle.”"
In the many articles about the MAX that have shown up on HN, the underlying concern seems to be that this is still not a safe plane. This excerpt is scary.
Of course it's not a safe plane, common sense says so. Would you trust a bunch of bureaucrats and corporate sellouts with your life, when engineers have been ringing alarm bells for years?
Not only will I never fly this plane again, but I'll do my best to avoid Boeing altogether. Given that I'm European, it should be fairly easy.
Seems irrational given the probabilities. Do you stop driving non-Tesla cars? That should give you a bigger bump in expected mortality (obviously depending on your relative mileage).
The right figure to look at is the number of fatalities per million miles traveled. Those 400 covered an awful lot of distance over that year (but also, not all of them were yet flying at the beginning of the year).
> The right figure to look at is the number of fatalities per million miles traveled.
The right figure to look at is how many crashes there have been with other commercial planes of the same class due to design defects covered up the way Boeing's were, which is zero.
The comment I was addressing was "Do you stop driving non-Tesla cars? That should give you a bigger bump in expected mortality (obviously depending on your relative mileage)".
You seem to mistakenly believe that I'm defending Boeing here. I'm not. The following two things can be simultaneously true: the 737 MAX is less safe than other aircraft, and the 737 MAX is still safer than driving.
Look at how many car crashes leading to fatalities there were over those two years and it's a lot more than the mere 2 with the 737 MAX. The point still stands; the only reasonable way to make comparisons here is not by absolute number, but per passenger distance traveled. Any other comparison isn't correctly normalizing the things being compared.
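The normalization argument above is just arithmetic, and can be sketched like this (all figures below are invented for illustration, not real accident statistics):

```python
# Illustrative sketch of per-distance normalization. Every number here
# is made up for the example; the point is only that raw fatality
# counts mislead without dividing by exposure.

def fatalities_per_billion_miles(fatalities: float, passenger_miles: float) -> float:
    """Normalize a raw fatality count by distance traveled."""
    return fatalities / passenger_miles * 1e9

# Hypothetical: a fleet with few deaths over an enormous distance vs.
# cars with far more deaths but also far more total miles.
plane_rate = fatalities_per_billion_miles(400, 2e12)
car_rate = fatalities_per_billion_miles(37_000, 3.2e12)

# The absolute counts (400 vs 37,000) and the normalized rates can
# rank the two modes very differently.
assert plane_rate < car_rate
```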
> The comment I was addressing was "Do you stop driving non-Tesla cars?" ... I'm not.
That's your personal decision, and one you feel you need to be vocal about on public forums.
Others feel the exact opposite: flying Boeing is a risk not worth taking given the company's track record.
Think about it for a second: your decision comes down to flying with either company A or company B, which have nearly identical offers, and company B flies planes with known safety problems. Not a hard decision.
This is about Airbus bribing to get more deals, which is abhorrent but common practice in multinational corporations even these days, be it soda, weapons or banking. Boeing is about utterly incompetent management killing 300+ civilians more or less knowingly, and acting like little children. Still refusing to take blame, still refusing to actually fix the source of the issue. As a passenger, I and my family are at direct risk from this incompetence.
Yes, in what is almost a duopoly in civilian flying, I will take Airbus anytime. And never, ever, fucking ever the 737 MAX.
What I was trying to say was: yes, you probably shouldn't fly the 737 MAX in particular (and perhaps the 787), no doubt about that. But to write off the entire company's fleet of airliners when they still obviously have planes with excellent safety records seems kind of extreme and unwarranted. Like, presumably you think they somehow compromised on safety on their other models too, despite their safety records, to bring them into the market. Shouldn't that give you pause on Airbus too, given they've actually gone so far as to bribe people to get their planes into the market?
I've been more critical of Boeing than some folks here in the past, but if your goal is safety, you're just limiting your own options by rejecting planes with provably good safety records. (Or if you somehow think that's warranted, then there seem to be plenty of reasons to question Airbus's integrity too.) OTOH, if your goal is just some kind of economic retaliation (a "vote with your wallet" thing), then never mind; I misunderstood your goal as being safety-related. Though you're not the person who wrote the initial comment, so I guess your reason could be different.
This is a loss of confidence situation, not one based on deep fundamental analysis.
As much as all other Boeing airframes may be safe, I'm not confident at all in the way Boeing's management runs the company. What other cost-cutting measures have been implemented? How can they affect current airframes' maintenance and reliability? It probably doesn't affect them much, but it's been blunder after blunder from Boeing, from the 777 built at Charleston to the 737 MAX fiasco and the 787.
As an airline customer I'm completely dependent on trust that Boeing was doing things correctly, from a safety standpoint at least. I don't have that trust anymore and it'll take decades to earn it back.
You're also forgetting the fact that Boeing still maintains the current aircraft. What if they cut corners in some updated manufacturing design or material? What if a new update to an FMS introduces a faulty design into how something or other works? Yes, you'd be totally right if planes rolled out of the factory and were never touched by Boeing again, but that's not necessarily how it works.
I mean, if they were going to do that, why would they start doing that now? By most accounts I've read, their culture supposedly changed with the McDonnell-Douglas acquisition in 1997. They could've and would've already started doing so long ago before all this mess burnt their reputation, and we would've seen its effects on other planes' track records, yet I don't think we've seen anything to indicate this. And it seems to me like they have far less of an incentive to start engaging in these shenanigans now.
> A conversational tactic in which a person responds to an argument or attack by changing the subject to focus on someone else’s misconduct, implying that all criticism is invalid because no one is completely blameless
> Excusing your mistakes with whataboutism is not the same as defending your record.
Actually, pointing out similar actions in peers of the accused is a valid method to demonstrate that such actions are not misconduct. Or, if we still insist on calling these actions misconduct, then a double standard has been demonstrated.
That said, GGP post did not point out similar actions. Airbus did not risk passengers' lives. That is the issue here: public safety.
If you read the entirety of what I wrote in all my comments here it should be fairly obvious how. You seem very keen to start a fight though, and I'm not sure I'm up for a match here, so I'll leave it at this.
As the other commenter pointed out, whataboutism would imply defense of Boeing.
I think we disagree on what whataboutism means. I think we're just gonna have to agree to disagree, because we're having a meta argument at this point.
The MAX had MCAS installed so that pilots could operate it under the type certificate of a 737; otherwise, the plane would behave differently and would require retraining. What he's probably saying is that there is nothing wrong with the plane's design without MCAS. Financially, the 737 MAX wouldn't have been a success that way.
>> What he's probably saying is there is nothing wrong with the plane's design without the MCAS.
That is, IMHO, a common misconception. Since the new version with the "fixed" MCAS still requires additional training, it does not follow that MCAS exists simply to avoid retraining and recertification. If that were the case, Boeing should have simply dropped MCAS and done the certification and training. Instead they opted for near financial disaster and disruption to their supply chain AND still failed to avoid additional training.
I suspect MCAS is covering for a very bad corner case in the flight dynamics.
FAA rules require on all planes that the control stick takes more and more pressure to pull back as the plane approaches stall.
The 737 MAX doesn't meet this requirement; apparently it gets easier to pull the stick back as it approaches stall, due to the engine nacelles. This is what MCAS fixes: as the plane approaches stall, it adjusts trim to provide that extra back-pressure on the stick.
No amount of additional pilot training will allow the plane to fly without MCAS, because this is a hard rule and it's not certifiable without it.
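The control idea described above can be sketched roughly like this. To be clear, this is an illustration of the concept only, not Boeing's actual control law: the angles, gains, and authority limit are all invented for the example.

```python
# Hedged sketch: as measured angle of attack (AOA) rises toward stall,
# command nose-down stabilizer trim so the pilot feels increasing
# column force. All constants below are hypothetical.

STALL_AOA_DEG = 14.0       # invented stall AOA for the example
ACTIVATION_AOA_DEG = 10.0  # invented onset of augmentation
MAX_TRIM_DEG = 2.5         # authority limit on the nose-down command

def trim_command(aoa_deg: float) -> float:
    """Nose-down trim (degrees) as a function of measured AOA."""
    if aoa_deg <= ACTIVATION_AOA_DEG:
        return 0.0
    # Ramp the command up between activation and stall AOA,
    # then clamp at the authority limit.
    fraction = (aoa_deg - ACTIVATION_AOA_DEG) / (STALL_AOA_DEG - ACTIVATION_AOA_DEG)
    return min(MAX_TRIM_DEG, fraction * MAX_TRIM_DEG)
```

Note where the danger lives in a scheme like this: `aoa_deg` comes straight from a sensor, so a bogus reading drives the command all the way to the clamp, and the clamp itself is the only thing bounding how far the system pushes the nose down.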
The idea this deficiency can be masked by software is perverse.
These parts of the regulations need to be true without electrical power. They're supposed to be a function of aerodynamics.
This is only superseded in absurdity by the fact there are two switches to disable the stabilizer trim motors, thereby making the airplane no longer airworthy while in flight.
Airbus fly-by-wire aircraft have an alternate law mode where yoke input has matching control surface deflection, just like cable or hydraulic planes. I did not say no software anywhere, ever. I said using it to mask an aerodynamic deficiency is wrong.
These are transport category aircraft. Not fighter jets. Positive static and dynamic stability is required. All of this is reviewable. Is the software?
It is important to note that MCAS was implemented on the MAX because the larger engines, with a larger possible thrust vector farther from the plane's center of mass, make it much easier to flip the plane over.
The MAX is a bad aeronautical design that Boeing made more "controllable" through additional control system engineering. Problem is, the additional control system engineering was pretty bad.
> The MAX is a bad aeronautical design that Boeing made more "controllable" through additional control system engineering. Problem is, the additional control system engineering was pretty bad.
Yeah, it's really frustrating that the solution was already a band-aid, and they didn't even bother to do that properly so they covered up what was questionably fraud (the lazy way of not getting a new certification) with what was unquestionably fraud (lying about the MCAS system).
What's really sad though is it exposes just how defanged the FAA has become over the years, so now who the hell do we rely on to tell us what's safe for aviation?
When it came out, the 757 was a significantly larger and more powerful plane than the 737. It was not a direct replacement, it was a new model in a class above what the 737 at that time was. Then, after the Boeing 757 was already in the skies, Boeing kept making the 737 bigger and started impinging on 757 territory, which had previously been separate.
Not as I understand it. It was installed because of the Airbus accident in the South Atlantic. An extreme case, yes.
But with the engines moved back, the plane has a natural tendency to stall, which is very common.
It's common practice to change procedures instead of certification.
I flew an airplane that was getting ready to be retired after ~30 years in service, and we had a lot of emergency procedures and memory items (items that pilots have to memorize) that were added over the years after various incidents.
The 737 MAX wasn't necessarily an unsafe airplane; its pilots were not made aware of how everything worked, which led to the two crashes.
It was both an unsafe airplane and pilots were not made aware of how everything worked.
A human pilot properly informed in advance could paper over an act of sabotage by a flight control system, but that does not make the airplane safe.
A single sensor reporting bogus data was used by a system to automatically take dangerous action. Both are dangerous. But in particular the latter, because it permits an excessive reaction to what is supposed to be merely stall avoidance, not recovery. Even stall recovery does not require anywhere near the amount of nose-down MCAS was permitted to induce. That behavior is sabotage. If a human pilot did the same thing, with the same available bogus information, it would be incompetence. And if it led to death, it would be manslaughter.
Is it common for commercial airliners to nose-up when approaching stall conditions, thus making them easier to stall? (Or something else, that sounds similarly dire to a layman?)
I understand that aircraft are complex systems, and they need to be used by experts, and that those experts are aware of, and have to work to mitigate a lot of catastrophic failure scenarios. Is the unstable flight caused by the forward-mounted engines a particular example of a 'typical' quirk of a commercial airliner?
That cannot be. It was widely reported that Southwest Airlines insisted on the second sensor being installed across its complete MAX fleet. And so they did. They bypassed the FAA rubber-stamp approval.
The FAA not knowing about this is not plausible. Everybody knew that. It was a major criticism of the FAA's ability to control air safety.
Are you speaking of the AOA disagree "light" here?
I use scare quotes above because it's not even a light, but instead an icon on the main console display.
Two AOA sensors exist on all of the original 737MAX aircraft. The original MCAS would use one sensor at a time for its calculation in an alternating fashion, which would swap between flights.
The AOA disagree was only an indication for the pilots and wasn't an actual upgrade to the function of the original MCAS for those who paid for the icon.
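As I understand the scheme described above, it can be sketched like this (simplified and hypothetical in its details; the threshold is invented): MCAS read only one of the two AOA sensors per flight, alternating between flights, while the disagree comparison, where fitted, only drove a display icon and never changed what MCAS consumed.

```python
# Sketch of single-source sensor selection with an informational-only
# disagree annunciation. Details are illustrative, not the real system.

DISAGREE_THRESHOLD_DEG = 5.5  # invented threshold for the example

def mcas_input(left_aoa: float, right_aoa: float, flight_number: int):
    """Return (aoa_used_by_mcas, disagree_icon_shown)."""
    # Alternate the single source sensor between flights.
    aoa = left_aoa if flight_number % 2 == 0 else right_aoa
    # The disagree check is informational only: it lights an icon but
    # does not inhibit MCAS or change which value it consumes.
    disagree = abs(left_aoa - right_aoa) > DISAGREE_THRESHOLD_DEG
    return aoa, disagree

# A failed left vane reading 70 degrees on an even-numbered flight
# still feeds MCAS, even with the disagree icon lit.
aoa, icon = mcas_input(70.0, 5.0, flight_number=2)
assert aoa == 70.0 and icon is True
```

The contrast with a redundant design is that the disagreement would gate the function (inhibit it, or fall back to a safe value) rather than merely annunciate.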
Juan Brown of the YouTube channel blancolirio has given really excellent coverage if you look back through his explanations.
Ah thanks, that was it, yes. They had two sensors and just switched between them. The Southwest upgrade was only an indicator for sensor failure, not a real input into the MCAS controller logic, as it should have been.
If he really is the chief engineer and that has any meaning, for example if he signed off on the design, should he not maybe be jailed for his failure to know what he signed?
I don't find it surprising. I don't always know how exactly every small feature is implemented in the software projects I've been the lead in, and an entire plane is probably much more complex than any of those.
Aircraft software is different. Pilots are supposed to know this kind of thing when they get a type rating. The chief engineer absolutely should, too. Knowing which sensors the autopilot uses is not even arcane; it's basic knowledge.
The article shows that safety engineers at the FAA would have had the requisite knowledge to see the problems with the MCAS system, but as OP said above, there was no "issue paper" from Boeing to regulators about the system.
It seems a bit presumptuous to call how the FAA dealt with Boeing "oversight". "Total subservience" might be closer to the mark.
There is, IMO, an excellent series of articles documenting exactly what the problems with MCAS were and how it went wrong. It's written by a pilot called Bjorn on leehamnews.com, if anyone is interested.
The Challenger explosion is taught in engineering school as an example of how not to communicate safety concerns. If you go through the presentation, it's terrible: they basically presented raw data in chronological order using an irrelevant infographic when they should have been showing a trend between o-ring failure and temperature. While in hindsight we know that management should have been more cautious, I could not possibly blame them for being unconvinced by such a poorly communicated argument.
Having been involved with a large U.S. Government aerospace project that ultimately failed (there are a lot of USG program failures, and a lot of them in aerospace) Mr. Jacobsen's statement that "FAA leadership seems to be denying any wrongdoing" sadly seems to ring quite true. So does the FAA PM asking him why he was in meetings related to MAX even though it may not have been in his formal "swim lane".
There is generally a very sad reticence to acknowledge mistakes within organizations, but failure to earnestly acknowledge, show contrition, and learn from mistakes is dereliction of duty in the public sector in my view. In my own experience, I was a lone voice asking "What are we going to do based on learning from this that lessens the likelihood of future such failures?" It was and is an unpopular question. I don't foresee any organizational changes taking place, even though they should. And I don't just mean staff changes, I also mean clear commitments to principle, intent, and wiser behavior.
I think instead of talking about the past 2 accidents so much, at this point the question is if the current system is safe enough. I wish he went more into the details of the software certification.
The key point I took away from the article is that some complex interactions with and misbehaviors of the autothrottle system remain unaddressed in the upgrades (or at least Boeing was not ordered to make changes there, and Jacobsen chose to make this public, so presumably they remain unaddressed), so I think it answers your question.
For some reason the pilots are still confident enough to fly the new planes, though they don't have many options now that many of them have been laid off because of the coronavirus.
It seems that Boeing now hopes that those complex interactions are rare enough that they don't pose a serious problem. We'll see in a few years.
One takeaway I got from reading many articles like this over the past two years: airplanes are a lot like JavaScript, quirky and full of gotchas, and what pilots rely on is being well-informed and having access to good documentation of the gotchas. Everything seems to rely on a complex system of disseminating errata through bulletins, checklist and manual updates, and training. The best airplane might not be the most bug-free but the best documented and most well-understood. The worst airplane may be the one with the most undefined or unknown (by pilots) behavior.
Part of what made this such a fiasco is that the behavior and interactions of these systems were purposely not documented and kept not just from the FAA, but pilots in particular. This meant they had no chance to do their job well and operate the equipment correctly. Add to this manufacturers reaching for "pilot error" quickly and as a pilot I'd imagine being rather pissed off.
This seems interesting with the 737 in particular: It's been around for a long time and was probably considered well-understood. Boeing introduced the MAX telling everyone "it's still the same", and then threw pilots one gigantic (as well as defectively implemented) curveball. Instead, deviations from and additions to a well-known system should have been pointed out and documented especially well.
It also seems to suggest there must be a system complexity ceiling where pilot recall of checklists and subsystem interaction permutations fails to scale anymore, and the addition of new systems probably needs to be evaluated against that notion. Assistance systems like MCAS are supposed to make things easier - but their edge cases and interactions potentially carry a "recall tax".
Another pattern I see in articles like this (another example is the A380 engine failure in, I think, a Qantas plane years ago) is that planes rely on informing pilots of subsystem state through what's basically a stack of notification message dialogs, with the goal of getting pilots to explicitly confirm/react to each one. But that means by the time the pilot has time to work through the stack during a crisis, the content may already be outdated, or there's no time to tend to the stack to begin with. There may be better UX designs possible, such as master system diagrams with status? Dunno.
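One way to make the UX distinction concrete (purely a hypothetical design sketch from a layman, not any real avionics system): a confirm-each-message stack can present stale state, whereas a coalescing "current status" view always shows the latest message per subsystem.

```python
# Hypothetical sketch contrasting a confirm-stack with a coalescing
# status board that keeps only the newest message per subsystem.
from collections import OrderedDict

class StatusBoard:
    """Keep only the most recent message for each subsystem."""
    def __init__(self):
        self.latest = OrderedDict()

    def report(self, subsystem: str, message: str) -> None:
        self.latest[subsystem] = message      # overwrite any stale entry
        self.latest.move_to_end(subsystem)    # most recent shown last

board = StatusBoard()
board.report("ENG 2", "OIL PRESS LOW")
board.report("ENG 2", "FIRE")  # supersedes the now-stale oil message

# A FIFO stack would still demand acknowledgment of "OIL PRESS LOW"
# before the pilot ever saw "FIRE"; the board shows only the latter.
assert board.latest["ENG 2"] == "FIRE"
assert len(board.latest) == 1
```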
Note: I'm a layman, and in topics like this I feel uneasy whenever I relay or suggest what may be based on wrong assumptions.
Also on the subject of documentation, I understand that the manufacturers regularly update their manuals, and the owners are legally required to have these manuals. They are therefore compelled to shell out exorbitant sums of money for something that should be free. Can't remember where I picked up this tidbit.
The thing that should really be discussed is that given the full regulatory capture of the FAA over the last 20 years, how many other such hidden timebombs exist in the complex software systems in aircraft.
As a software tester, it's actually fairly amazing to me that planes these days don't fall from the skies every day.
It's because we test planes and use old designs for a long time; it's hard for a 30-year-old plane to fall out of the sky from something that hasn't been found in 30 years.
That remains to be seen. However, it still doesn't inspire a lot of trust. Wondering what the fallout would be if another MAX crashed because of another ugly hack turned deadly.
I feel like if the average person knew just how right you are, not just about Boeing but about almost every major company, there would be unending riots across the U.S. Management pretty much across the board only care and understand one thing: money. You want safer planes? Take every Boeing manager's bonus and stock away for a few years. That'll get their attention.
I would imagine that if another Boeing 737 MAX crashes, especially in the United States, it would be the end of Boeing as an airliner manufacturer. I could see all the 737 MAXes ending up in an aircraft boneyard, and airlines looking at alternatives for other classes of aircraft too.
I think Boeing is just in the too-big-to-let-fail group-- especially if we enter a post-pandemic world on a rebound. The US wouldn't let such a key company fall -- especially as China pushes their competing airplanes.
A US-crash, I think, would push greater scrutiny and overhaul of the FAA. Boeing could shift some of the blame to them. And they could actually benefit in the long run (as terrible as that is.)
That's not quite what it says. Rather, his prospects for gigs involving FAA interaction from the other side:
Now sharing his concerns with the press for the first time, he’s risking his post-FAA employment prospects.
....
After retiring from the FAA, Jacobsen hopes to work part-time.
Someone with his credentials would typically find lots of lucrative freelance work at smaller aerospace companies who need help navigating the maze of FAA regulatory compliance to certify their products.
Now, he may struggle to get such gigs if he’s perceived as an antagonist of the agency.
“I recognize this could cost me future employment opportunities,” Jacobsen said. “But I feel like my allegiance right now is to these families.”
You're right, I misinterpreted it. I thought the "post-FAA employment prospects" was the package, but of course it means actual post employment opportunities. I'm just skimming through it too quickly.
As of right now my post has 7 upvotes and no downvotes which means 7 people agreed with me. That means either a lot of people didn't read the article or they misread the article like I did.
I wonder how many downvotes my post will get after they read your comment which essentially categorically proves my initial reply is completely and utterly wrong.
Which goes to show how many people just skip over reading the article and go straight to the comment section for a summary/full analysis.
I upvoted you, trusting your summary in lieu of having read the article. I've now changed that to a downvote. I read the articles to maybe 1/3 of the stories whose comments I browse.
The great thing about HN is that misinformation is quickly rectified. Usually.
It’s because people get a dopamine hit from reacting emotionally to stories.
On the other hand, reading before reacting requires resisting going after the dopamine hit and actually works the other way. You have to get sadder consuming time and attention to read material that may or may not be informative.
A large percentage of Hacker News is the reaction (probably a majority). And if you point out that they haven’t read the article, some people treat it as a personal attack because it shows them in bad light. Then they start arguing.
Not always. I now have 10 upvotes. That likely means more than 10 people voted me up and didn't see that the following reply renders my statement completely incorrect.
Do you think our prisons have enough capacity to take more people in? I think the proper punishment for the leadership would be something like taking their retirement packages away and barring them from the industry.
US prisons house about 0.7% of the population, or about 1.4M people (depending on the source).
The US has about 200,000 CEOs. Not all CxOs are crooks, but if they were, we could jail all the CEOs and an average of 6 of each one's direct reports using our existing prison infrastructure.
I’d wager the average criminal CxO does significantly more ongoing damage to society than the average person in US prisons would if released.
I’m not endorsing the idea of prison time, but the short answer is: yes, of course there’s room, assuming we’re talking about C-suite level executives and maybe 1 or 2 hops below. That’s just a handful of potential prisoners and would be quite manageable.
Our society does not try to solve high-level problems with prison, only low-level ones.
Steal $500 from your boss, you may get prison time. Your boss steals $500 from each of his employees through wage theft, and the worst he would face is being out some money from losing a civil labour lawsuit.
That's because most high-level behaviour has an incredibly high bar that it needs to meet to be criminal, with a lot of subjective argument over intent.
The bar for low level crime is much easier to meet. Take something that doesn't belong to you? Hit your neighbour with a club? Ingest a taboo mind-altering chemical? All of those are crystal-meth-clear violations of the social contract.
Hundreds of people died. I agree that prisons are used way too much (especially relating to drugs, three strikes crap, fare dodging, crimes relating to poverty aka petty theft) - but gross negligence that leads to multiple deaths should warrant some prison time, if only to serve as an incentive for management to not cut corners.
Hard to disagree with him when he says to just take out MCAS. It was never a critical component that was required for flight. That's what Boeing should have done.
Am I wrong in remembering that the reason the MCAS is on the MAX in the first place is because they moved the engines to a position which creates dangerous situations, which was compensated by software (MCAS), so that Boeing could avoid having to re-certify the airframe?
Ah yes, the bad decision feedback loop. Instead of fixing the root cause you just add another layer of bad decisions. I suppose the most recent layer is the "good enough, let's just sweep any further complaints under the rug" logic
I wonder what the pilots of the Max have to say about the airplane? There could be greater potential for human error if pilots are distracted by the idea that something could fail at any moment.
This isn’t different than when flying any other aircraft. Something can already fail at any moment from the engines, avionics, control systems, hydraulics, and electrical systems. That’s why pilots have emergency training and emergency checklists. The question is whether Boeing is providing the appropriate training to deal with new potential failures introduced by the new design.
"The ET302 pilots, however, jumped immediately to the step in the checklist that Boeing emphasized in its bulletin after the Lion Air crash: hitting the cutoff switches to stop MCAS from pushing the jet’s nose down. In their rush to do that, they didn’t first bring the nose back up with the electrical switches and didn’t disengage the autothrottle."
What the bulletin (actually an EMERGENCY AIRWORTHINESS DIRECTIVE) says is:
"Initially, higher control forces may be needed to overcome any stabilizer nose down trim already applied. Electric stabilizer trim can be used to neutralize control column pitch forces before moving the STAB TRIM CUTOUT switches to CUTOUT. Manual stabilizer trim can be used before and after the STAB TRIM CUTOUT switches are moved to CUTOUT."
Note that this procedure was followed in the first incident of MCAS failure, and the Lion Air airplane recovered and landed safely. (That same airplane crashed on the very next flight with a different crew on it.)
The part about “the pilots simply did not follow the procedure” does not appear 100% factual based on your quote. Consider:
> Electric stabilizer trim can be used to neutralize control column pitch
Maybe EADs are written in a different language than computer standards, but, when an RFC says that one “can” do something, it generally does not mean that one actually MUST do it at risk of failure to interoperate or death.
Don't pull quotes out of context. This was an EMERGENCY AIRWORTHINESS DIRECTIVE, meaning "how to not die". It was not a memo about remembering to wipe the sink after using the lav.
I expect pilots, when handed an EMERGENCY AIRWORTHINESS DIRECTIVE, to:
1. read it
2. understand it
3. remember it
Flying is no joke. John Denver died because he forgot to put fuel in his airplane. JFK jr died because he flew into haze without IFR training. AF447 killed everyone aboard because the pilot forgot that you push the stick forward to recover from a stall rather than pull it back.
And Boeing should never have created a flight critical system that was vulnerable to a single point of failure.
But horrific. "To prevent the computer from flying your plane into the ground perform this complex and counterintuitive set of steps that may involve muscle forces you don't have". Nice. How about "make computer not fly plane into ground?"
1. trim back to normal with the electric trim switches
2. turn off the stab trim system.
It simply is not complicated.
> muscle forces you don't have
Turning two electrical switches?
> counterintuitive
That's debatable, but consider that much of flying is counterintuitive - for example, stall recovery. Humans have not evolved to fly, and do not have the right intuition about it. That's why pilots get extensive training.
Keep in mind that all three crews experiencing MCAS failure used the electric trim switches to trim it back to normal. One of them then turned it off, the surviving crew.
On the plus side, after maybe a hundred articles about the 737 MAX, the paper has finally acknowledged that the pilots did not follow the emergency procedure. Progress.
> How about "make computer not fly plane into ground?"
Where's the "autothrottle" in the EAD you quote? From the article we're all commenting on:
"Jacobsen points out that the FAA’s emergency directive after the Lion Air crash lists the procedure pilots should follow — but omits the instruction on the autothrottle and fails to mention that it could malfunction."
The way I understand it, that's the new information in the article, to which you respond by quoting exactly the EAD that obviously lacks any mention of it, confirming the claims from the article.
I understood that the autothrottle was what made the plane reach the speeds at which it was impossible for human strength to prevent the plane from crashing?
Anyone with aviation experience want to weigh in?