>> the horrific culmination of a series of faulty technical assumptions by Boeing's engineers, a lack of transparency on the part of Boeing's management, and grossly insufficient oversight by the FAA.
That is being very kind to Boeing management. It's blaming the engineers and the FAA, but the management was just "not transparent"? Boeing fought hard for their autonomy and ability to self-certify, so I guess the FAA failed by allowing that. But that doesn't exempt the management at Boeing, who were supposed to create an organisation with processes in place to design and build safe airplanes. They had such an organisation but failed to maintain it as such. That's more than a lack of transparency.
> the horrific culmination of a series of faulty technical assumptions by Boeing's engineers, a lack of transparency on the part of Boeing's management
My experience in this industry is that many of the most important engineering decisions are made by management. Not to excuse their engineers, but on the topic of management getting off lightly, I wonder how many of those "faulty technical assumptions" should fall at the feet of Boeing's management.
The only engineers with real power are the ones that own the company. And even then, they can still be out-maneuvered by politically astute executives.
Engineers are almost never to blame in these situations, even if they are the direct cause of an issue. There are usually so many of them that someone catches problems. But they are powerless to do much more than send a strongly worded email to their management.
Until C-suites and board members are held personally liable for tragedies like this, I doubt much will ever change. There's no real incentive to fix problems when fixing them costs too much money, because the worst that will happen is, a few years down the road, a Congressional inquiry and a modest fine.
Of course, any engineer whose malfeasance costs the company even 0.001% of that fine will find themselves in jail at the behest of company leadership, even if no lives were lost or any measurable damage was done to the company.
>Engineers are almost never to blame in these situations
I strongly disagree with this. Engineering is a profession of public trust, even if operating under an industrial exemption to circumvent licensure. We need to hold ourselves accountable to those standards, or else engineering risks becoming a vocation devoid of responsibility and lacking accountability, similar to the C-suite positions that people rail against in this thread. I don't think society would cut a doctor or lawyer slack for "just doing as they're told," and I think engineers are in the same area of public trust.
Good organizations should have dissenting opinion processes. The advantage is that it gives a way to hold management accountable (e.g., they must formally acknowledge that risk). Sure, using that process may come with professional risk, but that should come with the territory of a position of public trust.
No engineer had the power to veto this move by the company, just like no engineer had veto power over the Challenger launch or any of the other numerous tragedies we all discuss in our ethics classes. Attempts were made to prevent the tragedies, but nobody in charge listened. The fault lies with the person who has final authority over the decision.
If I make a mistake which is discovered, then as long as I attempt to correct the problem, I'm absolved of fault. A miscalculation is an accident. Knowingly deploying a faulty device despite a known miscalculation is malfeasance. I may have made the calculation, but I never made the decision to let it kill people.
I think where we disagree is the level of responsibility for "final authority". If an engineer knowingly designs a failure, they should be held responsible as should the management. I don't think those two responsibilities of engineers and management are mutually exclusive.
I don't know about how NASA handled dissenting opinions in the Challenger days but my understanding is they have grown into more of a risk-informed decision making culture. I do think they have engineers who can stop work and force management to formally accept that risk nowadays. I don't know if Boeing has similar processes.
This isn't a miscalculation scenario. This is knowingly not following their own processes. At least according to the hazard analysis reported by the Seattle Times, their own processes said there should have been a redundant sensor by default. Not as optional equipment, but as the default configuration. The engineer may not have the final say, but they do have a responsibility to drive that risk decision. If they know they are doing something dangerous or unethical, in my (perhaps unpopular) opinion they have a responsibility to voice that or leave. I say that as somebody who has been pressured in these types of decisions in an aerospace domain.
Challenger was somewhat different. Challenger was from a lack of temperature data, meaning they didn't have a good probabilistic rationale for either a GO or NOGO decision. In hindsight, the safe bet was to wait for conditions that fell within the known data. As I understand the Boeing case, they identified the hazard but didn't follow their own procedures to mitigate it. That isn't really a case of a "known unknown" but rather a "known known" that they didn't mitigate. I do think the engineers are culpable to a certain extent there.
>I may have made the calculation, but I never made the decision to let it kill people.
I think this may go along with our differing opinions on the standards of the profession. In my view, there is a certain level of expected competence one should be held accountable for. That’s why doctors can be sued for negligence even if it was an “honest mistake”
> If they know they are doing something dangerous or unethical, in my (perhaps unpopular) opinion they have a responsibility to voice that or leave.
How do you know they didn't?
They will keep leaving, and management will keep firing people, until they get people who will go along with their decision.
Management has final say.
> Challenger was from a lack of temperature data
No, they had data all right. They had data from previous flights showing O-ring degradation. There were two O-rings, and on at least one flight the first O-ring was completely breached and erosion had started on the second. NASA's own procedures would not allow launch if they were routinely exceeding their safety margins, so those procedures got changed.
On the day of the fateful launch, engineers responsible for the boosters raised concerns due to the low temperatures. Those were escalated to a very high level in NASA. They were ultimately overruled.
Maybe they did, but the point I was countering was that they don't have responsibility, not that they have to stay. I.e., I disagree that you can ethically both shed responsibility and stay, knowingly doing the job in an unsafe manner. If the end state is that they only hire people who agree, it stands to reason both the engineers and managers share culpability. For those engineers who hold licenses, that becomes a backbone-stiffening measure. Not only do you approve the design; management needs your approval for the design to be legal.
>They had data from previous flights showing O-ring degradation.
This isn't the same as saying they had data regarding O-ring reliability at the low temperatures on the day of the catastrophic failure. IIRC, the unique condition was that the launch occurred at previously unencountered launch temperatures. "Raising concerns" isn't the same as having incontrovertible evidence; that was the crux of the decision. There will always be people raising technical concerns on these programs. Without good relevant data, schedule risk outweighed the unquantifiable technical risk. It's been a while since I read the report, so maybe I'm wrong on this.
IMO, Columbia is a better analogy. It was a known out-of-spec condition they decided not to mitigate because they were lulled into complacency
He did his job, but that doesn't mean he had final say. I've read enough interviews to know he carried a heavy weight with him until his death because he felt he should have pushed back more. I think there's some confusion that I'm advocating an engineer must stop all risky actions at any cost. That's not what I'm saying. I'm saying if an engineer doesn't bring up a risk because "the boss doesn't want to hear it," that's willful negligence. Fighting for a position and being overruled is different from meekly rolling over.
If you believe engineering is a public trust profession, you owe it to the public to at least do due diligence. My issue is the people in this thread saying "It's all management's fault" and displacing any responsibility from the engineers. The engineers are the technical authority for management. We should strive to make sure management understands the technical risk; if they do and proceed anyway, I think the engineers have done their job. That's what I think happened with Challenger. That's different from placating management because you're afraid for your job, or plowing forward knowing a design will put people at risk.
If the bar was to get every program engineer to give a GO, there would never be another launch. There are engineers who don't trust aircraft that have flown for decades in part because we are bad at judging overall systemic risk. NASA has since instituted formal dissenting opinion processes and distinct technical authorities to allow risks to be raised and formally acknowledged without grinding the process to a halt.
"Boisjoly and four other engineers tried desperately to convince management that the launch should be scrubbed. They warned that the cold would cause rubber o-rings to become brittle and fail, allowing hot gases to leak at the joints. Morton Thiokol and NASA managers dismissed the arguments, and decided to go ahead with the launch.
The next day, Challenger lifted off from its pad at Kennedy Space Center. At first, the launch appeared normal, but 73 seconds into flight the shuttle exploded, sending fragments arcing across the bright Florida sky. In view of a horrified public, including thousands of schoolchildren watching a live broadcast, seven astronauts plummeted to their deaths in the Atlantic ocean."
Virtually all American whistleblowers lose their jobs and are placed on a banned list. They are branded traitors and conveniently become sexual predators, rapists, and pedophiles. From the ensuing manhunt, you'd think the whistleblower was the head of a terrorist organization. It's straight to prison if they are caught, and a constant looking over their shoulders if they escape the country.
But "management", the guys who actually did something illegal? Life goes on uninterrupted... retire and become an Amazon director.
Knowing the risk involved, how can anyone expect any rational engineer to leak stuff?
If not a single Boeing executive goes to prison for this, then this exercise is a sham, looking for helpless scapegoats to salvage the company's reputation and reduce its loss of plane orders.
>Knowing the risk involved, how can anyone expect any rational engineer to leak stuff?
Because that is the ethical mandate of the profession. "Profession" as in the root of the word: to profess a vow. That consideration doesn't come with the caveat of "as long as it's convenient to your career."
I get that it’s not easy. I just don’t like how people will give themselves ethical loopholes in a professional realm but not a personal one. As I stated in a different comment, I don’t think Challenger is the same as this Boeing case. In any event, NASA has instituted procedures to try to empower engineers to hold managers accountable.
> That consideration doesn’t come with the caveat of “as long as it’s convenient to your career.”
When losing your career potentially means losing your home, your health insurance, and putting your family out on the streets, it's a lot more than just "an inconvenience" to your career.
What needs to change is that there need to be more stringent protections for engineers who blow the whistle on this kind of stuff, protections that prevent companies from effectively destroying the lives of those who come forward.
>it's a lot more than just "an inconvenience" to your career.
Luckily, we're talking about rare cases. Would you extend the same to a doctor who puts a patient at risk because he needs to make a mortgage payment?
Maybe I'm unreasonable, but I think if someone isn't willing to hold an ethical line they shouldn't be in a career of public trust. There's no shame in pursuing other professions; there should be shame in undermining the public trust because you aren't willing to hold that line.
Yep. And EVA-23 almost became another catastrophe 10 years later.
I'm not making a claim that they are perfect, but they may strive to be. Columbia instituted more changes, including a completely separate safety technical authority. This is a separate signatory who must approve operational decisions and (in theory) doesn't face the same schedule pressure. I'm still somewhat personally skeptical whether this will avoid further incidents, because many of them are rooted in humans' inability to understand risk in a statistical manner.
It depends. I think it can be driven informally or formally. Do you think other public trust professions (doctors, lawyers, judges) are on equally weak foundations?
Informally, you have the onus to create a culture of accountability and each individual has the responsibility to uphold it. One way you do the opposite is for people to claim "engineers are almost never to blame". Maybe this is a third rail to bring up at this moment, but you can see how much culture matters when bad policing surfaces. Would you say "Beat cops can't be blamed, it's only the chief's fault"?
Formally, you can lean on licensure requirements. Requiring an official "approval" from a licensed engineer (and the accountability that goes with it) helps ensure the public trust oath isn't just a rubber stamp.
It reminds me of people who blame programmers for "dark patterns". Sure, they implement them but they aren't the ones who decide that they should be implemented. However, both programmers and Boeing engineers could (should) quit if they feel what they are doing is unethical.
Anyone working in engineering knows that any form of resistance would have been futile. I bet the engineers of MCAS didn't even have a view of the overall system, and their solution probably fit the risk and requirements analysis. You should be able to check that; that is why we focus on documentation in safety-relevant industries.
Responsibility is for the stupid, certainly not for management.
>> Anyone working in engineering knows that any form of resistance would have been futile.
That's a management failure right there. But fine, if managers want to take a "shut up and do as I say" attitude, then they need to be responsible for the results of what they said to do.
Also, the thing they were least transparent about was probably hiding the differences between the Max and the regular 737. They hid those differences so that they could be certified as the same type and not require additional pilot training. That's not a lack of transparency, it's deception.
> then they need to be responsible for the results of what they said to do.
"You clearly misunderstood what I said. I never said to design MCAS. I wanted you to build MCAAS. The extra A implies the redundancy the system requires. Sheeesh." Phew, now I'm no longer responsible for this mess.
That's not true. In industries like medical and aerospace, engineers have more power than they think. I've been there and done that. Safety is still relevant, so you can't throw your hands up and do nothing. They don't have the same mentality as FANG companies: safety, redundancy, and being vigilant are part of the job. However, things will happen if everyone isn't vigilant. There are lots of companies out there doing it the right way; don't throw them all under the bus with Boeing.
Not intending to throw Boeing under the bus at all, and I do think engineers take safety seriously.
But generally there is an R&D lead who makes the decision to implement it. There may be responsibility there, but the decision to design the plane this way was an economic one.
Some reported internal pressure to develop quickly. By now, that is probably standard in the industry where any software is involved. So apart from an engineering position that could nearly crash or delay such a project...
If the system reactivated wrongly, it is certainly a bug, and I think they have already fixed that problem. But it was an error that was hard to identify, and the system was basically designed to conform to regulations defining flight characteristics in order to evade renewed certification and training.
Of course there are still valid design criticisms possible if you take safety really seriously.
If there are "the engineers of MCAS", with a narrow mandate to build the MCAS, there must also be other engineers who don't delve into MCAS details but understand the overall system and therefore understand that the MCAS is dangerous. If not, how was that thing allowed to fly?
There absolutely would have been systems engineers in charge of the various budgets and margins baked into the design. Safety is always a top one in aviation. In fact, I'd be surprised to find there wasn't a team of systems safety engineers on the MAX. We need to see their analysis to understand whether this was a fault of engineering or management. If the engineers had done a thorough systems safety analysis and quantified the risk, then the responsibility definitely does not fall on engineering.
Competent systems engineers are in charge of high level decisions in theory.
In practice, they might be less in charge than they should be (so that, more or less criminally, managers prioritize business objectives over safety and quality) or less able to do their job and less competent than they should be (probably because better processes and higher standards would mean spending money to cause "problems").
It's never any engineer's job to make decisions or define constraints. That's on the business side, whether it be management, marketing, analytics teams, etc. I seriously doubt any engineer decided this. The decision to go with a single non-redundant system and use software to improve safety margins seems like a cost-cutting measure, which definitely seems more like something that would have been mandated as one of the design goals. Maybe engineering presented various options or something like that, but I find it very hard to believe the risk decision lay anywhere other than with very senior management.
> I bet the engineers of MCAS didn't even have a view on the overall system
By system, do you mean the MCAS, the airplane, or Boeing? I don't think you're correct if you mean the MCAS or how it integrated with the airplane. You don't think they had any sort of integration tests established? Also, one of the failings was that the system relied on input from only a single sensor, so that failing should have been extremely obvious.
Unless by system you mean Boeing as a whole. I'm sure that as part of the requirements process they were told that the system could be deactivated (which it could be), and they were probably unaware that pilots were not being trained in disabling the MCAS. That makes those technical failures still obvious, but nonetheless understandable and maybe even acceptable in that context.
I meant to say that few things are that obvious if they aren't specified. There should be a point in the risk analysis that handles sensor errors, possible consequences, and risk mitigation, and if it wasn't specified there, that would have been an error. Go up the chain from there.
I doubt that software engineers necessarily know much about the reliability of sensors. You mostly learn that from experience. That is why you should include as much experience in the analysis process as possible. A pilot or the air techs would probably have mentioned the tendency of sensors freezing or having some form of malfunction.
Only then can the software folks develop a fault-tolerant system. After a failure it is always obvious to everyone, but as you said, perhaps they just thought the pilots would notice the error and override the system. That, too, is an obvious assumption that should be documented as a mitigation to reduce risk.
No risk analysis is perfect, but after the accidents we can expect some diligence and discipline in my opinion and hopefully this accident leads to a review on these processes instead of finding some fall guys.
A thorough integration test would probably also have detected the issue at some point and maybe here are faults as well.
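To make the single-sensor failing concrete, here is a minimal, purely hypothetical sketch (not Boeing's actual code; the function name, thresholds, and trim increment are all illustrative) of the kind of cross-check a redundant default configuration would allow: gate the automatic trim command on agreement between two angle-of-attack sensors, and do nothing if they disagree.

```python
# Hypothetical sketch of a sensor-disagreement gate for an MCAS-style system.
# All names and numbers are illustrative assumptions, not real flight values.

AOA_DISAGREE_LIMIT_DEG = 5.5  # illustrative disagreement threshold

def mcas_style_command(aoa_left_deg: float, aoa_right_deg: float,
                       activation_threshold_deg: float = 15.0) -> float:
    """Return a nose-down trim increment (deg), or 0.0 to stay inert."""
    # Redundancy check: if the two AoA sensors disagree, distrust both
    # and keep the system inert rather than act on a possibly faulty value.
    if abs(aoa_left_deg - aoa_right_deg) > AOA_DISAGREE_LIMIT_DEG:
        return 0.0
    aoa = (aoa_left_deg + aoa_right_deg) / 2.0
    if aoa > activation_threshold_deg:
        return -0.6  # fixed illustrative trim increment
    return 0.0
```

A single-sensor design simply has no equivalent of the disagreement check, so one stuck or frozen vane drives the command directly; that is the failure mode the comments above describe.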
The need for training was there. The selling point was a certification saying it wasn't needed. In part that's also a fault of each purchasing airline that didn't give extra training.
Regardless of whether Boeing was willing to certify the system as not requiring training, those responsible for pilot competence at the purchasing airlines should still have ensured pilots were trained before using the system.
If a garage sells you a roadworthiness/MOT certificate when a vehicle isn't roadworthy, and you know it isn't, then both of you are at fault.
Depends how the engineering teams are partitioned really. MCAS could be built as a closed system with a set of interface specs that the avionics teams would then have to integrate and test. That said, someone definitely should be doing integration tests.
Furthermore, I doubt that the lack of "transparency" was entirely accidental. At least since Enron, the management of large enterprises has become adept at encouraging or subtly coercing its employees to do what it takes to get the desired results, while management remains nominally unaware of what is being done.
They're not stupid. What do you expect them to do? Being transparent clearly comes with risk and 2020 clearly increases risk. To not insulate against that would be to not do their jobs. You get what you incentivize.
For starters, display good leadership traits like accountability, including to themselves. I'm worried the signal we're sending right now is that weak leadership traits are rewarded which incentivizes weak leaders.
I think we're saying the same thing. My point is that such weak leadership should at least be dis-incentivized. There's plenty of ways to do so from fines, to jail time, to making the individuals industry pariahs. We incentivize what we value; it seems like we may be valuing money over character.
Who are 'they'? The executives of these companies? Yes, of course some of them will act this way at least some of the time, and the question is what we do if we don't like it. We have some options, such as preferring candidates who favor stronger accountability and effective protection of whistleblowers, and who are opposed to installing, as political appointees heading regulatory bodies such as the FAA, people to whom these executives can go to get pressure applied on the professional regulators to back off their objections. We can also support responsible investigative journalism by subscribing to it.
So, yeah, I'm not expecting things to improve anytime soon.
I'm sure there will be some people who refuse to fly on 737 Max. Those people will have to track airlines and routes to avoid airlines that fly 737 Max and/or routes where 737 Max is used, but also they're going to need to be willing to turn around in the jetway when they get to the plane and it's not the one that's scheduled and accept that they just lit their ticket on fire, and will have to pay for a new one.
When Boeing starts shipping the 737-8200 instead of the 737 Max 8, most people are going to be relieved that they're not on a Max, when in fact they are.
Personally, I'd like to see what experts think about the fixes, and let other people try out the planes first, but this doesn't seem like an unfixable problem. The proposed fixes seem to me like they'll be effective even if I think other changes would be better. Although there was a lot of new scrutiny, there weren't many other issues uncovered.
I fly, follow the news, and assume I'll probably fly on one at some point because there aren't enough alternative options. It's not what I want, but if they have a fix proven out by a lot of successful flight time and a MAX flight has the arrival time I need... why not?
Part of the reason Boeing is going to survive is the pure fact that Airbus can't make enough airplanes to cover the market.
COMAC is nowhere near ready and will have to prove itself in the domestic market before an international rollout. That said, Chinese aviation safety is among the best after a very rough start; the real issue is that airplane sales involve geopolitics, and geopolitics is not looking favourable for the time COMAC will be ready.
>gifted them the benefit of the doubt on that front
I haven't seen anyone discuss this, but China was responsible for (justifiably) grounding the MAX around the world and crippling Boeing short term. CAAC was the first to ground the Chinese MAX fleet after the Ethiopian crash, which inspired everyone else to follow despite the FAA insisting on the MAX's airworthiness. I'm sure the US is going to remember that when COMAC is open for deliveries. Not that the US will ever buy, but the smearing is going to be rough.
Astonished? Really? They have sooooo much money tied up in this plane. They have soooo much money in orders to fulfill. It's not like they can just say, "instead of the 737 Max you ordered, we're giving you this nice 777 instead". Of course they want this plane back in the air as quickly as possible. The airlines do too (maybe less so with COVID).
I used to say something similar, but then someone challenged me to listen to all the ATC conversations, so I did. There are literally thousands of maydays and pan-pans called every year, many that come dangerously close to a crash. The passengers rarely even know something is going horribly wrong. It seems those planes are really old and falling apart. At least, that is the perception I take away.
I am really impressed with the flight crews' ability to work as a team to get the thing on the ground upright most of the time while remaining calm on the radio. Don't get me wrong, I love flying and plan to get my pilot's license some day. I am just not a fan of the aging fleet and very dated technology.
I think about this every time I take my corporate CYA training - the courses you have to take every year to ensure you follow security protocols, don't break the law, don't harass people, etc. Since there's a record that I took the classes, the company can blame any problems on me.
Blame in an engineering context is rarely if ever productive. But, if you have to blame, you really need to check if the issue was raised to management before you blame. In this case I'm pretty sure the issue was raised, so it's not really on the engineers.
To what extent should we be faulting the SEC for requiring quarterly earnings reports, which is a rather meaningless frequency for an aircraft company? How many companies have cut corners on safety due to earnings report requirements?
To what extent should shareholders take some of the brunt for their fickle behavior and threatening to short Boeing stock if they don't sell sell sell earn earn earn, instead of holding long term for Boeing to resolve safety issues at the right pace, which in turn sets the behavior of internal managers at Boeing?
> Republicans on the committee did not endorse the investigative report. [..] criticized Democrats for an investigation that "began by concluding that our system was broken and worked backwards from there."
So, do many republicans hold the view that killing 346 people in the first year of service is acceptable business practice?
The system is supposed to prevent people from dying, 346 people died, the system is broken, it is a fact. Of course we work backward from there.
Even if the conclusion is that it is "acceptable business practice", it doesn't make the system less broken. Just as marking a bug "won't fix" doesn't make the bug disappear.
But this is all political bickering, I don't think republicans are proud when American planes crash and kill their passengers, some of them US citizens. But because the report is written by democrats, and I suspect it is a little about safety and a lot about attacking the republican government, somehow, republicans feel the need to fight back instead of trying to find solutions.
I hope there are real engineers trying to actually solve the problem behind the scenes.
I agree with you, and the report itself has some inflammatory and accusatory information that, while seemingly a conviction of Boeing, is also a validation that the FAA has created a system that rewards these behaviors.
As a parallel, I submit to you the FAA Medical process: a system in which it is better for a pilot to conceal a condition that could invalidate their medical and remove them from flight duties, or not report it until it's terrible.
While both of these are bad for the consumer, I ask: who is at fault? The big company, the pilot, or the people making the rules that encourage such behaviors?
Shouldn’t all three bear some fault? The big company for not proactively supporting their staff to report (e.g. benefits), the pilot for doing something ethically wrong, and the regulatory body for encouraging the broken system?
Trust can't really go further down than '0' which is roughly where you are right now. The FAA will not be able to walk away from this one without a decade(s) long effort ahead of them to recover the trust they once had.
>The system is supposed to prevent people from dying, 346 people died, the system is broken, it is a fact. Of course we work backward from there.
Isn't this just outcome bias though? You're arguing that because there are 346 deaths, the system is necessarily broken. That may not be the case. From what I've seen, it probably is, but it's not a foregone conclusion, and we shouldn't look for evidence supporting that conclusion. 346 deaths might actually be a very small number compared to what could have happened had the system actually been broken. The point is, it's a logical fallacy to say that because there were 346 deaths, the system is broken, and we need to work back from there. This could have been a rare combinatorial explosion of bad luck.
Not defending their actions, but just throwing your hands up and saying the system is broken (and implying the system should be replaced wholesale) isn't productive either. Why not start by fixing the specific pain points in this system that led to this scenario, rather than replacing it, and see where that gets you? I feel like Democrats have a very deconstructionist perspective these days towards nearly everything. When I do my job I don't start by saying the internet is broken and needs to be replaced. I think sometimes we need to balance our idealism with a couple of doses of pragmatism.
I make software that is broken ALL THE TIME. And what do I do then? Find the faults and then fix them. Just because software is not perfect does not mean that I sit like a dog in a house fire saying, "this is fine."
Killing 300+ people is broken. And our obligation is to admit that it is broken, and fix that thing. When we find another thing, we admit that the system is broken there as well, and go fix that.
Trying to diminish the seriousness of 300 deaths with mealy-mouthed terms like "imperfect" is pure spin. If a plane, staffed by pilots allegedly trained on how to fly it, crashes from pilot error... TWICE, the system is broken. Terribly broken.
See also: Security, physical or digital.
Terrorists flew planes into the World Trade Center. The system was broken. We fixed some of it. We put on some security theatre too. But we didn't say, "It's impossible to fix everything, so the system is fine." We didn't shrug it off as "imperfect."
We always begin by being truthful with ourselves about the fact that we have discovered that the system is broken. And if the consequences of its broken-ness are unacceptable, we fix it.
"Imperfect" is a word that should only be used for acceptable faults. Like, "The seating in economy is imperfect." It's too close together, but we don't have planes falling from the sky because they're cramming passengers together.
And now a footnote: Please avoid ad hominem arguments like "please name one system you've..." If the speaker's argument has a logical fallacy, point it out. If the speaker's argument is sound, it doesn't matter whether they write faultless software, or even whether they write software at all.
It's the argument we are discussing, not the person making it.
> Trying to diminish the seriousness of 300 deaths with mealy-mouthed terms like "imperfect" is pure spin. If a plane, staffed by pilots allegedly trained on how to fly it, crashes from pilot error... TWICE, the system is broken. Terribly broken.
Wait until you find out how many people die in car crashes every single day. It does not mean “the system is broken”. In the real world systems can be improved without throwing disruptive tantrums to make fundamental changes.
>Terrorists flew planes into the World Trade Center. The system was broken. We fixed some of it. We put on some security theatre too. But we didn't say, "It's impossible to fix everything, so the system is fine." We didn't shrug it off as "imperfect."
Excellent example of an extreme overreaction that caused immense destruction to the aviation industry because emotional politicians wanted to fix a “fundamentally broken system”. The correct approach would have been, “we’ve found a big flaw in security, we will now install flight deck door locks”. Instead, some dim politicians went with the “system is broken” approach and now we have nudity machines (or grope-downs if you don’t submit) at every major airport in the US.
You don’t drastically change a system on the discovery of a single flaw, no matter how big. You fix the flaw because it’s one known bad vs the giant pile of unknown flaws with fundamental redesigns.
"Wait until I find out?" Don't patronize me, I am keenly aware of automobile deaths, and furthermore, it's a complicated problem to solve for many, many reasons, some of which are caused by greed, lack of oversight, recklessness, and other social issues.
Nevertheless, in my lifetime cars have gotten safer by several metrics, most notably by distance traveled. And they did not get better by thinking that driving around in deathtraps without seat belts and swigging whisky behind the wheel was acceptable.
They got better when people like Nader and MADD stood up and defied arguments like yours and loudly complained that yes, the system was BROKEN and needed to be fixed.
And then other people went about trying to fix automobile safety. Imperfectly. In fits and starts. And sometimes entirely wrongly. But nevertheless, progress happened when people attempted to fix things they acknowledged were broken.
All systems do have flaws. Even if we had very intrusive security, 9/11 may have still happened. It's possible no amount of security could have prevented it.
And it's possible that in attempting to launch several thousand pounds of metal into the air, sometimes your testing is off and people die. That may mean the system is broken, or it may mean it's just imperfect - the deaths alone are not sufficient evidence for either claim. That people died does not mean the system is inherently broken - life is not without risk, nor should our goal be to make that risk non-existent.
Security, to use your example, is fundamentally a trade-off - accessibility vs security. Are we willing to accept a more intrusive screening process to prevent some set of attacks? Even if we make that tradeoff and an attack still happens, should we keep it? Would you be willing to be strip-searched each and every time you fly if it lowered the risk of an attack another 10%?
Your call out of an ad hominem is incorrect. Parent is generalizing human behavior in an uncertain world, not attacking you as an individual. Nobody can write a perfect system, even NASA has failures. A super abstract argument about flawless systems loses to real life statistics every time - theory is great but only reality matters.
I think we must reiterate to help you “get” it. One accident is a tragedy, a sign of imperfection, and we allow for accidents in most cases.
The difference between an accident and what happened here is that the issue was caused _deliberately_ by a series of bad choices, reflected in the fact it happened twice in exactly the same way. Yes, it took a lot of bad choices to get to the point of creating the issue, but that doesn’t really matter, and in fact, might only serve to further the point.
When a system allows for deliberate choices to cause an issue, the system needs to be addressed.
Anecdotally, I have never worked at a company where the management didn’t have an outsized say in how we broke apart our time between product development and stability.
I suspect this is symptomatic of a system that lacks accountability for people in managerial positions, and is probably even beyond the scope of the FAA, MCAS or Boeing.
Have you considered you may not be correct here? I know that may be difficult to comprehend, but you are not a teacher shedding light on the unenlightened - you're just another layperson attempting to understand a complex system.
> what happened here is that the issue was caused _deliberately_ by a series of bad choices
Are you suggesting Boeing intentionally downed its own planes? That would seem to be an extraordinary claim - and not one reflected by any news source.
> When a system allows for deliberate choices to cause an issue, the system needs to be addressed.
What was the deliberate choice here? Only enabling one MCAS sensor or having calibration issues? Was it not making the need to disable the MCAS sensor in some situations clear? Was it changing the altitude model without changing the plane model number and forcing a pilot relicense?
Or, was the issue with Boeing being able to do their own testing? Was it due to a lack of guidelines on MCAS sensor calibration and placement?
> I suspect this is endemic of a system that lacks accountability...
Yet we are talking about this after their fleet was grounded, a congressional inquiry has happened and the investigation and analysis are ongoing. That doesn't seem like a lack of accountability to me.
Again, you're making the claim that the system is broken intrinsically because this occurred. You keep even saying it's deliberate - do you have evidence here? Because making poor choices is not the same as intent, and a bad actor does not a fundamentally broken system make. You seem to be in the position that because bad things happened we must dismantle the entire system without actually proposing a replacement - it is not enough to throw stones, we must actually put forth a solution set.
"Killing 300+ people is broken. And our obligation is to admit that it is broken, and fix that thing."
This is a massive trivialisation of the problem.
This is not 'code' - it's a massively complex system of large companies, various bits of tech, supply chains, varying international standards, and a considerable amount of legislation, and that's before we get into the human issue of 'acceptable losses', because '0 deaths' may not actually be feasible. Maybe, but maybe not.
"Terrorists flew planes into the World Trade Center. The system was broken. We fixed some of it."
Again, this is a misrepresentation. The system was not 'broken' because culturally, people 'didn't do things like that'. The 'fix' for this wouldn't be 'more security' but rather to encourage a civil culture where people don't take over planes and fly them into buildings. And some of the solutions are systematic, i.e. not 'safety checks on planes' but externalities like 'invading countries and destroying places where terrorist plan things' - which isn't so nice.
These are very complicated problems that don't have obvious solutions.
They may not have obvious solutions, but we recognize that they are broken and we try to find solutions. We then find places the solutions don't work and we iterate or even replace the solutions with new solutions.
All systemic problems have complex sets of interactions and unexpected consequences. Racism, automobile deaths, and yes, air transportation.
We still don't shrug our shoulders and deny that the causes of deaths reflect a broken system, nor do we refuse to attempt to find solutions just because it's damn hard to make progress.
Sometimes, making the vehicle safer makes drivers more reckless. Making cars safer in practice is not as obvious as it appears in theory.
> do many republicans hold the view that killing 346 people in the first year of service is acceptable business practice?
First you very carefully selected what to quote. You cut out this part:
> A statement from ranking member Sam Graves of Missouri says, "if aviation and safety experts determine that areas in the FAA's processes for certifying aircraft and equipment can be improved, then Congress will act."
Which makes it clear they mostly disagree about what the investigation results say about FAA, not Boeing business.
Then you suggest that the Republicans not being OK with the investigation's findings means they are OK with killing people. They disagree with WHY and HOW these people got killed; obviously they do not disagree that this should never have happened.
How you can misrepresent the other side so badly and still be so self righteous is beyond me.
I disagree that anything was misrepresented. After two airliner crashes I think it is clear that the "system" (in this case the ability of the FAA to provide sufficient oversight to ensure safety of these planes and companies like Boeing to provide the FAA with the information they need) is broken. Wouldn't we all agree that the entire point of this inquiry was to "work backwards from there" and figure out where things could be improved?
I'm less clear on what Sam Graves means when he says "if aviation and safety experts determine that areas in the FAA's processes for certifying aircraft and equipment can be improved, then Congress will act," but I can see why people would be upset. It seems to me that we don't need input from anyone after two crashed airliners, be they safety experts or not: there clearly are areas in which the FAA's processes can be improved and they should be improved ASAP.
The FAA's approval process was the outcome of ideology that government should not work the way it had been working. That there were too many regulations and too much oversight, and now it was time for businesses (Boeing in this case) to have a stronger role in certification and FAA a reduced role. Whether this ideology is about protecting corporate masters, supporting a sort of neo-feudalism where big companies are the new aristocrats, or whether it's sincere belief that free markets always produce better results - almost doesn't matter. The consistent theme is that public servants should not really have the final say on airplane certification. Say they have the final say but ensure they don't have the funding, political, or legal capital to consistently actualize it.
The reality is that one party has long believed in giving business a lot of slack, and then only speaks in grumpy platitudes about the limits of a business free-for-all when people die. They don't want a bullshit idea to be seen as bullshit. They just want to profit as much as possible, with a number of deaths that the public finds acceptable enough not to call b.s. on the system itself.
If any event involving hundreds of deaths at the same time could be pinned on a person, they'd go to prison. This system is expressly designed to spread the blame and obfuscate just enough that fines will be paid, and life goes on. Except for those who died.
So, corollary. Should we completely rework the process for certifying cars (safety testing) and drivers (drivers licenses, training) due to the FAR more than 346 roadway deaths last year?
Adding just a few lessons from Aviation - things like "only allow the use of spare parts that are approved by the manufacturer", "forbid use when diagnostic errors are present", "mandate pilot rest periods and duty cycles" and "revoke the license of anyone with a sufficiently serious medical condition" would have huge consequences. I'm personally confident that each of those would reduce driving fatalities in the US by more than 346/yr. Of course, they'd also destroy the livelihoods of millions, make car travel far more expensive, and have (prior to 2020) unfathomable social consequences. But hey, any system that allowed that many deaths must be broken!
Perhaps you’re just used to dealing with an industry that has little physical risks where the thought of people dying in a normal functioning system seems hard to believe?
Hundreds to thousands of people die every day in vehicles, and the NTSB doesn't even open investigations. It's regarded as a well-functioning system. Are Democrats fine with thousands of deaths every year?
But a lot of the causes of car accidents ARE systemic!
"driving while 80 years old"
"driving while on 4 hours of sleep in the past 72"
"driving with the brake warning light on"
I'd count all of the above as "systemic", and bet that they cause a multitude of deaths. We just don't want to accept the societal and monetary cost to eliminate them. Others, like "driving while intoxicated", we penalize but still do not take more than superficial steps to combat. (Superficial from the perspective of aviation, at least)
> Which makes it clear they mostly disagree about what the investigation results say about FAA, not Boeing business.
Yes, but for me, regulatory capture seems to be a central issue in what went wrong, so I am highly skeptical that this is not an ideologically-flavored conclusion.
FWIW, I first approached this incident (prior to the second crash) from the point of view that it was probably either pilot error and/or an unfortunate malfunction, so I think I can fairly say that I have come to my conclusions in a forwards direction.
Nobody can come up with an alternative. The only people who are experts in something are people who could be good in either a regulatory role or a business role. Thus there is, and always will be, a revolving door, because when one side decides they need more experts, the obvious place to look is the other side of the door.
Note that without the revolving door both sides would be worse off. (Unless you take some other action to mitigate this - I can't think of anything but there might be a complex answer)
One of the reasons DARPA program managers sit a fixed term is to precisely prevent careerism and castle building. The reason members of congress have a staff is to support them in researching topics.
But ultimately it comes down to the public, being informed by media, holding politicians responsible. I may disagree with portions of Bernie and AOC's political platforms, but one thing I don't have to worry about with them is that they're in the pocket of a company like Boeing.
We get the government we ask for. Want better? Raise your expectations and vote accordingly.
Another approach could be sufficently good communication. If you need former experts in writing the regulation to be compliant or need to work in the industry to know how it works enough to regulate it implies that the communication on both sides is missing crucial information.
The process of doing so and ensuring it would be very laborious in that you would have to document every last detail, go through and make sure they were covered, make sure they were retained, and make sure that their understanding of every last detail is identical. Making matters worse is the mix of implicit and explicit. If you use a sprocket on something the design implicitly states "Do not allow it to rotate that way!" An added strut on a panel already more than capable of supporting itself and the rest of the structure may be redundant or needed to shift resonant frequencies into a range where they would be irrelevant.
Indeed, and there are plenty of people in the industry who understand the problems and want to do the right thing. These are the people we need to give a voice to. People who understand, for example, that the arguments for not informing pilots about MCAS were self-serving sophistry with no technical or human-factors justification.
How often does an entity, when investigating itself, accurately and truthfully identify the issues the entity is facing? I have immense respect for the FAA and if any regulator could do it I would think the FAA would conduct a more honest self-investigation than most US regulators. But it’s still a pretty dubious proposition that people in Congress responsible for the agency’s oversight would defer to the judgment of the FAA in determining the agency’s accountability.
It’s ironically not unlike the regulatory landscape that led to the Boeing situation in the first place - the entity charged with oversight was much too deferential to those it was supposed to be overseeing.
>> A statement from ranking member Sam Graves of Missouri says, "if aviation and safety experts determine that areas in the FAA's processes for certifying aircraft and equipment can be improved, then Congress will act."
This statement alone is hilarious considering that the FAA approval is a poster-child example of regulatory capture.
Here's the secret: the FAA lets the companies SELF-REGULATE and SELF-APPROVE planes.
The FAA did NOT approve the Max, because Boeing did.
>>A total of 79 companies are allowed under federal policies to let engineers or other workers considered qualified report on safety to the FAA on systems deemed not to be the most critical rather than leaving all inspections to the government agency.
>>To critics, it's a regulatory blind spot.
Once you know how this works, that statement by bad-faith-actor Sam Graves is laid bare. Of course we know how to improve the processes: Step 1: Don't put the Fox in charge of the Hens!
>>> How you can misrepresent the other side so badly and still be so self righteous is beyond me.
Something you yourself could have learned from before posting!
"Here are the total contributions from Boeing to members of the House Aviation Subcommittee during the 2018 election cycle. Republicans: Troy Balderson (R-Ohio) $0. Brian Fitzpatrick (R-Pennsylvania) $9,700. Mike Gallagher (R-Wisconsin) $5,999. Garret Graves (R-Louisiana) $6,000. Sam Graves (R-Missouri) $10,000. John Katko (R-New York) $15,400. Thomas Massie (R-Kentucky) $0. Brian Mast (R-Florida) $7,681. Paul Mitchell (R-Michigan) $5,000. Scott Perry (R-Pennsylvania) $3,000. David Rouzer (R-North Carolina) $2,000. Lloyd Smucker (R-Pennsylvania) $8,000. Ross Spano (R-Florida) $0. Pete Stauber (R-Minnesota) $0. Daniel Webster (R-Florida) $0. Rob Woodall (R-Georgia) $2,000. Don Young (R-Alaska) $1,000. Total Boeing Contributions to Republicans on the Aviation Subcommittee $75,780. Average for each of the 17 members: $4,457.
Democrats: Colin Allred (D-Texas) $94. Anthony Brown (D-Maryland) $8,500. Julia Brownley (D-California) $0. Salud Carbajal (D-California) $5,000. Andre Carson (D-Indiana) $10,000. Steve Cohen (D-Tennessee) $2,000. Angie Craig (D-Minnesota) $703. Sharice Davids (D-Kansas) $122. Peter DeFazio (D-Oregon) $5,000. Jesus Garcia (D-Illinois) $0. Eddie Bernice Johnson (D-Texas) $6,000. Henry Johnson (D-Georgia) $1,000. Rick Larsen (D-Washington) $7,048. Daniel Lipinski (D-Illinois) $6,000. Stephen Lynch (D-Massachusetts) $0. Sean Patrick Maloney (D-New York) $3,500. Grace Napolitano (D-Washington) $0. Eleanor Holmes Norton (D-DC) $0. Donald Payne (D-New Jersey) $1,000. Stacey Plaskett (D-USVI) $0. Greg Stanton (D-Arizona) $2. Dina Titus (D-Nevada) $3,000. Total Amount Boeing contributions to Democrats on the Aviation Subcommittee in 2018 cycle: $58,969. Average for each of the 22 members. $2,680.
Total contributed by Boeing to the 39 members of the Subcommittee: $134,749. Average per member: $3,455."
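The totals and averages in the quoted figures can be checked mechanically. A quick sketch (the per-member dollar amounts are copied verbatim from the quote above; the averages match after truncating to whole dollars):

```python
# Boeing contributions to House Aviation Subcommittee members, 2018 cycle,
# as listed in the quote above (dollars per member, in quoted order).
republicans = [0, 9700, 5999, 6000, 10000, 15400, 0, 7681, 5000, 3000,
               2000, 8000, 0, 0, 0, 2000, 1000]          # 17 members
democrats = [94, 8500, 0, 5000, 10000, 2000, 703, 122, 5000, 0, 6000,
             1000, 7048, 6000, 0, 3500, 0, 0, 1000, 0, 2, 3000]  # 22 members

r_total, d_total = sum(republicans), sum(democrats)
print(r_total, r_total // len(republicans))          # 75780 4457
print(d_total, d_total // len(democrats))            # 58969 2680
print(r_total + d_total, (r_total + d_total) // 39)  # 134749 3455
```

The computed totals ($75,780, $58,969, $134,749) and per-member averages ($4,457, $2,680, $3,455) agree with the figures quoted above.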
The engineers are constrained in authority. Management is ultimately responsible.
It's not that management wants to kill people on a personal/visceral level, it's that they're willing to take on unacceptable risks in order to lock in short term monetary gains. We have seen this over and over again with modern large corporations, across industries.
And is it surprising management behaves this way? Their compensation is directly tied to short term stock value shifts. Meanwhile, when the game of musical chairs stops, they know that they'll still be able to make a lateral career shift to another similar company and just play the same game all over again.
I don't claim to know much about FAA procedure for approving flight of new aircraft, but I think it's safe to assume that whatever testing, regulation, other processes they have in place that allowed for this type of grave error to fly is either insufficient or broken.
I agree this doesn't exactly equate to the conclusion drawn about republicans, but it does further expose that there is no limit to generating partisan/political quotes where they don't belong.
Regardless of who you side with, note that many on the congressional investigative team took contributions from Boeing and tried to steer blame to individual pilots/airlines (including Sam Graves and some democrats).
> "I don't claim to know much about FAA procedure for approving flight of new aircraft, but I think it's safe to assume that whatever testing, regulation, other processes they have in place that allowed for this type of grave error to fly is either insufficient or broken."
That's quite the assumption. I don't even know what 'broken' would mean for a certification process as byzantine as the one used by the FAA. I think you should either be more specific, or take the time to learn a bit about the systems, processes, and failure modes of complex multi-stakeholder processes.
Let's make this easy: a fully FAA-compliant aircraft fell from the sky, twice. Therefore the certification process is broken, as in it does not actually work at judging the airworthiness of a new jet, for the reason in the previous sentence.
> "if aviation and safety experts determine that areas in the FAA's processes for certifying aircraft and equipment can be improved, then Congress will act."
Climate experts determined things too. Congress has acted... by ignoring those experts. Why should we expect them to listen this time when they have demonstrated an unwillingness to listen in the past? Judging by history, they aren't planning on acting.
Working backwards from a desired claim to generate only supporting facts would reinforce only that claim. It’d be like the police only investigating the one usual suspect and refusing to look for evidence that did not pertain to that suspect.
Working forwards from gathering facts to a conclusion may lead to the same place or somewhere entirely different.
The use of the term "desired claim" strikes me as unreasonable and unfair in this instance. No one is happy with two airliner crashes, it was no ones "desired claim". Unfortunately these crashes are what initiated the inquiry into the certification of this model airliner.
In my opinion, the crashes are not facts with which anyone could reasonably argue. I don't think there's room to argue that maybe the crashes were unrelated, for instance. Given that we can all agree that the crashes occurred and were caused by a similar fault (in this case MCAS), I think it's entirely reasonable to "work back from there" and figure out _why_ this airliner was certified as safe and through that exercise discover what changes can be made to that certification process to prevent other dangerous airliners from being certified.
This isn't some open ended research problem. It's an investigation into something that went glaringly and obviously wrong. Of course you start from what went wrong and work backwards from there. If you didn't you might end up in a different place altogether which would serve no purpose. That's what a congressional inquiry is for: to establish why a particular thing went the way it did.
No, that's when you find new, plausible evidence to support the thing that you weren't allowed to do in the first place so you can pretend you didn't break the law while gathering evidence. This has nothing to do with the case at hand.
'Beyond a reasonable doubt' works because the system should generate doubts as well. There might be two theories that are equally likely, but if you only have facts to support theory one without even having thought of the other theory that might still lead to a wrongful indictment.
It is important to consider, though, that the reason flying is extremely safe is because we do freak out about everything that goes wrong in aviation. Commercial aviation is one industry in which there is a strong correlation between safety and profitability so the industry does freak out over every problem.
Of course, it's a balance, and the balance that commercial aviation has found is heavily focused on safety.
Well you might have a point in that our (global?) culture is becoming less and less prepared for looking reality in the eye.
Yet would one consider mandatory seatbelts and stricter laws on driving under influence and so forth "paralyzing regulation"? You are talking about the other extreme with its polar opposite of not having any rules. But there evidently is a huge middle-ground where we can have regulation and not having people killed in large numbers.
Consequently, if we look at history, most of these "freak outs" have in fact resulted in regulation that has saved lives - e.g. the gun laws in the UK. We as humans are very short-sighted in general, and I think it's the rule rather than the exception that we need these kinds of accidents to improve the regulation and the laws in place. Sometimes maybe to the extreme, but I think it's better to be too strict than too lax.
That’s the thing with regulation though, it’s easy to point out some positive effects but you don’t know about all of the societal advancement you may have snuffed out. Climate change could have been a footnote if we didn’t effectively regulate nuclear power out of existence in the name of “safety”.
> Climate change could have been a footnote if we didn’t effectively regulate nuclear power out of existence in the name of “safety”.
Even if 100% of global electricity generation were replaced with nuclear, that wouldn't be true. You'd have to also replace (for instance) 100% of transportation with nuclear. (And that's assuming that the infrastructure construction and maintenance costs of doing all that are carbon neutral, too, and that the reduced demand and price for fossil fuels resulting from that shift doesn't open up new uses that get you back into problems, which - absent aggressive regulation - you'd naturally expect it would.) (A 55% reduction in CO2 is needed; electricity is 27% of emissions, transportation 28%.)
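The arithmetic behind this point is easy to verify from the sector shares quoted in the comment above (those percentages are the commenter's figures, not an independent citation):

```python
# Sector shares of CO2 emissions as quoted in the comment above
# (assumed figures from the comment, not an independent source).
needed_reduction = 55   # percent cut in CO2 said to be required
electricity = 27        # percent of emissions from electricity generation
transportation = 28     # percent of emissions from transportation

# Fully decarbonizing electricity eliminates at most 27% - short of 55%.
assert electricity < needed_reduction
# Even zeroing out BOTH sectors only just reaches the target.
print(electricity + transportation)  # 55
```

So on these numbers, nuclear electricity alone cannot make climate change "a footnote"; transportation would have to be decarbonized as well just to hit the threshold.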
Or maybe you are thinking that the absence of nuclear safety regulation would result in enough accidents to reduce the growth of population and industry enough to solve climate change, which I guess is a valid thought, if an extreme example of glass-half-full thinking.
I sometimes wonder whether anyone still believes in cause and effect. By and large, aviation is safe because of the work of governments and corporations together to create safe aviation. It doesn't just "exist." As we see here with the 737.
> we risk not being able to do anything new because it is paralyzed by regulation.
This isn't frontier science or sending someone to the moon. Aviation can be extremely safe and innovative if based on sound principles and not just trying to make an extra buck.
They said among developed nations. And they're right. Just compare the population normalized figures between the US and Canada. We have very similar cultures and road conditions, and yet the US is nearly 2x worse.
I knew roads in the US were dangerous, I didn't realize they were nearly as dangerous as Bangladesh or Egypt! No slight against those countries in particular but they are comparatively less prosperous (meaning older, less-safe vehicles, more pedestrians), have worse roads, and presumably not as advanced healthcare systems. I'm amazed the US, with all its advantages, does so poorly.
I read the FAA's final report, and to be frank, I feel like many punches were pulled. There were mountains of evidence of a dysfunctional culture and cavalier attitude toward regulation at Boeing.
Despite the punch pulling, there were also admissions that the plane (without the MCAS flight law) would not have passed certification requirements for carrying passengers, cementing a solid motive for regulatory obstruction and perception management.
Nary a mention or touching on any of that in the report. It went to great lengths to show that the process was followed, yet never addressed the fact that said process led to 346 deaths.
It's frustrating, because I left the report with the feeling the conclusion was "Process is fine, people just need to follow it better" when the entire point of having a process is to take into account human capacity to err and designing it out.
I think Representatives are basically applying their plank to the report. Which is bloody stupid, because they need to be reading between the lines and focusing on the problem, not whether their party leaders approve of their approach or output.
For point 1, I don't think it matters whether they understand the technical design, as long as they understand the failures that led to a flawed design - most likely the FAA trusting Boeing, and a lot of managerial failures at Boeing due to management glossing over engineer and pilot feedback.
If the US system is even slightly functional, the FAA will get beefed up and we'll be fine for a decade or two.
Then people will again start saying "we've had no accidents for years, therefore flying is really safe, so why all this burdensome regulation?". Regulation will get scaled back, something like the 737 MAX will happen again, the FAA will get beefed up, and so on, ad infinitum.
It's this sort of regulatory boom and bust that makes me feel humans shouldn't be allowed to do anything that's dangerous on a really large scale (such as nuclear power).
The root cause is pretty easy to describe: benefits are apparent and risks are hidden.
It's much harder for people to weigh things that might happen against things that will happen. Especially when the former is negative and latter is positive (positive outcome bias).
Consequently, in meeting after meeting, risk safeguards are eroded. Because each meeting has a relatively minor "Do we want this benefit, or to hedge this risk?" decision, and in aggregate, people pick the former.
Or do your homework and find that it’s actually far worse than pro-nuclear propagandists constantly claim. Huge increases in rates of multiple forms of cancer near nuclear sites, while the rest of the country is seeing decreases - but not attributed to nuclear. Huge environmental problems and costs pawned off on the US dept of energy and taxpayers, but not attributed to nuclear. Assassinated whistleblowers.
Nuclear industry is a textbook example of regulatory capture. If the renewable guys were anywhere near as good we’d see 0 deaths from solar and wind. “The solar panel didn’t kill him, it was gravity and a slippery ladder”
His point is not that nuclear power is not dangerous, his point is that according to his theory, humans will get complacent and walk-back regulations when nothing happens, leading to potential nuclear accidents that were needless.
Arguably, non-nuclear fossil power is just as dangerous due to less regulation (pollution, climate change). And I wouldn't discount second-order effects from renewables either. And dam breaks can be far deadlier than nuclear accidents.
We are not really "fit" for any technology, but lack of technology wouldn't keep us safe from existential risks either. The solution is to transcend our biological substrate altogether.
> Because nuclear power has been around for a long time already, and the industry remains extremely regulated.
The industry was very heavily regulated globally before Chernobyl and Fukushima, but look what happened.
I’m not saying I’m against nuclear power (as I’m not). I’m not saying that industry regulation is generally ineffective, especially when financials and greed get involved (even though that often appears to be true). I’m just saying humans are bad at moderating risk over long periods of time.
> The industry was very heavily regulated globally before Chernobyl and Fukushima, but look what happened
Gross incompetence and huge design issues 40 years ago on one hand, and complacency+incompetence that required a huge natural disaster to pose a problem, and even then it resulted in less death and destruction than the evacuation, the natural disaster itself or your regular fossil fuel power plant? Not sure that's a good example.
> The industry was very heavily regulated globally before Chernobyl and Fukushima, but look what happened.
Apart from Chernobyl, which was poorly designed and managed to begin with, other plants have proven safe. Even Fukushima hardly caused any deaths, while several thousand people were killed by a once-in-a-lifetime tsunami. Let's put things in perspective, please.
I do not know what you have in mind, but to my understanding Chernobyl was pretty much as bad as it can get. And the other energy sources... Well, if we are allowed to stretch our imagination on the potential worst case, I start thinking about fossil fuels, the greenhouse gas effect and Venus. Or maybe genetically modified energy crops that break ecosystems...
The biggest coal, gas and dam failures were all pretty bad and often on a level with nuclear.
The 'worst case' with a modern reactor is basically that zero to very few people die and some radiation leaks out. This is what happened at Fukushima. It's hard to find a worse case with a modern reactor.
Fine, if humanity will never have to tackle anything complex for its own survival, e.g. deflecting an asteroid on a collision course with Earth (or evacuating the planet before it hits), dealing with global warming or some other planetary change (natural or otherwise), dealing with overpopulation and resource limits, etc.
It would also require every other nation on earth to agree with the restriction, which isn't really compatible with the premise in the first place; if the regulations are self-imposed, people will stop following them.
I think it’s an optimization problem and incentives problem. We should be reevaluating level of funding over time, otherwise government organizations will grow indefinitely, but here’s where the incentives come in play, who would be interested in doing this evaluation gradually and with the backing of data? I believe the latter depends on the society and the politics.
> It's this sort of regulatory boom and bust that makes me feel humans shouldn't be allowed to do anything that's dangerous on a really large scale (such as nuclear power).
For more thinking along these lines, see the book “Normal Accidents: Living with High-Risk Technologies” (1984) by sociologist Charles Perrow (concerning nuclear power), the book “The Limits of Safety — Organizations, Accidents and Nuclear Weapons” (1993) by Scott Sagan concerning nuclear weapons, and the work of philosopher Hans Jonas on the “imperative of responsibility” (1980s).
I still believe capping executive pay is a necessary (though not sufficient) part of any practical solution. At least for publicly traded companies, we must cap compensation at no more than 50x minimum wage or something. No, executive compensation does not "come from a different pot" or any such nonsense. Don't listen to such silly arguments. Literally no shareholder wants to pay executives more unless they are executives at a different company where their buddies are on the board.
Some HN readers apparently don't like to hear this but I don't see any other way.
As for how we can reduce regulatory capture, I don't believe there is any good solution other than constant vigilance.
- Personal liability for making false or misleading statements to a regulatory agent
- Corporate death penalty. No more "too big to fail" nonsense. If there were actual competition among aviation manufacturers, the government could just dissolve the company and auction off the product lines to competitors.
Safeguarding against capture from the regulatory side is more difficult. But I guess term limits for decision-making positions could help, or having a good process for identifying and declaring conflicts of interest?
Not only would such a corporate death penalty never, ever get used against Boeing by the USG if such a thing were to exist, the USG would not advance such a circumstance where it could exist in the first place.
There are a number of industries and companies that will exist as long as the current USG does, at least for our lifetimes, due to decisions that were made long ago.
Examples include Lockheed, Boeing, and Microsoft.
Any solution to the problem needs to accept and address the levels of deep integration between these companies and the state.
> I still believe capping executive pay is a necessary (though not sufficient) part of any practical solution. At least for publicly traded companies, we must cap compensation at no more than 50x minimum wage or something.
I am not sure what capping executive salaries would achieve. You'd still get the same kind of people at the top anyway - even if the top salaries were identical to that of a janitor, the fact that you have power over others is its own motivator for many people out there. Case in point: politicians don't make outrageous amounts of money (compared to CEOs), but they are addicted to power and control just as well.
The goal is to increase the minimum wage to fifteen dollars an hour, which would allow USD 1.5M annual compensation (15 * 2000 * 50).
I think a fifty times multiplier is already too high though so I don't know if I can agree to pushing that up any further. The goal is specifically to change things for the better, to NOT maintain the status quo.
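As a back-of-the-envelope check of the numbers above (the 2000 paid hours per year is an assumption, roughly 40 hours a week for 50 weeks; none of these figures come from any statute):

```python
# Hypothetical pay-cap arithmetic from the comment above.
# Assumptions: $15/hr minimum wage, 2000 paid hours/year, 50x cap multiplier.
MIN_WAGE_PER_HOUR = 15
HOURS_PER_YEAR = 2000
CAP_MULTIPLIER = 50

annual_minimum_wage = MIN_WAGE_PER_HOUR * HOURS_PER_YEAR   # $30,000/year
executive_pay_cap = annual_minimum_wage * CAP_MULTIPLIER   # $1,500,000/year

print(f"Annual minimum wage: ${annual_minimum_wage:,}")
print(f"Executive pay cap:   ${executive_pay_cap:,}")
```

Note that under this scheme the cap scales automatically: any increase in the minimum wage raises the permitted executive compensation in proportion.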
Do you not think executive pay in the aggregate directly correlates to the value they provide to the company? Clearly not in terms of their individual labor, but the value of the unique guidance and leadership they provide. Of course there are plenty of examples of executives being paid too much, but those companies are providing opportunities for competitors with more appropriate pay structure to undercut them. In the long term, in a free market this should only be an issue with monopolies which are already regulated.
>The idea is that risk taking that jeopardizes lives has been encouraged but outsized rewards
That doesn't make much sense though. The size of rewards might incentivize how hard someone works for them and to what extremes, but the objective they're working towards depends on what the reward is, well, rewarding, which is a separate thing. It's perfectly possible to imagine a system where executive compensation was based purely around safety for example, say "$5 million for no flaw-derived fatalities during tenure of 10 years and another $5 million for no flaw-derived fatalities for anything manufactured during tenure for 10 years after retirement", or whatever exact numbers/details you wish. Of course, even that could be "outsized" in that too high+unbalanced a level might encourage the extreme of too little innovation. But merely saying "executives shouldn't get paid more than [number pulled out a random because who knows]" doesn't seem like a directed policy that'd help. And in fact is quite contrary to other comments calling for more personal responsibility and the like.
It's a perverse incentive; rather than acting for shareholders or the community, the scale of CEO reward encourages action to preserve the reward. The scale of CEO pay demands that the only objective is to remain in post. I have actually witnessed this - executives sacrificing their friendships, their relationships with their children, business performance and their health in order to cling on and get a £3m bonus.
>I’d see involving more skin in the game as the solution
like holding family members hostage? Or perhaps violent mutilation on failure? The financial route has been ramped up as far as can be imagined - $100m's are paid to CEOs to fail, and to wreck the economy, society and the planet at the same time.
> with current electoral law they are unlikely to ever be dislodged.
Fortunately, the individual states are not prevented from implementing reforms like Ranked Choice Voting, which would allow new parties to form and gain a reputation in politics at least at the state level.
If a party repeated that success across multiple states, that could be used to pressure the largest two parties to support similar reforms, like making Congressional districts return multiple representatives (proportional to the vote within that district).
A big part of the solution that does not involve the government is the market reaction driven by sometimes irrational consumer behavior. Some passengers are now wary of all recent Boeing designs, and even of the excellent 737. The margins are razor thin, and if 5% decide to book elsewhere, it can turn a profitable route into a flop.
So Boeing will pay not only the massive costs of withdrawing the MAX as orders for it plummet; this event will affect all of its business. This is why there is lobbying/political pressure to divert at least some of the blame: the implications are in the tens of billions over the next few years.
So the good news is that the company will be forced to clean up its act voluntarily, since a similar safety disaster could mean extinction.
I wish that kind of free-market-driven correction were possible, but passengers have no control over which airplanes they fly in. You don't book tickets with Boeing or Airbus; you book tickets with Southwest or Spirit, and even if they tell you the make of the plane you'll be flying, they can change it up at any time without refunding you.
How do we reduce corporatocracy and regulatory capture?
There are elections coming up in November. You'll have at least 2 federal races on the ballot. Figure out where the candidates stand with regards to corporatocracy and regulatory capture and vote accordingly.
If you don't like where any of the candidates stand, there will be another election in 2 years. In the mean time, let the candidates know how you feel. Donate to, or volunteer for the candidates who you want to win.
The FAA delegates oversight to non-FAA contractors working and being paid at Boeing. If they don't agree with Boeing mgmt., they are fired.
Also, the FAA manages safety using paperwork. The submitted paperwork did not match the excessive MCAS trim rate.
"The problem is it was compliant and not safe." is a false statement. MCAS performance did not match the paperwork filed with the FAA.
The Republican comment about it not being a system problem is also false. The FAA has a history of losing control until an accident. The result of the 1956 Grand Canyon accident was that they determined the ATC system "was not a system." In the case of the 737 MAX, the FAA latitude given to Boeing allowing it to outsource everything and self-certify bad designs is a system failure for airliner design and mfg.
Midair collision between a Trans World Airlines Lockheed 1049A and a United Airlines Douglas DC-7
I understand there were problems at Boeing and the FAA that led to the unexpected first crash. I'm honestly not that bothered that a plane crashed (besides the general sadness of any loss of life). Sometimes bad things happen... these are complex machines and regulatory systems. The process seems to be working and the problems are being addressed. Plane crashes are super, super rare.
What I don't understand, and does bother me, is why the plane continued to fly after the first crash and even, inexcusably, for days after the second crash! That seems to me a bigger indictment of Boeing's and the FAA's reluctance to put safety over money.
I think any engineer who won't refuse to work on dangerous projects, or won't insist on precautions, is as guilty as the manager pushing for unsafe changes. You have to step up and let them know you'll leave if they don't change. Talk to their managers if you have to. Save your emails too. Don't be party to stuff like this; a big spot of blood will be on your shirt at the end of the day if someone (or multiple someones) dies as a result. No one can foresee everything, but if you do foresee a problem and don't do anything about it, you are as guilty as anyone down the chain of "unfortunate situations" that allowed it to happen.
As with so many "root cause" investigations, there are deeper issues the more you dig. And when you peel back the layers, these issues turn out to be a gradual evolution of circumstances that implicates Boeing, Congress, and our collective responsibility for the matter.
It turns out (in my view) that this is just the inevitable result of a slow abandonment of the role that the military, federal government, and the US people elected to play in the development of civil aviation in the last century.
Maybe many have forgotten, but our civil aviation legacy largely came from R&D and production of aircraft during WW2 and later. Boeing, McDonnell, Grumman, Northrop, these were all companies that formed from that legacy. But what they produced besides planes was a government infrastructure that was expert in procuring, regulating, and evaluating the performance of not just aircraft but also companies.
And it also produced aircraft companies that worked closely with government -- but most importantly, with their concerns in mind. They were partly the customer!
Over 50 years, the pressures of public debt, cost of employees, efficiency, etc. meant that that expertise in the regulatory bodies gradually began to be hollowed out. Experts in government found themselves too bothered by the heavier and heavier constraints of government, and lured by the higher salaries of the private sector. The leaders of a new field were replaced by mere maintainers of it. We all know what happens as that changes, I think.
Government gradually also became less of a "customer" in the design and production of planes. And the airplane companies themselves became more profit-need-driven. They are basically like the auto manufacturers with huge workforces that need their insurance and IAM wages paid.
So what do you get in a situation like this? The inevitable:
Aircraft manufacturers that start to optimize their designs and production for low cost and "simple" variations on old designs (they don't want to invest in from-scratch new planes). Regulators who don't know how to properly evaluate new designs, and whose responsibilities are in any case basically staffed for and by the manufacturer, because few people want to be regulators. And a public that incentivizes all of this because we have other debts to pay and don't want to cough up the money in ticket prices or taxes.
Until a plane crashes.
Anyway, that's my take. So if they were honest, Congress would point the mirror at themselves too, in this exercise.
Despite the fetishization of “real engineering” and “standards”, once again we come down to the bare truth: software engineers are the only ones with true ethical standards. Easily measured by numbers of people killed. In fact, web front-end engineers are far more ethical than any others. If you work at Slack you are a better engineer than Boeing aeronautics. Never forget this.
Well, drones don't have any Slack instances running on them. If you want to exclude all defence contractor software-engineers from the list of software-engineers that's fine. I am comfortable with that. Though I suppose when your job is killing people, killing people is a sign of competence. I think Boeing wasn't trying to kill people with the 737 Max, but I'm no expert on corporate psychology.
The important thing to remember is that true engineering is building Electron and React apps not whatever nonsense all these guys who go around constantly accidentally killing people do. Aeronautics, such a poor field with poor standards. Any guy using React+Redux is a superior engineer. After all, he won't be killing anyone by accident.
I know you’re trolling, but even your troll argument is flawed. WayFair sold furniture (ahem, cages) to ICE. Facebook monetizes murderers with ads. People suck. Engineers in general are trying to make a better world. Sure “better” might mean “my missile is better than my enemy’s missile”. Or “my ads are more engaging than my competitor”. But engineers are building things. Through and through, engineers build. Not too many professions can honestly say they create.
Oh sure, you're going on the Evil/Good angle. That's fine. But let's set it aside for the moment because I will probably just agree with you on whichever stance you take, since I don't particularly care. I guess I did confuse the matter by using the word 'ethics', but I meant "not presenting your ability as far beyond what it is", so perhaps I can clarify and we can move on from that.
I'm going on the competent/incompetent angle which is orthogonal. The problem is that those guys just aren't good engineers. If they want to kill people and they're doing it, or they want to sell to ICE and they're doing it, or they want to monetize murders and they're doing it, then they're good at it. That's competence.
Aeronautics engineers, though. They want to make things fly and they fall instead. I don't think they've wanted to kill Boeing passengers, but you know, considering their skill, perhaps they did, perhaps they did. In which case, you're right, that's a field of highly competent passenger killers who we thought were incompetent passenger fliers!
Web developers writing React on the other hand? Zero people intended to be killed. Zero people killed. Billions of dollars of value. Creativity. Ability. Competence. The virtues of a real engineer. Perhaps one day other fields can emulate their techniques to understand how they do it.
I still don't understand how you guys managed to build a country where a major defense contractor and a government agency can be found guilty. There are just too many ways for this to fail, at all the different levels of corruption; I look at this marvelling at how the hell it still works.
If you don't see it as an incredible achievement and don't feel proud of it, you don't have any perspective on what the other parts of the world are like, and were like for most of history. Don't take things like this for granted.
They're not found guilty, government officials have "found that mistakes were made and the culture is rotten". The first brings legal punishments, the second can be waved away and swept under the rug with promises like "we'll do better next time". Maybe a few pawns will be fired to appease the public.
You've actually got a decent point, even if you haven't exactly grasped the nuances of what is going on. This is more of a public statement of intent that may turn into something, or may not. It's just the assembly making a statement. A conviction in the court of public opinion does not a conviction in a court of justice make, or an actual remediation of the regulatory process create. What it means for the parties involved is far from settled, even though the entry in the history books is written. This'll make good fodder for campaigns, may shift the color of up-and-coming civil servants' careers, and will almost certainly pave the way for any criminal probes going on.

Even with those, however, the criminal proceeding must be conducted against specific individuals, who will be exceedingly difficult to pin down beyond a reasonable doubt, or otherwise against the company itself, likely on some much lesser charge that will get negotiated down to something not terribly satisfying in the grand scheme of things. I mean, look at PG&E for an example of the outcome of a corporate actor being found guilty. Boeing (or at least its civil aviation arm) has much more to pull from in terms of slush expenditure to dedicate to softening the blow of this hiccup, short of concerted political action or punitive lawmaking.
It's a sign. Encouraging to some, but empty, dust-filled, and tasteless to many who expected much more of the system. I'm hoping these deaths will not be in vain and that this may kindle the political will to encourage greater corporate accountability and responsibility down the road; but beyond doing my small part, I can't say I entertain an excess of confidence in that happening. No offense meant, but it is also cold comfort indeed to be taking solace from "we could be even more corrupt".
Then again, maybe, just maybe we'll manage to pull a rabbit out of this.
After all, it was an act of Congress, and if a bunch of people have done it, well...
Stands to reason yet another group of folks can do it.