This could also be related to a broader tendency to promote "performers": people more likely to take risks or shortcuts (sometimes without realizing the risks involved), and people who use fewer resources (lower safety margins, fewer overlapping checks, etc.).
It's sadly difficult to be recognized for excellence in preventing surprises, especially since it's so hard to quantify.
I think this is absolutely part of the issue. Having previously worked in the industry, I saw that people who raise concerns are sometimes viewed as pariahs who slow down the work. Because so many of the concerns involve low-probability events, it's possible to build a career rolling the dice without being cognizant of (or open about) the risks. When bad things do happen (thankfully, major catastrophes are still relatively rare), it's hard for people to openly credit the mitigations that could have prevented them, because they believe instituting those mitigations on future projects will just slow things down. It creates an "ends justify the means" culture where bad judgment and integrity violations are considered acceptable as long as the project/program gets completed.