Preparedness paradox

The preparedness paradox is the proposition that if a society or individual acts effectively to mitigate a potential disaster, such as a pandemic, natural disaster or other catastrophe, so that it causes less harm, the avoided danger will be perceived as having been much less serious because of the limited damage actually caused. The paradox is the incorrect perception that there was no need for careful preparation because little harm occurred, when in reality the harm was limited precisely because of the preparation. Several cognitive biases can consequently hamper proper preparation for future risks.

Background
The term "preparedness paradox" has been used occasionally since at least 1949 in different contexts, usually military and financial. It regained traction in reference to the COVID-19 pandemic and the overall government response worldwide.

Another notable citation of the term was in 2017 by Roland Berger regarding executives in the aerospace and defense industry: almost two-thirds of those surveyed reported that they were well prepared for geopolitical changes, about which they could do nothing, while feeling unprepared in areas such as changes in technology and innovation, to which they should be much more able to respond. In contrast, other surveys found that boards and financial professionals were increasingly concerned about geopolitical risk. Berger concluded that there was an urgent need for more and better business strategies throughout the industry to close this gap in preparedness.

Cognitive biases
Organisms with faster life histories and shorter lives are disproportionately affected by chaotic or hostile environments, and innately have a greater fear of environmental disasters or emergencies. Organisms with slower life histories, such as humans, may feel less urgency about such events; instead, they have more time and ability to prepare for them.

Cognitive biases play a large role in this lack of urgency, hampering efforts to prevent disasters. These include over-optimism, in which the severity of a disaster is underestimated, compounded by the fact that many disasters do not reach their breaking point until it is too late to take action. Under over-optimism and normalcy bias, people believe that disasters will happen elsewhere, and that even if they do happen locally, only their neighbors will be affected.

Another obstacle to preparedness is the interval between disasters. When a long time passes between disasters, there is less urgency to prepare, because fewer people remember the last disaster and its emotional impact on the group is reduced. This effect is heightened when some action was taken to mitigate the earlier disaster, which further dims the memory of the original danger and its consequences.

Financial concerns can also contribute to the preparedness paradox: there is a tendency to over-value known short-term costs and to under-value unknown long-term rewards. Because preparing for disasters is expensive in the short term and its long-term value cannot be determined in advance, the choice not to prepare may be made, with potentially catastrophic consequences.
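The cost asymmetry described above can be sketched as a toy expected-cost calculation. All figures here are hypothetical, chosen only to show how underestimating a disaster's probability and cost makes preparation look wasteful, even when the true figures favor it.

```python
def expected_cost(prep_cost, disaster_cost, p_disaster, damage_reduction):
    """Expected total cost with and without preparation.

    prep_cost:        known, up-front cost of preparing
    disaster_cost:    cost of the disaster if unmitigated
    p_disaster:       probability the disaster occurs
    damage_reduction: fraction of damage that preparation avoids
    """
    with_prep = prep_cost + p_disaster * disaster_cost * (1 - damage_reduction)
    without_prep = p_disaster * disaster_cost
    return with_prep, without_prep

# Perceived figures: the disaster seems unlikely and cheap,
# so preparation looks like a waste (about 10.2 vs 2.0).
perceived = expected_cost(prep_cost=10, disaster_cost=200,
                          p_disaster=0.01, damage_reduction=0.9)

# Actual figures: the disaster is likelier and far costlier,
# so preparation clearly pays off (about 30 vs 200).
actual = expected_cost(prep_cost=10, disaster_cost=2000,
                       p_disaster=0.1, damage_reduction=0.9)

print(perceived, actual)
```

Under the perceived figures the known short-term cost dominates and "do not prepare" wins; under the actual figures the same decision is catastrophic, which is the trade-off the paragraph above describes.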

Examples
Levees are structures that run parallel to rivers and are meant to protect against flooding. The long-term safety they afford can create the misperception that the area behind them is "flood free", leading to unsafe land development in the floodplain.

Preparing for a pandemic is an example of the preparedness paradox. If sufficient investment in preparation meant that a small outbreak could be identified and contained before it spread, then no mass deaths or economic failure would result, and there would be no clear evidence that the preparation had been warranted.

The depletion of Earth's ozone layer and acid rain were both significantly mitigated, but in 2022 the right-wing political commentator Matt Walsh pointed to the lack of current public discourse on them to suggest that those threats had never been anything to worry about.

Historical perspective can also contribute to the preparedness paradox. With hindsight, the preventative action taken against the Year 2000 problem has been described by some historians as an "overreaction" rather than as a successful effort to prepare for an impending problem.