
Bioterrorism: risks for our civilization?

In the daily hubbub of the crises facing humanity, we forget the many generations we hope are still to come. Not those who will live in 200 years, but in 1,000 or 10,000 years. We use the word “hope” because we face risks, called existential risks, that threaten to wipe out humanity.

These risks are not just about major disasters, but about disasters that could end history itself. We are in a more privileged position today.

The risks around bioterrorism

Human activity has consistently shaped the future of our planet. And while we are far from controlling natural disasters, we are developing technologies that can help mitigate them, or at least manage them.

Imperfect future

These risks remain understudied. There is a sense of helplessness and fatalism about them. People have been talking about apocalypses for millennia, but few have tried to prevent them.

Humans also have difficulty dealing with problems that have not yet occurred. If humanity disappears, the loss equals at least the loss of all living individuals and the frustration of their goals. But the loss would probably be much greater than that. Human extinction means the loss of the meaning generated by past generations, the lives of all future generations, and all the value they could have created.

If consciousness or intelligence are lost, it could mean that value itself becomes absent from the universe. This is an enormous moral reason to work hard to prevent existential threats from becoming reality. And we must not fail even once in this pursuit. In the last century, we have discovered or created new existential risks.

Supervolcanoes, for example, were only discovered in the early 1970s.

Nuclear warfare

While only two nuclear weapons have been used in war so far, at Hiroshima and Nagasaki during World War II, and nuclear stockpiles are down from their Cold War peak, it is a mistake to think that nuclear war is impossible. In fact, it might not be improbable. The Cuban missile crisis came very close to turning nuclear.

If we assume one such event every 69 years and a one-in-three chance of it escalating to nuclear war, the chance of such a catastrophe is about one in 200 per year. Even worse, the Cuban missile crisis was only the most famous case. The history of Soviet-American nuclear deterrence is full of close calls and dangerous mistakes.
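
As a quick sanity check of that arithmetic, here is a minimal sketch in Python. The 69-year crisis interval and the one-in-three escalation odds come from the paragraph above; the 100-year horizon is our own illustrative assumption, not a figure from the text:

```python
# Back-of-the-envelope check of the "one in 200 per year" figure.

# Assumptions taken from the text above:
crisis_interval_years = 69   # one Cuban-missile-style crisis every 69 years
p_escalation = 1 / 3         # one-in-three chance a crisis turns nuclear

# Annual probability of a nuclear catastrophe:
p_per_year = (1 / crisis_interval_years) * p_escalation
print(f"Annual probability: {p_per_year:.5f} (about 1 in {1 / p_per_year:.0f})")
# -> about 1 in 207, i.e. roughly "one in 200 per year"

# Illustrative assumption (not in the text): if that rate stayed constant,
# the cumulative probability over a century would be:
horizon_years = 100
p_cumulative = 1 - (1 - p_per_year) ** horizon_years
print(f"Chance over {horizon_years} years: {p_cumulative:.1%}")
# -> roughly 38%
```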

A large-scale nuclear war between great powers would kill hundreds of millions of people, directly or in the immediate aftermath: an unimaginable catastrophe.

The real threat is nuclear winter: soot lofted into the stratosphere that would cool and dry the world over several years. Modern climate simulations show that this could preclude agriculture in much of the world for years. If this scenario were to occur, billions of people would starve to death, leaving only scattered survivors who could be picked off by other threats such as disease.

Bioengineered pandemic

Natural pandemics have killed more people than wars. However, natural pandemics are unlikely to be an existential threat: some people are usually resistant to the pathogen, and the offspring of survivors would be more resistant. Nor does evolution favor parasites that annihilate their hosts, which is why syphilis went from a virulent killer to a chronic disease as it spread across Europe.

Most of the work on biological weapons has been carried out by governments in search of something controllable, because wiping out humanity is not useful from a military point of view. But there are always people who might want to do things simply because they can. Others have higher purposes.

On intelligence

Intelligence is very powerful. Being intelligent is a real advantage for people and organizations. This means that considerable effort is needed to find ways to improve our individual and collective intelligence. The problem is that intelligent entities are good at achieving their goals, but if the goals are poorly defined, they can use their power to intelligently achieve disastrous ends. There is no reason to believe that intelligence itself will make something behave nicely and morally. In fact, it is possible to prove that certain types of super-intelligent systems would not obey moral rules, even if those rules were true.

There are good reasons to believe that certain technologies can accelerate things much faster than current societies can handle. We also do not know how dangerous the different forms of superintelligence would be, or which mitigation strategies would actually work. It is very difficult to reason about a future technology that we do not yet have, or about intelligences greater than our own.

Nanotechnology

Nanotechnology is the control of matter with atomic or molecular precision. In itself, it is not dangerous; on the contrary, it would be very good news for most applications.

The problem is that, as with biotechnology, increasing power also increases the potential for abuses that are difficult to defend against. The most obvious risk is that atomically precise manufacturing seems ideal for the rapid, cheap production of things like weapons. In a world where any government could “print” large quantities of autonomous or semi-autonomous weapons, arms races could become very fast and therefore unstable, since a first strike before the enemy's advantage grows too large could be tempting.