Bioterrorism risks for our civilization


In the daily hubbub of the crises that humanity is facing, we forget the many generations we hope are still to come. Not those who will live in 200 years, but in 1,000 or 10,000 years. I use the word “hope” because we are facing risks, called existential risks, that threaten to destroy our future by eradicating humanity. These risks are not just about major disasters, but about disasters that could end the story of the world itself.

We are in a more privileged position today than ever before.

The risks around bioterrorism

Human activity has constantly shaped the future of our planet. And while we are far from controlling natural disasters, we are developing technologies that can help mitigate them, or at least manage them.

Imperfect future

These risks remain little studied. There is a sense of helplessness and fatalism about them. People have been talking about apocalypses for millennia, but few have tried to prevent them.

Humans also have difficulty dealing with problems that have not yet occurred. If humanity disappears, the loss is at least equivalent to the loss of all living individuals and the frustration of their goals. But the loss would probably be much greater than that. Human extinction means the loss of the meaning generated by past generations, the lives of all future generations, and all the value they could have created.

If consciousness or intelligence is lost, it could mean that value itself becomes absent from the universe. This is a huge moral reason to work hard to prevent existential threats from becoming reality. And we must not fail even once in this pursuit. Over the past century, we have discovered or created new existential risks.

Supervolcanoes, for example, were only discovered in the early 1970s.

Nuclear war

While only two nuclear weapons have been used in war so far, at Hiroshima and Nagasaki during World War II, and nuclear stockpiles are down from their Cold War peak, it is a mistake to think that nuclear war is impossible. In fact, it might not even be improbable. The Cuban Missile Crisis came very close to turning nuclear. If we assume one such event every 69 years and a one in three chance of it becoming a nuclear war, the chance of such a disaster works out to about one in 200 per year. Even worse, the Cuban Missile Crisis was only the most famous case.
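
As a rough sanity check, here is the back-of-the-envelope arithmetic behind that figure, sketched in Python. The 69-year interval and the one-in-three escalation chance are the assumptions stated above, not measured data:

```python
# Back-of-the-envelope estimate of the annual chance of nuclear war.
# Assumptions from the text: one Cuban-Missile-Crisis-scale event every
# 69 years, and a one-in-three chance that such a crisis turns nuclear.
crisis_rate_per_year = 1 / 69   # frequency of a crisis of that magnitude
p_escalation = 1 / 3            # chance a given crisis goes nuclear

p_nuclear_war_per_year = crisis_rate_per_year * p_escalation
print(f"Annual probability: {p_nuclear_war_per_year:.4f} "
      f"(about 1 in {1 / p_nuclear_war_per_year:.0f})")
# Output: Annual probability: 0.0048 (about 1 in 207)
```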

The history of Soviet-American nuclear deterrence is full of close calls and serious mistakes. A large-scale nuclear war between major powers would kill hundreds of millions of people, directly or in the days that followed: an unimaginable catastrophe.

The real threat is nuclear winter, i.e., soot lofted into the stratosphere, causing the world to cool and dry over several years. Modern climate simulations show that this could make agriculture impossible in much of the world for years. If this scenario were to occur, billions of people would starve to death, leaving only scattered survivors who could be picked off by other threats such as disease.

Bioengineered pandemic

Natural pandemics have killed more people than wars. However, natural pandemics are unlikely to be an existential threat: some people are generally resistant to the pathogen, and the offspring of the survivors would be more resistant still. Evolution also does not favor parasites that wipe out their hosts.

That is why syphilis went from a virulent killer to a chronic disease as it spread through Europe. Most of the work on biological weapons has been done by governments looking for controllable agents, since eliminating humanity is not useful from a military point of view. But there are always people who might want to do things simply because they can.

Others have higher aims.

On intelligence

Intelligence is very powerful. Being smart is a real advantage for people and organizations, which is why considerable effort goes into finding ways to improve our individual and collective intelligence. The problem is that intelligent entities are good at achieving their goals, but if the goals are poorly defined, they can use their power to intelligently pursue disastrous ends. There is no reason to think that intelligence by itself will make something behave nicely and morally.

In fact, it can be proven that some types of superintelligent systems would not obey moral rules, even if those rules were true. There are good reasons to think that some technologies could speed things up much faster than current societies can handle. Similarly, we do not know how dangerous the different forms of superintelligence would be, nor what mitigation strategies would really work.

It is very difficult to reason about future technology that we do not yet have, or about intelligences greater than our own.

Nanotechnology

Nanotechnology is the control of matter with atomic or molecular precision. In itself, this is not dangerous; on the contrary, it would be very good news for most applications.

The problem is that, as with biotechnology, the increase in power also increases the potential for abuses that are difficult to defend against. The most obvious risk is that atomically precise manufacturing seems ideal for the quick and cheap production of things like weapons.

In a world where any government could “print” large quantities of autonomous or semi-autonomous weapons, the arms race could become very fast and therefore unstable, since striking first before the enemy becomes too powerful could be tempting.
