The Big Red Button

When they developed the atom bomb, some of the scientists on the team speculated that the explosion might ignite the atmosphere and end all life on Earth. So they argued endlessly over the risk and the consequences.

In the end they pushed the button. 

There were many in that bunker who truly believed there was a good chance they could end the world and everything on it.

More recently the world faced a similar choice. In the search for the Higgs boson, the Large Hadron Collider at CERN was fired up. It was the largest, most complicated machine ever produced by humanity, and there were many who tried to warn us of possibly catastrophic failure. Some of those critics actually worked at CERN; some had even helped design and build the device.

They argued that we had no way to know whether smashing together high-speed particles would create antimatter or even a tiny black hole. Antimatter and the ordinary matter our universe is made of annihilate each other, violently, on contact. We have no practical way to hold or contain antimatter if we did produce it in significant quantity; the idea was to use magnetism to suspend a tiny amount of it, since everything we have is made of matter and so cannot be used to isolate and contain it. The resulting explosion could have destroyed the Earth. Ditto if we ended up with a tiny black hole.

We want to know what the big red button does, and there is nothing in this world to stop us pressing it. 
No matter the risk. 


When it comes to bioengineering and working with deadly viruses, we perform experiments that carry potential existential risks.

History is full of well-meaning scientists who wondered what lay hidden behind the locked door and what that blinking red button did. It is the same curiosity that sparked the search for knowledge in the first place.

Our history of accidental spills at research facilities, containment breaches at labs dealing with deadly pathogens, and just plain stupidity has not given us much confidence in the precautions taken. As humans, our ambition is often matched only by our ineptitude, and the bigger the project, the more contractors involved in its creation, which leads to complexity on a scale that has never existed before. That complexity is compounded by the fact that the work often involves teams from different countries, working across distance, language, and the imperial-versus-metric divide. There is much room for failure.

The irony of the specific risks discussed here is not what would happen if these projects fail but what could happen if they succeed. That they could fail, or suffer fallout from human error, was always a given.
Death by mad scientist.

Nothing discussed here should be read to mean we should not play with forces that are bigger than us. But if we must take risks with our planet and all life on it, we need to at least take a moment to consider them.

The big question is: do we, the custodians of this world, deserve to know what the dangers are every time someone pushes the red button, or are we likely to panic and try to stop the pursuit of knowledge?


Should the library even be allowed to keep books on the dark arts? Why should such books exist?

Is some knowledge too dangerous?

All the major players in this opera have invested heavily in the two new frontlines: AI and space travel.

With AI we don't yet know the magnitude of how much we don't know. While we are sure we know enough to get things started, and can see some of the obvious dangers of creating an intelligence that exceeds our own, we are unable to resist the urge to push that button. And press it we will.

Every civilisation has progressed at a pace that sees incremental improvements in technology over the ages. This is how we add the many layers of our technology across the short lives we are given. Who can predict the actions of future generations that take up the mantle our generation leaves them?


While this multi-layered progress spanning time and space is ongoing in all areas of science, every so often we encounter a new concept that can only be described as a leap: the splitting of the atom, the ability to reach escape velocity and go into orbit, the manipulation of DNA, and the creation of general AI.

We don't even know how much we don't know, yet we press the button. Every time. 

This list of existential threats to humanity from science is not exhaustive; it is just what is on my mind. If we added the risk of contact with extraterrestrial life, things might get weirder, and I might risk losing both readers and my credibility, so we can limit our exercise to the here and now.

Despite the fact that they must have known all of this, they push the button anyway, every time.

š“œ š“Ÿš“Ŗš“»š“Ŗš““ 
Dec 2021











