Singularity

The threat of superintelligence is, to Matheny, far worse than any epidemic we have ever experienced. “Some risks that are especially difficult to control have three characteristics: autonomy, self-replication and self-modification. Infectious diseases have these characteristics, and have killed more people than any other class of events, including war. Some computer malware has these characteristics, and can do a lot of damage. But microbes and malware cannot intelligently self-modify, so countermeasures can catch up. A superintelligent system [as outlined by Bostrom] would be much harder to control if it were able to intelligently self-modify.”

Meanwhile, the quiet work of these half-dozen researchers in labs and study rooms across the globe continues. As Matheny puts it: “existential risk [and superintelligence] is a neglected topic in both the scientific and governmental communities, but it’s hard to think of a topic more important than human survival.”

There are those who intend to gleefully embrace (their idea of) the singularity.

And those, like Matheny, who see one specific problem.

But chances are the singularity will be a compendium of things: a hacked SCADA system; loss of governmental control over our fleet of drones; AI evolving into a self-contained, alien entity; a witch’s brew of technology and climate change.

Which is why organizations like 100 Year Starship are important [1]: an actual starship may never leave this solar system, but the ways and means of surviving such a flight can readily be adapted to help survivors of a catastrophe on Earth.

Which is also why a chance to go to Mars is profound.

Grab your chance to leave Earth here.



  1. Yes, even if it’s a DARPA initiative.
