Singularity Summit as part of National Science Week

On July 19, 2012, in News, by Adam Ford

National Science Week encourages creativity and innovation when planning an event for the Week’s festivities.

Singularity Summit Australia on Science Week Website

The Singularity Summit Australia seeks to fulfill the National Science Week objectives to:

  1. Promote and encourage interest in the areas of science, engineering, technology and/or innovation; and
  2. Communicate the relevance of this area in everyday life

The topics covered at the summit are meant to convey big-picture issues: the future of intelligent life, threats to humanity, and technologies that may fundamentally change the human condition.

We will be dealing with subsidiary questions that arise when you try to think about these things:

  • How is it even possible to know anything about these things?
  • Can we make a study of these questions that amounts to more than an expression of our prejudices and biases?
  • Are existential risks too large for people to grasp, leading us instead to focus on smaller risks, or on the kinds of risks humanity has faced before?

The seriousness of a risk is the product of the probability of the risk and the magnitude of its impact. Existential risk, or existence risk, is the type that affects the human species as a whole. The case can be made that existential risk reduction is the most important task for humanity: if you could reduce existential risk by a tenth of a percent, it could trump other reductions in risk (like curing cancer).
Even so, it is not clear that the probabilities of existential risks are all that small given a long enough timespan – consider advanced biotech, nanotech and AI. It is time to really think about these types of risks: not only smaller-scale risks like road accidents or drug problems, but the huge risks that threaten the very survival of our species.
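
To make this expected-value reasoning concrete, here is a minimal sketch in Python. All of the population and risk figures below are purely illustrative assumptions (not estimates from the summit); the point is only that a tiny reduction in extinction probability can dominate a large reduction in a smaller-scale risk once the number of future lives at stake is factored in.

    # Illustrative expected-value comparison; all numbers are hypothetical assumptions.
    # Seriousness of a risk ~ probability of the risk * magnitude of its impact.

    current_population = 7e9          # assumed people alive today
    future_lives_at_stake = 1e16      # assumed potential future lives lost to extinction

    # Intervention A: cure a disease that would otherwise kill 1% of people alive today.
    lives_saved_cure = 0.01 * current_population

    # Intervention B: reduce the probability of human extinction by 0.1 percentage points.
    extinction_risk_reduction = 0.001
    expected_lives_saved_xrisk = extinction_risk_reduction * future_lives_at_stake

    print(f"Cure-style intervention: ~{lives_saved_cure:.2e} expected lives saved")
    print(f"Existential risk cut:    ~{expected_lives_saved_xrisk:.2e} expected lives saved")

Under these assumed numbers the existential-risk intervention comes out several orders of magnitude ahead, which is the shape of the argument sketched above.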

As for how and who should be thinking about these risks: is this something governments should be focusing on, or local communities, or individuals?
We are still in the early days of the study of existential risk. It is certainly possible to conceive of scenarios where attempts to reduce existential risk actually increase it. Most of the really big existential risks, in my view, will arise from human activity and from future technologies. It gets a lot harder to know what we can actually do to mitigate risk – for instance, to make a dent in the probability of a biotech weapon going out of control, or of an advanced AI acting on self-created goals that are contrary to humanity's goals. Risks relating to AI and super-intelligence could arise from relatively small portions of focused resources. There are very tricky problems in working out the dynamics of how we control an advanced AI – an extremely intelligent and therefore extremely powerful entity – with deep puzzles in areas like decision theory. So it seems very important to work out how to control or seed an AI, so that by the time we actually have the ability to create one, we know how to mitigate the associated risks (the control problem and so on). Similar questions apply to other areas as well.
Though it is difficult to know how to solve specific problems – forging a global immune system against synthetic bio-weapons, containing rampant nanotech, or dissuading an AI from terraforming the planet – things like international peace and collaboration between nations, which foster global coordination, together with a careful, deliberative style of thinking and foresight, would provide an environment where solving large-scale problems is easier. These sorts of global skills may not just be desirable but may be necessary if we are to solve some of the major problems we might face this century.
