singularityweblog.com/stuart-armstrong-existential-risks
Stuart Armstrong is a James Martin research fellow at the Future of Humanity Institute at Oxford, where he looks at issues such as existential risks in general and artificial intelligence in particular. Stuart is also the author of Smarter Than Us: The Rise of Machine Intelligence, and after participating in a fun futurist panel discussion with him - Terminator or Transcendence - I knew it was time to interview Armstrong on Singularity 1 on 1.
During our conversation with Stuart we cover issues such as: his transition from hard science into futurism; the major existential risks to our civilization; the mandate of the Future of Humanity Institute; how we can know whether AI is safe, and what the best approaches to it are; why experts are all over the map; humanity's chances of survival...
My favorite quote from this interview with Stuart Armstrong is: "If we don't get whacked by the existential risks, the future is probably going to be wonderful."