Episodes

  • Trailer
    Oct 17 2018

    We humans could have a bright future ahead of us that lasts billions of years. But we have to survive the next 200 years first.

    2 min
  • Trailer 2: Bill, Elon and Stephen
    Oct 24 2018

    Why are smart people warning us about artificial intelligence? As machines grow smarter and able to improve themselves, we run the risk of them developing beyond our control. But AI is just one of the existential risks emerging in our future.

    1 min
  • Fermi Paradox
    Nov 7 2018

    Ever wondered where all the aliens are? It’s actually very weird that, as big and old as the universe is, we seem to be the only intelligent life. In this episode, Josh examines the Fermi paradox, and what it says about humanity’s place in the universe. (Original score by Point Lobo.)

    Interviewees: Anders Sandberg, Oxford University philosopher and co-creator of the Aestivation hypothesis; Seth Shostak, director of SETI; Toby Ord, Oxford University philosopher.

    37 min
  • Great Filter
    Nov 7 2018

    The Great Filter hypothesis says we’re alone in the universe because the process of evolution contains some filter that prevents life from spreading into the universe. Have we passed it or is it in our future? Humanity’s survival may depend on the answer. (Original score by Point Lobo.)

    Interviewees: Robin Hanson, George Mason University economist (creator of the Great Filter hypothesis); Toby Ord, Oxford University philosopher; Donald Brownlee, University of Washington astrobiologist (co-creator of the Rare Earth hypothesis); Phoebe Cohen, Williams College paleontologist.

    43 min
  • X Risks
    Nov 7 2018

    Humanity could have a future billions of years long – or we might not make it past the next century. If we have a trip through the Great Filter ahead of us, then we appear to be entering it now. It looks like existential risks will be our filter. (Original score by Point Lobo.)

Interviewees: Nick Bostrom, Oxford University philosopher and founder of the Future of Humanity Institute; David Pearce, philosopher and co-founder of the World Transhumanist Association (Humanity+); Robin Hanson, George Mason University economist (creator of the Great Filter hypothesis); Toby Ord, Oxford University philosopher; Sebastian Farquhar, Oxford University philosopher.

    39 min
  • Natural Risks
    Nov 14 2018

Humans have faced existential risks since our species was born. Because we are Earthbound, what happens to Earth happens to us. Josh points out that there's a lot that can happen to Earth, like gamma-ray bursts, supernovae, and a runaway greenhouse effect. (Original score by Point Lobo.)

    Interviewees: Robin Hanson, George Mason University economist (creator of the Great Filter hypothesis); Ian O’Neill, astrophysicist and science writer; Toby Ord, Oxford University philosopher.

    37 min
  • Artificial Intelligence
    Nov 16 2018

    An artificial intelligence capable of improving itself runs the risk of growing intelligent beyond any human capacity and outside of our control. Josh explains why a superintelligent AI that we haven’t planned for would be extremely bad for humankind. (Original score by Point Lobo.)

Interviewees: Nick Bostrom, Oxford University philosopher and founder of the Future of Humanity Institute; David Pearce, philosopher and co-founder of the World Transhumanist Association (Humanity+); Sebastian Farquhar, Oxford University philosopher.

    42 min
  • Biotechnology
    Nov 21 2018

Natural viruses and bacteria can be deadly enough; the 1918 Spanish Flu killed 50 million people in four months. But risky new research, carried out in an unknown number of labs around the world, is creating even more dangerous human-made pathogens. (Original score by Point Lobo.)

    Interviewees: Beth Willis, former chair, Containment Laboratory Community Advisory Committee; Dr Lynn Klotz, senior fellow at the Center for Arms Control and Non-Proliferation.

    57 min