The End Of The World with Josh Clark

Updated 22 days ago

Rank #197 in Science & Medicine category

Categories: Science & Medicine, Society & Culture, Technology, Natural Sciences

We humans could have a bright future ahead of us that lasts billions of years. But we have to survive the next 200 years first. Join Josh Clark of Stuff You Should Know for a 10-episode deep dive that explores the future of humanity and finds dangers we have never encountered before lurking just ahead. And if we humans are alone in the universe and we don't survive, intelligent life dies out with us.


iTunes Ratings

4,621 ratings
5 stars: 4347
4 stars: 117
3 stars: 59
2 stars: 39
1 star: 59

Wow!

By Landon boss hoe - Jun 14 2019

Completely engulfed my mind! Beautiful information.

Best Podcast I’ve ever heard

By Ggyrddjdfrikmddd - Jun 04 2019

So worth the time to listen



Warning: This is a serial podcast

This means episodes are meant to be heard in order from the very start. Here are the 10 best episodes of the series anyway, though!

Rank #1: Trailer: The End Of The World with Josh Clark


We humans could have a bright future ahead of us that lasts billions of years. But we have to survive the next 200 years first. Learn more about your ad-choices at https://news.iheart.com/podcast-advertisers

Oct 17 2018
2 mins

Rank #2: Trailer 2: Bill, Elon and Stephen


Why are smart people warning us about artificial intelligence? As machines grow smarter and able to improve themselves, we run the risk of them developing beyond our control. But AI is just one of the existential risks emerging in our future.

Oct 24 2018
1 min

Rank #3: EP01: Fermi Paradox


Ever wondered where all the aliens are? It’s actually very weird that, as big and old as the universe is, we seem to be the only intelligent life. In this episode, Josh examines the Fermi paradox, and what it says about humanity’s place in the universe. (Original score by Point Lobo.) Interviewees: Anders Sandberg, Oxford University philosopher and co-creator of the Aestivation hypothesis; Seth Shostak, director of SETI; Toby Ord, Oxford University philosopher.

Nov 07 2018
39 mins

Rank #4: EP02: Great Filter


The Great Filter hypothesis says we’re alone in the universe because the process of evolution contains some filter that prevents life from spreading into the universe. Have we passed it, or is it in our future? Humanity’s survival may depend on the answer. (Original score by Point Lobo.) Interviewees: Robin Hanson, George Mason University economist (creator of the Great Filter hypothesis); Toby Ord, Oxford University philosopher; Donald Brownlee, University of Washington astrobiologist (co-creator of the Rare Earth hypothesis); Phoebe Cohen, Williams College paleontologist.

Nov 07 2018
45 mins

Rank #5: EP03: X Risks


Humanity could have a future billions of years long – or we might not make it past the next century. If we have a trip through the Great Filter ahead of us, then we appear to be entering it now. It looks like existential risks will be our filter. (Original score by Point Lobo.) Interviewees: Nick Bostrom, Oxford University philosopher and founder of the Future of Humanity Institute; David Pearce, philosopher and co-founder of the World Transhumanist Association (Humanity+); Robin Hanson, George Mason University economist (creator of the Great Filter hypothesis); Toby Ord, Oxford University philosopher; Sebastian Farquahar, Oxford University philosopher.

Nov 07 2018
41 mins

Rank #6: EP04: Natural Risks


Humans have faced existential risks since our species was born. Because we are Earthbound, what happens to Earth happens to us. Josh points out that there’s a lot that can happen to Earth, like gamma-ray bursts, supernovae, and a runaway greenhouse effect. (Original score by Point Lobo.) Interviewees: Robin Hanson, George Mason University economist (creator of the Great Filter hypothesis); Ian O’Neill, astrophysicist and science writer; Toby Ord, Oxford University philosopher.

Nov 14 2018
39 mins

Rank #7: EP05: Artificial Intelligence


An artificial intelligence capable of improving itself runs the risk of growing intelligent beyond any human capacity and outside of our control. Josh explains why a superintelligent AI that we haven’t planned for would be extremely bad for humankind. (Original score by Point Lobo.) Interviewees: Nick Bostrom, Oxford University philosopher and founder of the Future of Humanity Institute; David Pearce, philosopher and co-founder of the World Transhumanist Association (Humanity+); Sebastian Farquahar, Oxford University philosopher.

Nov 16 2018
44 mins

Rank #8: EP06: Biotechnology


Natural viruses and bacteria can be deadly enough; the 1918 Spanish Flu killed 50 million people in four months. But risky new research, carried out in an unknown number of labs around the world, is creating even more dangerous human-made pathogens. (Original score by Point Lobo.) Interviewees: Beth Willis, former chair, Containment Laboratory Community Advisory Committee; Dr. Lynn Klotz, senior fellow at the Center for Arms Control and Non-Proliferation.

Nov 21 2018
59 mins

Rank #9: EP07: Physics Experiments


Surprisingly, the field of particle physics poses a handful of existential threats, not just for us humans but for everything alive on Earth, and in some cases the entire universe. Poking around on the frontier of scientific understanding has its risks. (Original score by Point Lobo.) Interviewees: Don Lincoln, Fermi National Laboratory senior experimental particle physicist; Ben Shlaer, University of Auckland cosmologist; Daniel Whiteson, University of California, Irvine astrophysicist; Eric Johnson, University of Oklahoma professor of law.

Nov 23 2018
1 hour 16 mins

Rank #10: EP08: Embracing Catastrophe


We humans are our own worst enemies when it comes to what it will take to deal with existential risks. We are loaded with cognitive biases, can’t coordinate on a global scale, and see future generations as freeloaders. Seriously, are we going to survive? (Original score by Point Lobo.) Interviewees: Nick Bostrom, Oxford University philosopher and founder of the Future of Humanity Institute; Toby Ord, Oxford University philosopher; Anders Sandberg, Oxford University philosopher; Sebastian Farquahar, Oxford University philosopher; Eric Johnson, University of Oklahoma professor of law.

Nov 28 2018
49 mins

Similar Podcasts