Eliezer Yudkowsky is a researcher, writer, and philosopher on the topic of superintelligent AI. Please support this podcast by checking out our sponsors:
- Linode: https://linode.com/lex to get $100 free credit
- House of Macadamias: https://houseofmacadamias.com/lex and use code LEX to get 20% off your first order
- InsideTracker: https://insidetracker.com/lex to get 20% off
EPISODE LINKS:
Eliezer's Twitter: https://twitter.com/ESYudkowsky
LessWrong Blog: https://lesswrong.com
Eliezer's Blog page: https://www.lesswrong.com/users/eliezer_yudkowsky
Books and resources mentioned:
1. AGI Ruin (blog post): https://lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities
2. Adaptation and Natural Selection: https://amzn.to/40F5gfa
PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
YouTube Full Episodes: https://youtube.com/lexfridman
YouTube Clips: https://youtube.com/lexclips
SUPPORT & CONNECT:- Check out the sponsors above, it's the best way to support this podcast- Support on Patreon: https://www.patreon.com/lexfridman- Twitter: https://twitter.com/lexfridman- Instagram: https://www.instagram.com/lexfridman- LinkedIn: https://www.linkedin.com/in/lexfridman- Facebook: https://www.facebook.com/lexfridman- Medium: https://medium.com/@lexfridman
OUTLINE:
Here are the timestamps for the episode. On some podcast players you should be able to click a timestamp to jump to that time.
(00:00) - Introduction
(05:19) - GPT-4
(28:00) - Open sourcing GPT-4
(44:18) - Defining AGI
(52:14) - AGI alignment
(1:35:06) - How AGI may kill us
(2:27:27) - Superintelligence
(2:34:39) - Evolution
(2:41:09) - Consciousness
(2:51:41) - Aliens
(2:57:12) - AGI Timeline
(3:05:11) - Ego
(3:11:03) - Advice for young people
(3:16:21) - Mortality
(3:18:02) - Love