
2nd Order Thinkers.

All You Need For AI Risks.

13 Sep 2025

Description

✉️ Stay Updated With 2nd Order Thinkers: https://www.2ndorderthinkers.com/

I translate the latest AI research into plain English and answer your most challenging questions so you can build your own informed view on AI.

+++

Executives keep saying they “understand AI risk.” The evidence disagrees. This episode is the antidote: a plain-English tour of MIT’s AI Risk Repository, a living map of 1,600+ failure modes across 65 frameworks, so you stop guessing and start checking.

In this episode, we:
- Decode MIT’s two-part taxonomy (who caused the harm, whether it was intentional, and when it appears) and why most failures surface after deployment
- Turn the chaos of “65 frameworks” into one usable language for leaders, not vendors
- Walk through 2025 failures you’ll actually recognize (healthcare models missing critical deterioration; AI-scaled extortion and employment scams)
- Map a pragmatic playbook: pick the one domain that could sink you, shortlist five visible and expensive risks, and write the narrative that gets your team and board to act

📖 Read the full article here: https://www.2ndorderthinkers.com/p/ai-risk-isnt-a-tech-problem-but-a

👍 If you enjoyed this episode:
- Like & Subscribe to get future deep dives without the hype
- Comment: What’s the one AI risk that could actually hurt your org next quarter?
- Share it with the person whose reputation depends on AI working

🔗 Connect with me on LinkedIn: https://www.linkedin.com/in/jing--hu/

Stay curious, stay skeptical 🧠

This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit www.2ndorderthinkers.com/subscribe
