
Consistently Candid


By: Sarah Hastings-Woodhouse

About this title

AI safety, philosophy and other things. © 2025 Consistently Candid. Philosophy · Social Sciences
  • #20 Frances Lorenz on the emotional side of AI x-risk, being a woman in a male-dominated online space & more
    May 14 2025

    In this episode, I chatted with Frances Lorenz, events associate at the Centre for Effective Altruism. We covered our respective paths into AI safety, the emotional impact of learning about x-risk, what it's like to be female in a male-dominated community and more!

    Follow Frances on Twitter

    Subscribe to her Substack

    Apply for EAG London!

    52 min
  • #19 Gabe Alfour on why AI alignment is hard, what it would mean to solve it & what ordinary people can do about existential risk
    Apr 13 2025

    Gabe Alfour is a co-founder of Conjecture and an advisor to Control AI, both organisations working to reduce risks from advanced AI.

    We discussed why AI poses an existential risk to humanity, what makes this problem very hard to solve, why Gabe believes we need to prevent the development of superintelligence for at least the next two decades, and more.

    Follow Gabe on Twitter

    Read The Compendium and A Narrow Path

    1 hr 37 min
  • #18 Nathan Labenz on reinforcement learning, reasoning models, emergent misalignment & more
    Mar 2 2025

    A lot has happened in AI since the last time I spoke to Nathan Labenz of The Cognitive Revolution, so I invited him back on for a whistle-stop tour of the most important developments we've seen over the last year!

    We covered reasoning models, DeepSeek, the many spooky alignment failures we've observed in the last few months & much more!

    Follow Nathan on Twitter

    Listen to The Cognitive Revolution

    My Twitter & Substack

    1 hr 46 min
No reviews yet