How to Actually Change Your Mind by Eliezer Yudkowsky | Goodreads

Rationality: From AI to Zombies #2

How to Actually Change Your Mind

“I believe that it is right and proper for me, as a human being, to have an interest in the future, and what human civilization becomes in the future. One of those interests is the human pursuit of truth, which has strengthened slowly over the generations (for there was not always science). I wish to strengthen that pursuit further, in this generation. That is a wish of mine, for the Future. For we are all of us players upon that vast gameboard, whether we accept the responsibility or not.

“And that makes your rationality my business.

“Is this a dangerous idea? Yes, and not just pleasantly edgy ‘dangerous.’ People have been burned to death because some priest decided that they didn’t think the way they should. Deciding to burn people to death because they ‘don’t think properly’—that’s a revolting kind of reasoning, isn’t it? You wouldn’t want people to think that way, why, it’s disgusting. People who think like that, well, we’ll have to do something about them...

“I agree! Here’s my proposal: Let’s argue against bad ideas but not set their bearers on fire.”

Human intelligence is an amazing capacity that has single-handedly put humans in a dominant position on Earth. When human intelligence defeats itself and goes off the rails, the fallout therefore tends to be a uniquely big deal. In How to Actually Change Your Mind, decision theorist Eliezer Yudkowsky asks how we can better identify and sort out our biases, integrate new evidence, and achieve lucidity in our daily lives. Because it really seems as though we should be able to do better—and a three-pound all-purpose superweapon is a terrible thing to waste.

310 pages, Paperback

First published January 1, 2018


About the author

Eliezer Yudkowsky

46 books · 1,716 followers
From Wikipedia:

Eliezer Shlomo Yudkowsky is an American artificial intelligence researcher concerned with the singularity and an advocate of friendly artificial intelligence, living in Redwood City, California.

Yudkowsky did not attend high school and is an autodidact with no formal education in artificial intelligence. He co-founded the nonprofit Singularity Institute for Artificial Intelligence (SIAI) in 2000 and continues to be employed as a full-time Research Fellow there.

Yudkowsky's research focuses on Artificial Intelligence theory for self-understanding, self-modification, and recursive self-improvement (seed AI); and also on artificial-intelligence architectures and decision theories for stably benevolent motivational structures (Friendly AI, and Coherent Extrapolated Volition in particular). Apart from his research work, Yudkowsky has written explanations of various philosophical topics in non-academic language, particularly on rationality, such as "An Intuitive Explanation of Bayes' Theorem".

Yudkowsky was, along with Robin Hanson, one of the principal contributors to the blog Overcoming Bias sponsored by the Future of Humanity Institute of Oxford University. In early 2009, he helped to found Less Wrong, a "community blog devoted to refining the art of human rationality". The Sequences on Less Wrong, comprising over two years of blog posts on epistemology, Artificial Intelligence, and metaethics, form the single largest bulk of Yudkowsky's writing.

Ratings & Reviews



Community Reviews

5 stars: 117 (48%)
4 stars: 88 (36%)
3 stars: 30 (12%)
2 stars: 2 (<1%)
1 star: 4 (1%)
Displaying 1 - 25 of 25 reviews
Gavin
1,116 reviews · 418 followers
April 10, 2019
Imagine someone great - I think of Bertrand Russell or Dan Dennett or CS Peirce or Alan Turing - writing really well about actually scientific self-help. Imagine they wrote most days for 2 years, and so distilled decades of trying to find the truth as a heavily biased primate barely out of the trees. Imagine it was empathic and well-justified with argument and experimental data. But imagine it turns out it wasn't a Canonical figure writing it, but instead some guy on the internet with no credentials and weird opinions. But imagine - or rather, I ask you to trust me, til you see for yourself - that the result matches what the greats achieved in the theory of practical reasoning. (Dennett actually wrote a practical-reason how-to book, and it isn't nearly as good.)

These essays are fumbling attempts to put into words lessons better taught by experience. But at least there’s underlying math, plus experimental evidence from cognitive psychology on how humans actually think. Maybe that will be enough to cross the stratospherically high threshold required for a discipline that lets you actually get it right, instead of just constraining you into interesting new mistakes.

This is only one-sixth of Yudkowsky's enormous Sequences - an unusually scientifically accurate philosophical system covering statistics, physics, psychology, history, ethics, and, most importantly, the specific universal obstacles to your being rational. (As a brutal compression, the philosophy can be glossed as radical Bayesian-Quinean evidentialism.) I've read it three times in 10 years, and got more from it each time. Quite a lot of it seemed absurd the first time I read it, for instance his principle of Conservation of Expected Evidence, but I now know it to be mathematically safe.
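The principle named above is easy to check numerically: Conservation of Expected Evidence says the prior must equal the probability-weighted average of the possible posteriors, so no one can rationally expect to update in a predetermined direction. A minimal sketch (all probabilities below are illustrative choices, not taken from the book):

```python
# Conservation of Expected Evidence: E[P(H|E)] = P(H).
# The numbers here are arbitrary, chosen only to illustrate.

prior_h = 0.3            # P(H)
p_e_given_h = 0.8        # P(E | H)
p_e_given_not_h = 0.2    # P(E | not H)

# Total probability of observing the evidence E.
p_e = p_e_given_h * prior_h + p_e_given_not_h * (1 - prior_h)

# Posterior after seeing E, and after seeing not-E (Bayes' theorem).
post_if_e = p_e_given_h * prior_h / p_e
post_if_not_e = (1 - p_e_given_h) * prior_h / (1 - p_e)

# Expected posterior, weighted by how likely each observation is.
expected_posterior = p_e * post_if_e + (1 - p_e) * post_if_not_e

print(f"prior = {prior_h}, expected posterior = {expected_posterior:.4f}")
```

Whatever likelihoods you plug in, the expectation lands back on the prior: any anticipated shift up in one branch must be exactly balanced by the shift down in the other.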

There are loads of great tools here. Just one example out of dozens: the idea of a pejorative Fully General Counterargument, a good-sounding objection which applies equally to all possible arguments, and which thus tells you nothing about the truth of the matter. Examples:

* “Oh, he's an 'expert', is he? Experts are systematically miscalibrated.”
* “My opponent is [just] a clever arguer”
* “That evidence was filtered by a biased person, therefore I can ignore it”
* "There are arguments both for and against”

Along with Kant's Transcendental Analytic, The Great Gatsby (don't ask), and MacFarquhar's Strangers Drowning, it's one of the only books I've ever taken paragraph-by-paragraph notes on.

Free, or by donation to his nonprofit, here.
Tarmo Pungas
150 reviews · 6 followers
January 27, 2021
I'm now convinced that changing one's mind is most difficult. If nothing else sticks, I'll try to remember the litanies. "What is true is already so. Owning up to it doesn't make it worse." (Gendlin)
Sai Sasank Y.
9 reviews
June 11, 2021
Changing one's mind seems easy but I've now seen enough counter-evidence. Looking to start with something simple like recognizing semantic stop signs. Thoroughly enjoyed the essays.
Jessy
255 reviews · 60 followers
September 7, 2019
Surprisingly, the best part of the book was not the "how to think rationally" part, but the "how to think about thinking rationally" part. I went in thinking rationality is basically synonymous with being a good Bayesian updater (having read some lesswrong posts of this flavor) and expected most of the book to be about recognizing and combatting common biases. Those topics do occupy the sections "Seeing with Fresh Eyes" and "Death Spirals", but I came away from my favorite sections realizing that there's a lot to be learned here about thinking clearly in general. Yudkowsky framed a lot of common excuses/pitfalls/modes of thinking in interesting ways that made me recognize my own habitually flawed thought patterns - e.g. being personally attached to beliefs, using humility as an excuse not to pursue complicated questions further, etc.

Some chapters are skippable (and designed to be that way; all of them are basically blog posts on lesswrong sequenced in a particular way), but overall there are a lot of great observations about thinking that are worth a read.
Dao Le
116 reviews · 15 followers
July 31, 2020
I have been fascinated with the idea of objective truths ever since I took Formal Logic in college. I prided myself on the ability to determine whether an argument is valid, true, or both. But when I graduated, the real world turned out not to be as rational as I thought, and that deeply troubled me. The faults in our thinking (my own included) are much more complex and harder to identify, especially when one takes into account the many heuristics (think shortcuts) that are wired into our brains by default. Until this book came along. Despite the unfortunate title (he's a rationalist, not a novelist, after all), this book really did help me to actually change my mind, or at least try. To quote Eugene Gendlin: "What is true is already so. Owning up to it doesn't make it worse. Not being open about it doesn't make it go away. And because it's true, it is what is there to be interacted with."
James
101 reviews
May 7, 2024
Didn't feel quite as interesting as the first book. The title suggests something more concrete, but the content is at the same level of abstraction. Did provide some really interesting food for thought about the foundations of rationality - bootstrapping up belief systems, the empirical nature of logical axioms, Principle of the Uniformity of Nature, etc.

Notes:
• The Sophisticate: “The world isn’t black and white. No one does pure good or pure bad. It’s all gray. Therefore, no one is better than anyone else.”
The Zetet: “Knowing only gray, you conclude that all grays are the same shade. You mock the simplicity of the two-color view, yet you replace it with a one-color view . . .”
• "Politics is the mind-killer"
• "But studies such as the above show that people tend to judge technologies—and many other problems—by an overall good or bad feeling. If you tell people a reactor design produces less waste, they rate its probability of meltdown as lower"
• Asked to estimate 1×2×3×4×5×6×7×8, students' median guess was 512; asked for 8×7×6×5×4×3×2×1, it was 2,250. Both fall far short of the true value, 40,320 - anchoring on the first few terms
• Anchoring is suspected to be due to priming, or "contamination" as it's called at the cognitive level
• "Cognitive busyness": distraction and interruption seem to interfere with our ability to recognize and reject falsehoods, but not with our ability to accept truths. Proposed conclusion: we accept ideas as true by default, and it takes effort to reject them
• Affect heuristic: people tend to judge things as overall "good" or "bad", even when various aspects aren't necessarily related like that
○ Presenting information about high benefits made people perceive lower risks; presenting information about higher risks made people perceive lower benefits
○ Exacerbated by time pressure
○ Causes halo effect
• Affective death spirals
○ Similar to what I called "building a philosophical system", where one idea lies at the crux of every issue and could solve every problem. Taleb with black swans, Walker with sleep, possibly me with incentive alignment
○ Probably the single most reliable sign of a cult guru is that the guru claims expertise, not in one area, not even in a cluster of related areas, but in everything.
• "Cognitive dissonance" explanation of increased backfire effect seems weak. Cognitive dissonance seems either like a mysterious answer, or just a term for "threat to established beliefs". In either case, the question of why views get stronger in response to dissonance instead of weaker still stands.
• EY's explanation is that the counter-evidence pushes people closer to leaving, the ones that go over the threshold leave, and the more extreme ones stay. But if this is like pushing a queue over a cliff, then the average should still decrease - it's as if the most extreme disappeared, since the least extreme disappear but everyone else shifts to doubting more. The only way this would lead to an increase in fanaticism is if the people that don't change their minds don’t even shift towards doubt (or do so very little), or if the fanaticism is distributed super-linearly in the first place.
• It's important to be tolerant of criticism to yourself because if you eject, the group you left will shift away from you. If everyone ejects early, groups will tend to get more extreme very quickly.
○ Although I guess they'll also shrink very quickly. If everyone ejects at criticism, groups just collapse to the most extreme member shouting at nobody.
○ Makes sense given echo chamber intuition. Desire to avoid criticism is what leads to bubbles
• EY's "hate death spirals" are what I've been calling signaling races. It's where everybody wants to get their signaling and say how bad the enemy is, and the discussion (negative halo effect) doesn't allow consideration of the other side.
○ 9/11 hyper-patriotism reaction
○ Excessive political correctness/cancel culture
○ Red Scare
• To be one more milestone in humanity’s road is the best that can be said of anyone; but this seemed too lowly to please Ayn Rand. And that is how she became a mere Ultimate Prophet.
• The presence of a single dissenter dramatically decreases conformity, but only while they are dissenting. Immunity to conformity doesn't seem to be learned from past dissenters paving the way.
• Careful about self-image as a rationalist. Can lead to motivated investigation - thinking hard enough to satisfy your self-image, but not actually optimal
• The principles of rationality are the same kind of laws as the Laws of Thermodynamics
○ Ong says writing enables syllogisms. But if logic is learned, not somehow "unlocked", how does this work?
• For ideas that you are attached to, consider what would happen if they were false, simply as a hypothetical. Once this "retreat" is planned out and mapped, hopefully the fear of abandoning the idea if it does turn out to be false will be reduced.
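The reviewer's "queue over a cliff" objection above can be poked at numerically. In this toy sketch (every parameter - distribution, threshold, shift size - is an arbitrary choice for illustration, not taken from the book), all members shift toward doubt by the same amount, yet the remaining group's average fanaticism can still rise: dropping the least fanatical tail lifts the mean by more than the uniform shift lowers it. Whether the average rises or falls depends on how much of the distribution gets pushed past the exit threshold.

```python
import random

random.seed(0)

def mean(xs):
    return sum(xs) / len(xs)

# Each member's "fanaticism" level, roughly normally distributed.
members = [random.gauss(5.0, 1.5) for _ in range(10_000)]
threshold, shift = 4.0, 0.5  # you leave if you drop below threshold

before = mean(members)

# Counter-evidence shifts everyone toward doubt by the same amount;
# anyone pushed below the threshold leaves the group.
remainers = [m - shift for m in members if m - shift >= threshold]
after = mean(remainers)

print(f"group mean before: {before:.2f}")
print(f"mean of those who stay: {after:.2f}")
```

With these particular numbers the survivors' mean ends up above the original group mean even though everyone doubted more, so the distribution and exit threshold matter, not just whether the remainers shift.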
Arjo
82 reviews
September 3, 2021
This is Book 2 of Rationality: From AI to Zombies, a series of blog posts by Eliezer Yudkowsky on human rationality and irrationality in cognitive science. Book 2 discusses motivated reasoning and confirmation bias. Why do we keep jumping to conclusions, digging our heels in and recapitulating the same mistakes? Why are we so bad at acquiring accurate beliefs and how can we do better? *

The book addressed these topics brilliantly, in my opinion. Every chapter broke down how we usually think and the heuristics and biases we so naturally fall back on. This is great and all, but I think the problem, first and foremost, is why anyone should care that their thinking is flawed and their beliefs are wrong. Why care about truth?

In one of the chapters, Yudkowsky answers - "I believe that it is right and proper for me, as a human being, to have an interest in the future, and what human civilization becomes in the future. One of those interests is the human pursuit of truth [...] For we are all of us players upon that vast gameboard, whether we accept the responsibility or not. And that makes your rationality my business."

Indeed we need to get our head on straight if we want to shape the future in a way that is good for all of us. But how do you teach people to care beyond their own selves? There has been an emphasis on individualistic pursuits recently and I wholeheartedly support this. But I think ultimately, the goal of individuation is the realization that one is part of a whole, to see where one fits in a group. It starts and ends in collectivism. This, among other things, is unteachable.

Truth is a good thing. Maps should reflect territories. Beliefs require evidence. And if you are curious about/questioning this, you can read the entire collection at readthesequences.com where you can also just pick topics that pique your interest. There is also a website called lesswrong.com which aims to understand the world and be less and less wrong about it each day.

*excerpt from lesswrong.com
endofsilence
18 reviews · 14 followers
January 9, 2020
This is a rock and roll guide to BayesCraft and clear thinking. Entertaining, witty and illuminating. A must read for those wanting to be able to see and understand the world a bit more accurately.

''If in your heart you believe you already know, or if in your heart you do not wish to know, then your questioning will be purposeless and your skills without direction. Curiosity seeks to annihilate itself; there is no curiosity that does not want an answer''

''You shouldn’t be afraid to just visualize a world you fear. If that world is already actual, visualizing it won’t make it worse; and if it is not actual, visualizing it will do no harm.
What is true is already so; owning up to it doesn’t make it worse.''

''That which can be destroyed by the truth should be.
People can stand what is true, for they are already enduring it.
Curiosity seeks to annihilate itself.''

''Not every change is an improvement, but every improvement is necessarily a change. If we only admit small local errors, we will only make small local changes. The motivation for a big change comes from acknowledging a big mistake.''

''A doubt that neither destroys itself nor destroys its target might as well have never existed at all. It is the resolution of doubts, not the mere act of doubting, which drives the ratchet of rationality forward.''
Chris Boutté
Author · 9 books · 217 followers
September 6, 2021
This is without a doubt one of my new favorite books when it comes to decision making and being more rational. We all love to think we’re rational, but we’re wired to be irrational, and the first step towards working on it is to recognize it. I love learning about becoming a better decision maker and being more rational, which is why I’m so glad I came across this book from Eliezer Yudkowsky. I hadn’t heard of his work until I saw Julia Galef (another great voice in this realm) tweet one of the essays from Eliezer’s website LessWrong.com.

Eliezer is a rationalist, and while there are some arguments that none of us should be completely rational, Eliezer makes excellent arguments for how we should be “less wrong”. This book is a collection of essays, and I loved 90% of them. Part of being rational is having the ability to change your mind and update your beliefs. Throughout the essays, you can see Eliezer’s intellectual humility, and he has no problem sharing personal stories of when he was wrong and the lessons he’s learned from it. So, if you want to be less wrong, read this book, and I can definitely see myself reading it again in the future to freshen up my skills in being rational.
Chris Durston
Author · 18 books · 30 followers
August 11, 2020
This is a collection of many very short... micro-essays, I guess, each of which looks closely at a specific bias, problem, proof, or other interesting titbit in the sphere of rationality and how to be more rational.

Because each essay is so short (sometimes only three or four pages when read through the Kindle app on an iPhone), the collection lends itself to being read in fits and starts, being dropped and returned to as needed. It's not the sort of thing you'd probably power through in one go, although it's sometimes beneficial to at least try to read several parts at once because each will often be related to its neighbours.

Yudkowsky explains most concepts in a way that almost anyone could understand, although you might occasionally find yourself feeling reluctant to listen to him purely because the entire premise of his work is that he knows better than you and that can be a bit annoying. Still, that's kind of the point: it's worth undertaking the endeavour of improving one's own critical thinking so that one can in fact come to know better.
46 reviews · 4 followers
September 25, 2022
A lot to think about and remember (for example the pieces about political arguments, priming/contamination/cached thoughts, and conformity). But it also shows a rather patronising and at times even disdainful attitude towards the "unenlightened" (people not employing rational methods), especially in the beginning of the book.
181 reviews
January 12, 2020
'How to Actually Change Your Mind', the second in a series by Eliezer Yudkowsky, ambitiously seeks to deepen the reader's understanding of rationality and teach 'rationality as a martial art'.

It succeeds, at least for me, in prompting reflection on one's perspective on institutions and human nature, in encouraging the reader to hold himself or herself accountable for irrational patterns of thinking, and in teaching red flags to watch for in order to find gaps or errors in reasoning.

There is, however, a little effort required; Yudkowsky can be a little self-important sometimes, and gets a bit self-indulgent. Despite this, the book is highly recommended as a primer on clear thinking.
Gianluca Truda
103 reviews
March 25, 2019
This second instalment of Yudkowsky's Sequences is a lot more practical than the first. Where Map and Territory focussed on the fundamentals of cognitive biases, models, and beliefs, this collection presents heuristics for avoiding pitfalls, integrating evidence, and updating your beliefs like a good rationalist.

Whilst an incredibly valuable piece of work, it still loses a star for its inaccessibility to the uninitiated and being (at times) deliberately confusing. It's also really long.

Sections:
E. Overly-Convenient Excuses
F. Politics and Rationality
G. Against Rationalization
H. Against Doublethink
I. Seeing with Fresh Eyes
J. Death Spirals
K. Letting Go
Sunil Pandey
16 reviews
February 9, 2021
Brilliant. Pushes the envelope and thinks through where you actually stop thinking.
Lines like "Rationality is not about winning debates, it is for deciding which side to join"
and conclusions like "distraction only impaired one’s ability to detect false statements but had no effect on one’s ability to detect true statements"

are plainly insightful.
Turbid and mind-bending, but a must-read!
Blue
38 reviews
January 31, 2022
Great book. You need to read the first book to understand this one well. It had many great concepts that were ingrained into me. It had many ideas I never thought of, and many ideas which seemed wrong first but then made complete sense. The death spiral chapter had an article which was mostly a repetition, but everything else was amazing.
Julissa Dantes-castillo
367 reviews · 26 followers
September 7, 2020
Even though the book is divided into sections, I find it lacks structure: each chapter rarely built up a single idea, reading instead like a bunch of ideas thrown together, occasionally circling back to an earlier one.
Ben Brooks
74 reviews · 1 follower
January 10, 2021
I thoroughly enjoyed it. ~70 essays from a rationalist on forming beliefs and making decisions. In-depth discussion of bias and fallacy.
Alfie
15 reviews · 1 follower
April 26, 2024
That which can be destroyed by the truth should be. People can stand what is true, for they are already enduring it. Curiosity seeks to annihilate itself.
J_BlueFlower
685 reviews · 8 followers
August 1, 2021
There is so much good stuff in here, but the overall structure is very lacking. It could have been so much better. The title is just awesome and true to the content.

Luke Eure
174 reviews
December 16, 2021
Has made me better at thinking. What more can you want from a book?

After reading this book I have a much stronger desire to be correct about the things I say and believe, rather than to be comfortable

Biggest rules of thumb I learned:

A perfectly rational mind would not be afraid of changing its beliefs

Visualize what you would do in a world where your most cherished belief might be wrong. This helps make you open to changing your mind, if it turns out that your most cherished belief is wrong.

People really don’t like to change their minds. Like really

My judgements about things often come from a vague positive or negative feeling about them (i.e., “mood affiliation”) rather than from actually examining and tallying up the facts

You tend to passively, subconsciously believe what you read, and then your mind subconsciously goes back and disbelieves it if it seems fishy. Which is a reason to avoid unreliable information

Spend 5 minutes with your eyes closed thinking of reasonable alternatives when making medium or larger decisions

Most importantly: Ask yourself often “What is the core reason I have this belief? If XX changed, would I still believe this?”
Niklas
43 reviews
March 29, 2021
This book (the whole series, really) offers a great meta-approach to thinking more clearly, logically and just less wrong.

While the first book, Map And Territory, started with explaining common failure modes of the human mind, this second book outlines various pitfalls one (or a group of people) has to avoid in order to become more rational.
