Top 113 Quotes & Sayings by Eliezer Yudkowsky

Explore popular quotes and sayings by American writer Eliezer Yudkowsky.
Last updated on November 21, 2024.
Eliezer Yudkowsky

Eliezer Shlomo Yudkowsky is an American decision theorist, artificial intelligence (AI) researcher, and writer, best known for popularizing the idea of friendly artificial intelligence. He is a co-founder and research fellow at the Machine Intelligence Research Institute (MIRI), a private research nonprofit based in Berkeley, California. His work on the prospect of a runaway intelligence explosion was an influence on Nick Bostrom's Superintelligence: Paths, Dangers, Strategies.

We tend to see individual differences instead of human universals. Thus, when someone says the word 'intelligence,' we think of Einstein instead of humans.
The human species was not born into a market economy. Bees won't sell you honey if you offer them an electronic funds transfer. The human species imagined money into existence, and it exists - for us, not mice or wasps - because we go on believing in it.
Anything that could give rise to smarter-than-human intelligence - in the form of Artificial Intelligence, brain-computer interfaces, or neuroscience-based human intelligence enhancement - wins hands down beyond contest as doing the most to change the world. Nothing else is even in the same league.
My successes already accomplished have mostly been taking existing science and getting people to apply it in their everyday lives.
My parents were early adopters, and I've been online since a rather young age. You should regard anything from 2001 or earlier as having been written by a different person who also happens to be named 'Eliezer Yudkowsky.' I do not share his opinions.
I want to carry in my heart forever the key word of the Olympics - 'passion.'
There's a popular concept of 'intelligence' as book smarts, like calculus or chess, as opposed to, say, social skills. So people say that 'it takes more than intelligence to succeed in human society.' But social skills reside in the brain, not the kidneys.
Let the winds of evidence blow you about as though you are a leaf, with no direction of your own. Beware lest you fight a rearguard retreat against the evidence, grudgingly conceding each foot of ground only when forced, feeling cheated. Surrender to the truth as quickly as you can.
Transhumanists are not fond of death. We would stop it if we could. To this end, we support research that holds out hope of a future in which humanity has defeated death.
I don't care where I live, so long as there's a roof to keep the rain off my books, and high-speed Internet access.
In our skulls, we carry around 3 pounds of slimy, wet, greyish tissue, corrugated like crumpled toilet paper. You wouldn't think, to look at the unappetizing lump, that it was some of the most powerful stuff in the known universe.
I keep trying to explain to people that the archetype of intelligence is not Dustin Hoffman in 'The Rain Man;' it is a human being, period. It is squishy things that explode in a vacuum, leaving footprints on their moon.
I wouldn't be surprised if tomorrow was the Final Dawn, the last sunrise before the Earth and Sun are reshaped into computing elements.
Though I have friends aplenty in academia, I don't operate within the academic system myself.
I am a full-time Research Fellow at the Machine Intelligence Research Institute, a small 501(c)(3) public charity supported primarily by individual donations.
When something is universal enough in our everyday lives, we take it for granted to the point of forgetting it exists.
A burning itch to know is higher than a solemn vow to pursue truth. To feel the burning itch of curiosity requires both that you be ignorant, and that you desire to relinquish your ignorance.
When you think of intelligence, don't think of a college professor; think of human beings as opposed to chimpanzees. If you don't have human intelligence, you're not even in the game.
To be clever in argument is not rationality but rationalization.
If our extinction proceeds slowly enough to allow a moment of horrified realization, the doers of the deed will likely be quite taken aback on realizing that they have actually destroyed the world. Therefore I suggest that if the Earth is destroyed, it will probably be by mistake.
The purest case of an intelligence explosion would be an Artificial Intelligence rewriting its own source code. The key idea is that if you can improve intelligence even a little, the process accelerates. It's a tipping point. Like trying to balance a pen on one end - as soon as it tilts even a little, it quickly falls the rest of the way.
Textbook science is beautiful! Textbook science is comprehensible, unlike mere fascinating words that can never be truly beautiful. Elementary science textbooks describe simple theories, and simplicity is the core of scientific beauty. Fascinating words have no power, nor yet any meaning, without the math.
Nothing you'll read as breaking news will ever hold a candle to the sheer beauty of settled science. Textbook science has carefully phrased explanations for new students, math derived step by step, plenty of experiments as illustration, and test problems.
A scientist worthy of a lab coat should be able to make original discoveries while wearing a clown suit, or give a lecture in a high squeaky voice from inhaling helium. It is written nowhere in the math of probability theory that one may have no fun.
The systematic experimental study of reproducible errors of human reasoning, and what these errors reveal about underlying mental processes, is known as the heuristics and biases program in cognitive psychology. This program has made discoveries highly relevant to assessors of global catastrophic risks.
The media thinks that only the cutting edge of science, the very latest controversies, are worth reporting on. How often do you see headlines like 'General Relativity still governing planetary orbits' or 'Phlogiston theory remains false'? By the time anything is solid science, it is no longer a breaking headline.
Intelligence is the source of technology. If we can use technology to improve intelligence, that closes the loop and potentially creates a positive feedback cycle.
You cannot 'rationalize' what is not rational to begin with - as if lying were called 'truthization.' There is no way to obtain more truth for a proposition by bribery, flattery, or the most passionate argument - you can make more people believe the proposition, but you cannot make it more true.
Do not flinch from experiences that might destroy your beliefs. The thought you cannot think controls you more than thoughts you speak aloud. Submit yourself to ordeals and test yourself in fire. Relinquish the emotion which rests upon a mistaken belief, and seek to feel fully that emotion which fits the facts.
By far, the greatest danger of Artificial Intelligence is that people conclude too early that they understand it.
The purpose of a moral philosophy is not to look delightfully strange and counterintuitive or to provide employment to bioethicists. The purpose is to guide our choices toward life, health, beauty, happiness, fun, laughter, challenge, and learning.
Since the rise of Homo sapiens, human beings have been the smartest minds around. But very shortly - on a historical scale, that is - we can expect technology to break the upper bound on intelligence that has held for the last few tens of thousands of years.
If I could create a world where people lived forever, or at the very least a few billion years, I would do so. I don't think humanity will always be stuck in the awkward stage we now occupy, when we are smart enough to create enormous problems for ourselves, but not quite smart enough to solve them.
If you want to maximize your expected utility, you try to save the world and the future of intergalactic civilization instead of donating your money to the society for curing rare diseases and cute puppies.
The obvious choice isn't always the best choice, but sometimes, by golly, it is. I don't stop looking as soon as I find an obvious answer, but if I go on looking, and the obvious-seeming answer still seems obvious, I don't feel guilty about keeping it.
An anthropologist will not excitedly report of a newly discovered tribe: 'They eat food! They breathe air! They use tools! They tell each other stories!' We humans forget how alike we are, living in a world that only reminds us of our differences.
There are no surprising facts, only models that are surprised by facts; and if a model is surprised by the facts, it is no credit to that model.
Your strength as a rationalist is your ability to be more confused by fiction than by reality. If you are equally good at explaining any outcome, you have zero knowledge.
If dragons were common, and you could look at one in the zoo - but zebras were a rare legendary creature that had finally been decided to be mythical - then there's a certain sort of person who would ignore dragons, who would never bother to look at dragons, and chase after rumors of zebras. The grass is always greener on the other side of reality. Which is rather setting ourselves up for eternal disappointment, eh? If we cannot take joy in the merely real, our lives shall be empty indeed.
Many have stood their ground and faced the darkness when it comes for them. Fewer come for the darkness and force it to face them.
If people got hit on the head by a baseball bat every week, pretty soon they would invent reasons why getting hit on the head with a baseball bat was a good thing.
You will find ambiguity a great ally on your road to power. Give a sign of Slytherin on one day, and contradict it with a sign of Gryffindor the next; and the Slytherins will be enabled to believe what they wish, while the Gryffindors argue themselves into supporting you as well. So long as there is uncertainty, people can believe whatever seems to be to their own advantage. And so long as you appear strong, so long as you appear to be winning, their instincts will tell them that their advantage lies with you. Walk always in the shadow, and light and darkness both will follow.
When you are older, you will learn that the first and foremost thing which any ordinary person does is nothing.
To confess your fallibility and then do nothing about it is not humble; it is boasting of your modesty.
Through rationality we shall become awesome, and invent and test systematic methods for making people awesome, and plot to optimize everything in sight, and the more fun we have the more people will want to join us.
Between hindsight bias, fake causality, positive bias, anchoring/priming, et cetera et cetera, and above all the dreaded confirmation bias, once an idea gets into your head, it's probably going to stay there.
You are personally responsible for becoming more ethical than the society you grew up in.
My experience is that journalists report on the nearest-cliche algorithm, which is extremely uninformative because there aren't many cliches, the truth is often quite distant from any cliche, and the only thing you can infer about the actual event was that this was the closest cliche. It is simply not possible to appreciate the sheer awfulness of mainstream media reporting until someone has actually reported on you. It is so much worse than you think.
Reality has been around since long before you showed up. Don't go calling it nasty names like 'bizarre' or 'incredible'. The universe was propagating complex amplitudes through configuration space for ten billion years before life ever emerged on Earth. Quantum physics is not 'weird'. You are weird.
Trying and getting hurt can't possibly be worse for you than being... stuck.
I ask the fundamental question of rationality: Why do you believe what you believe? What do you think you know and how do you think you know it?
It is triple ultra forbidden to respond to criticism with violence. There are a very few injunctions in the human art of rationality that have no ifs, ands, buts, or escape clauses. This is one of them. Bad argument gets counterargument. Does not get bullet. Never. Never ever never for ever.
Litmus test: If you can't describe Ricardo's Law of Comparative Advantage and explain why people find it counterintuitive, you don't know enough about economics to direct any criticism or praise at "capitalism" because you don't know what other people are referring to when they use that word.
Remember, if you succeed in everything you try in life, you're living below your full potential and you should take up more difficult or daring things.
There is no justice in the laws of nature, no term for fairness in the equations of motion. The Universe is neither evil, nor good, it simply does not care. The stars don't care, or the Sun, or the sky. But they don't have to! WE care! There IS light in the world, and it is US!
Crocker's Rules didn't give you the right to say anything offensive, but other people could say potentially offensive things to you, and it was your responsibility not to be offended. This was surprisingly hard to explain to people; many people would read the careful explanation and hear, "Crocker's Rules mean you can say offensive things to other people."
Moore's Law of Mad Science: Every eighteen months, the minimum IQ necessary to destroy the world drops by one point.
The police officer who puts their life on the line with no superpowers, no X-Ray vision, no super-strength, no ability to fly, and above all no invulnerability to bullets, reveals far greater virtue than Superman - who is only a mere superhero.
- Every time someone cries out in prayer and I can't answer, I feel guilty about not being God. - That doesn't sound good. - I understand that I have a problem, and I know what I need to do to solve it, all right? I'm working on it. Of course, Harry hadn't said what the solution was. The solution, obviously, was to hurry up and become God.