According to this article on Less Wrong (there's good stuff on that site), rationality can be defined in two ways:
Epistemic rationality: believing, and updating on evidence, so as to systematically improve the correspondence between your map and the territory. The art of obtaining beliefs that correspond to reality as closely as possible. This correspondence is commonly termed "truth" or "accuracy", and we're happy to call it that.
Instrumental rationality: achieving your values. Not necessarily "your values" in the sense of being selfish values or unshared values: "your values" means anything you care about. The art of choosing actions that steer the future toward outcomes ranked higher in your preferences. On LW we sometimes refer to this as "winning".
If that seems like a perfectly good definition, you can stop reading here; otherwise continue.
Sometimes experimental psychologists uncover human reasoning that seems very strange - for example, someone rates the probability "Bill plays jazz" as less than the probability "Bill is an accountant who plays jazz". This seems like an odd judgment, since any particular jazz-playing accountant is obviously a jazz player. But to what higher vantage point do we appeal in saying that the judgment is wrong?
Experimental psychologists use two gold standards: probability theory, and decision theory. Since it is a universal law of probability theory that P(A) ≥ P(A & B), the judgment P("Bill plays jazz") < P("Bill plays jazz" & "Bill is an accountant") is labeled incorrect.
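The conjunction rule can be checked mechanically: for any coherent joint distribution, the probability of an event can never be less than the probability of that event conjoined with another. A minimal sketch, using made-up numbers for the Bill example (the specific values are invented for illustration):

```python
# Toy joint distribution over two binary traits for "Bill".
# The probabilities are invented; any coherent assignment sums to 1.
p = {
    ("jazz", "accountant"): 0.03,
    ("jazz", "not_accountant"): 0.07,
    ("no_jazz", "accountant"): 0.30,
    ("no_jazz", "not_accountant"): 0.60,
}

# P(A): marginal probability that Bill plays jazz.
p_jazz = sum(v for (j, a), v in p.items() if j == "jazz")

# P(A & B): probability that Bill plays jazz AND is an accountant.
p_jazz_and_acct = p[("jazz", "accountant")]

# The universal law P(A) >= P(A & B) holds for every coherent distribution,
# because the conjunction's mass is a subset of the marginal's mass.
assert p_jazz >= p_jazz_and_acct
```

No matter how the four cells are filled in (so long as they are non-negative and sum to 1), the assertion can never fail, which is exactly why the reversed judgment is labeled incorrect.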
To keep it technical, you would say that this probability judgment is non-Bayesian. Beliefs that conform to a coherent probability distribution, and decisions that maximize the probabilistic expectation of a coherent utility function, are called "Bayesian".
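The decision-theory half of "Bayesian" can be sketched just as concretely: a Bayesian decision is the action that maximizes expected utility under your probability distribution. A minimal illustration with a hypothetical two-action, two-outcome problem (the probabilities and utilities are invented, not from the article):

```python
# Hypothetical decision problem: carry an umbrella or not, under
# uncertainty about rain. All numbers are made up for illustration.
p_rain = 0.3

utility = {
    ("take_umbrella", "rain"): 0.0,    # dry, but encumbered
    ("take_umbrella", "sun"): -1.0,    # carried it for nothing
    ("go_without", "rain"): -10.0,     # soaked
    ("go_without", "sun"): 1.0,        # unencumbered and dry
}

def expected_utility(action):
    """Probability-weighted average of the utilities of each outcome."""
    return (p_rain * utility[(action, "rain")]
            + (1 - p_rain) * utility[(action, "sun")])

# The Bayesian choice is the action with the highest expected utility.
best = max(["take_umbrella", "go_without"], key=expected_utility)
print(best)  # → take_umbrella  (EU = -0.7, versus -2.3 for going without)
```

The point of the formalism is not the arithmetic, which is trivial here, but the coherence constraint: your probabilities must obey probability theory and your choices must be consistent with some single utility function.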
This does not quite exhaust the problem of what is meant in practice by "rationality", for two major reasons:
First, the Bayesian formalisms in their full form are computationally intractable on most real-world problems. No one can actually calculate and obey the math, any more than you can predict the stock market by calculating the movements of quarks. This is why we have a whole site called "Less Wrong", rather than simply stating the formal axioms and being done. There's a whole further art to finding the truth and accomplishing value from inside a human mind: we have to learn our own flaws, overcome our biases, prevent ourselves from self-deceiving, get ourselves into good emotional shape to confront the truth and do what needs doing, etcetera etcetera and so on.
Second, sometimes the meaning of the math itself is called into question. The exact rules of probability theory are called into question by e.g. anthropic problems in which the number of observers is uncertain. The exact rules of decision theory are called into question by e.g. Newcomblike problems in which other agents may predict your decision before it happens.
In cases like these, it is futile to try...