Richard Carl Jeffrey (August 5, 1926 – November 9, 2002) was an American philosopher, logician, and probability theorist. He is best known for developing and championing the philosophy of radical probabilism and the associated heuristic of probability kinematics, also known as Jeffrey conditioning.
| Richard C. Jeffrey | |
| --- | --- |
| Born | August 5, 1926 |
| Died | November 9, 2002 |
| Alma mater | Princeton University |
| Era | 20th-century philosophy |
| Region | Western philosophy |
| School | Analytic philosophy |
| Main interests | Decision theory, epistemology |
| Notable ideas | Radical probabilism, Jeffrey conditioning, truth tree method for syllogism testing[1] |
Born in Boston, Massachusetts, Jeffrey served in the U.S. Navy during World War II. As a graduate student he studied under Rudolf Carnap and Carl Hempel.[2] He received his M.A. from the University of Chicago in 1952 and his Ph.D. from Princeton in 1957. After holding academic positions at MIT, City College of New York, Stanford University, and the University of Pennsylvania, he joined the faculty of Princeton in 1974 and became a professor emeritus there in 1999. He was also a visiting professor at the University of California, Irvine.[3]
Jeffrey, who died of lung cancer at the age of 76, was known for his sense of humor, which often came through in his breezy writing style. In the preface of his posthumously published Subjective Probability, he refers to himself as "a fond foolish old fart dying of a surfeit of Pall Malls".[4]
As a philosopher, Jeffrey specialized in epistemology and decision theory. He is perhaps best known for defending and developing the Bayesian approach to probability.
Jeffrey also wrote, or co-wrote, two widely used and influential logic textbooks: Formal Logic: Its Scope and Limits, a basic introduction to logic, and Computability and Logic, a more advanced text dealing with, among other things, the famous negative results of twentieth-century logic such as Gödel's incompleteness theorems and Tarski's indefinability theorem.
In frequentist statistics, Bayes' theorem provides a useful rule for updating a probability when new frequency data becomes available. In Bayesian statistics, the theorem itself plays a more limited role: it connects probabilities that are held simultaneously, but it does not tell the learner how to update those probabilities when new evidence becomes available over time. This subtlety was first pointed out explicitly by Ian Hacking in 1967.[5]
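In its standard synchronic form, with all probabilities held at the same time, the theorem reads

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)} = \frac{P(A \,\&\, B)}{P(B)}.$$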
However, it is tempting to adapt Bayes' theorem and adopt it as a rule of updating. Suppose that a learner forms probabilities P_old(A & B) = p and P_old(B) = q. If the learner subsequently learns that B is true, nothing in the axioms of probability, or in the results derived from them, tells him how to behave. He might be tempted, by analogy with the theorem, to set P_new(A) = P_old(A | B) = p/q.
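For illustration, with made-up numbers: if P_old(A & B) = 0.2 and P_old(B) = 0.4, then on learning B for certain the learner sets P_new(A) = 0.2 / 0.4 = 0.5.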
In fact, that step, Bayes' rule of updating, can be justified as necessary and sufficient through a dynamic Dutch book argument that is additional to the arguments used to justify the axioms: roughly, a learner who plans to update to anything other than P_old(A | B) upon learning B can be sold a system of bets that guarantees a loss however events turn out. This argument was first put forward by David Lewis in the 1970s, though he never published it.[6]
That works when the new data is certain. C. I. Lewis had argued that "If anything is to be probable then something must be certain".[7] There must, on Lewis's account, be some certain facts on which probabilities are conditioned. However, the principle known as Cromwell's rule declares that nothing, apart from a logical law, can ever be certain, if even that. Jeffrey famously rejected Lewis's dictum and quipped, "It's probabilities all the way down." He called this position radical probabilism.
In that case Bayes' rule of updating cannot capture a merely subjective change in the probability of some critical fact. The new evidence may not have been anticipated, or may not even be capable of being articulated after the event. It therefore seems reasonable, as a starting position, to adopt the law of total probability and extend it to updating in much the same way as Bayes' theorem was, as shown below.[8]
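Concretely, where {B_i} is a partition of the possibilities and the learner's experience shifts the probabilities of its cells, the resulting rule is

$$P_{\mathrm{new}}(A) = \sum_i P_{\mathrm{old}}(A \mid B_i)\, P_{\mathrm{new}}(B_i).$$

When the new evidence makes some cell certain, so that P_new(B_k) = 1, this reduces to ordinary Bayesian conditioning on B_k.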
Adopting such a rule is sufficient to avoid a Dutch book but not necessary.[9] Jeffrey advocated this as a rule of updating under radical probabilism and called it probability kinematics. Others have named it Jeffrey conditioning.
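A minimal computational sketch of probability kinematics over a finite partition; the function name, data layout, and numbers are illustrative rather than drawn from Jeffrey's text (the scenario loosely echoes his well-known example of glimpsing cloth by candlelight):

```python
def jeffrey_update(p_old_given_B, p_new_B):
    """Jeffrey conditioning (probability kinematics).

    p_old_given_B: list of P_old(A | B_i) for each cell B_i of the partition
    p_new_B:       list of revised probabilities P_new(B_i) over the partition
                   (must sum to 1)

    Returns P_new(A) = sum_i P_old(A | B_i) * P_new(B_i).
    """
    assert abs(sum(p_new_B) - 1.0) < 1e-9, "partition probabilities must sum to 1"
    return sum(pa * pb for pa, pb in zip(p_old_given_B, p_new_B))


# Hypothetical numbers: a glimpse of cloth by candlelight shifts the
# probability that it is green from 0.3 to 0.7, without making either
# hypothesis certain.
p_old_given_B = [0.8, 0.2]   # P_old(sale | green), P_old(sale | not green)
p_new_B = [0.7, 0.3]         # revised P_new(green), P_new(not green)
print(jeffrey_update(p_old_given_B, p_new_B))  # approx. 0.62
```

Setting p_new_B to [1.0, 0.0] recovers ordinary Bayesian conditioning on the first cell, which is what makes the rule a strict generalization.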
It is not the only sufficient updating rule for radical probabilism. Others have been advocated, including E. T. Jaynes' maximum entropy principle and Brian Skyrms' principle of reflection.
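One common formulation of the reflection principle, for instance, requires current probabilities to defer to anticipated future ones:

$$P_{\mathrm{now}}(A \mid P_{\mathrm{later}}(A) = q) = q.$$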
Jeffrey conditioning can be generalized from partitions to arbitrary condition events by giving it a frequentist semantics.[10]