Donald Jay Geman (born September 20, 1943) is an American applied mathematician and a leading researcher in machine learning and pattern recognition. He and his brother, Stuart Geman, are well known for proposing the Gibbs sampler and for the first proof of convergence of the simulated annealing algorithm,[1] published in an article that has become a highly cited reference in engineering (over 21,000 citations on Google Scholar as of January 2018).[2] He is a professor at Johns Hopkins University and a visiting professor at the École Normale Supérieure de Cachan.
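The Gibbs sampler resamples each variable in turn from its conditional distribution given the current values of the others. The following is a minimal, illustrative sketch of this idea for a binary Ising-type Markov random field on an image lattice, the kind of model studied in the Geman brothers' 1984 paper; the grid size, coupling strength `beta`, and number of sweeps are arbitrary demonstration choices, not values from that paper.

```python
# Minimal sketch of Gibbs sampling for a binary Ising-type Markov random field
# on an image grid. Parameters below are illustrative assumptions only.
import numpy as np

def gibbs_sweep(x, beta, rng):
    """One full sweep: resample every pixel from its conditional distribution
    given its four nearest neighbours (free boundary conditions)."""
    h, w = x.shape
    for i in range(h):
        for j in range(w):
            # Sum of neighbouring spins (values in {-1, +1}).
            s = 0
            if i > 0:     s += x[i - 1, j]
            if i < h - 1: s += x[i + 1, j]
            if j > 0:     s += x[i, j - 1]
            if j < w - 1: s += x[i, j + 1]
            # Conditional probability that this pixel equals +1.
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * s))
            x[i, j] = 1 if rng.random() < p_plus else -1
    return x

rng = np.random.default_rng(0)
x = rng.choice([-1, 1], size=(32, 32))   # random initial configuration
for _ in range(100):                     # number of sweeps (arbitrary)
    x = gibbs_sweep(x, beta=0.5, rng=rng)
print("fraction of +1 pixels:", (x == 1).mean())
```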
Donald J. Geman | |
---|---|
Born | September 20, 1943, Chicago, Illinois, U.S. |
Alma mater | Columbia University; University of Illinois Urbana-Champaign; Northwestern University |
Relatives | Stuart Geman (brother) |
Awards | ISI highly cited researcher |
Scientific career | |
Fields | Mathematics; Statistics |
Institutions | University of Massachusetts; Johns Hopkins University; École Normale Supérieure de Cachan |
Doctoral advisor | Michael Marcus |
Geman was born in Chicago in 1943. He graduated from the University of Illinois Urbana-Champaign in 1965 with a B.A. in English literature and from Northwestern University in 1970 with a Ph.D. in mathematics.[3] His dissertation was entitled "Horizontal-window conditioning and the zeros of stationary processes." He joined the University of Massachusetts Amherst in 1970 and retired from there as a distinguished professor in 2001. He then became a professor in the Department of Applied Mathematics at Johns Hopkins University, and he has also been a visiting professor at the École Normale Supérieure de Cachan since 2001. He is a member of the National Academy of Sciences and a Fellow of the Institute of Mathematical Statistics and the Society for Industrial and Applied Mathematics.
D. Geman and J. Horowitz published a series of papers in the late 1970s on local times and occupation densities of stochastic processes; a survey of this work and related problems appeared in the Annals of Probability.[4] In 1984, with his brother Stuart, he published a milestone paper that remains one of the most cited papers[5] in the engineering literature. It introduced a Bayesian paradigm based on Markov random fields for the analysis of images, an approach that has been highly influential in the decades since. In another milestone paper,[6][7] in collaboration with Y. Amit, he introduced the notion of randomized decision trees,[8][9] which were later called random forests and popularized by Leo Breiman. His more recent work includes coarse-to-fine hierarchical cascades for object detection[10] in computer vision and the TSP (Top Scoring Pairs) classifier, a simple and robust rule for classifiers trained on high-dimensional, small-sample datasets in bioinformatics.[11][12]
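To illustrate the simplicity of the Top Scoring Pairs rule, the sketch below implements the basic idea under simplifying assumptions: score every feature pair by how differently the two classes order it, keep the top-scoring pair, and classify a new sample by that single pairwise comparison. The data, function names, and parameters are synthetic illustrations, not taken from the cited papers.

```python
# Simplified sketch of a Top Scoring Pairs (TSP)-style classifier.
# Toy data and helper names are illustrative assumptions only.
import numpy as np
from itertools import combinations

def fit_tsp(X, y):
    """X: (n_samples, n_features); y: binary labels in {0, 1}.
    Returns the top-scoring feature pair (i, j) and the class-1 ordering."""
    best_score, best_pair, best_dir = -1.0, None, None
    for i, j in combinations(range(X.shape[1]), 2):
        p0 = np.mean(X[y == 0, i] < X[y == 0, j])   # P(X_i < X_j | class 0)
        p1 = np.mean(X[y == 1, i] < X[y == 1, j])   # P(X_i < X_j | class 1)
        score = abs(p0 - p1)
        if score > best_score:
            best_score, best_pair, best_dir = score, (i, j), bool(p1 > p0)
    return best_pair, best_dir

def predict_tsp(X, pair, direction):
    i, j = pair
    # Predict class 1 when the observed ordering matches the class-1 pattern.
    return ((X[:, i] < X[:, j]) == direction).astype(int)

# Tiny synthetic example: feature 0 < feature 1 mostly in class 1 only.
rng = np.random.default_rng(1)
X0 = rng.normal([2.0, 1.0, 0.0], 0.5, size=(50, 3))   # class 0
X1 = rng.normal([1.0, 2.0, 0.0], 0.5, size=(50, 3))   # class 1
X, y = np.vstack([X0, X1]), np.array([0] * 50 + [1] * 50)
pair, direction = fit_tsp(X, y)
print("top pair:", pair, "accuracy:", (predict_tsp(X, pair, direction) == y).mean())
```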