Shyamal D. Peddada is a senior investigator and deputy chief of the Biostatistics and Computational Biology Branch, National Institute of Environmental Health Sciences, Research Triangle Park, North Carolina, USA.
Ravindra Khattree is distinguished professor of applied statistics in the Department of Mathematics and Statistics, Oakland University, Rochester, Michigan, USA.
Calyampudi Radhakrishna Rao pioneered powerful statistical methods that underpin modern scientific data analyses. His ‘information geometry’ and other data-reduction techniques enable information to be extracted from data sets with many variables, so it can be more easily visualized or classified into groups. Rao, who published seminal papers while he was just in his twenties, has died aged 102.
Rao held that the field of statistics should be driven by its applications. The building blocks of statistical science that he contributed have had a huge impact in agricultural sciences, biomedical research, econometrics, industrial engineering, social sciences and signal processing. For example, his methods were used to uncover signs of the Higgs boson in data from CERN — Europe’s particle-physics laboratory near Geneva, Switzerland.
Rao was born in Hadagalli, India. His father, a police inspector, moved frequently between postings until the family settled in Visakhapatnam. Both parents noticed Rao’s precocious mathematical ability and encouraged him to do bachelor’s and master’s degrees in mathematics at Andhra University in Visakhapatnam.
In 1941, Rao joined the Indian Statistical Institute (ISI) in Kolkata, founded and led by the eminent statistician P. C. Mahalanobis. The basic principles of statistical inference were still being established when, that same year, Rao began a master’s course in statistics at the University of Calcutta, in the city now known as Kolkata. The roots of many data-analysis tools can be traced back to Rao’s master’s and PhD theses, and to papers he published in the 1940s. For example, in his master’s thesis (1943), analysing data on populations living in the Indian state of Uttar Pradesh, he developed the ‘perimeter’ test for comparing two or more experimental groups. This work led to the multivariate analysis of variance (MANOVA) procedure for comparing the mean vectors of several groups.
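To give a flavour of what a MANOVA comparison looks like in practice, here is a minimal sketch using Python’s statsmodels library. The data, column names and group labels are invented for illustration and have no connection to Rao’s original analysis.

```python
# Minimal MANOVA sketch: compare two response variables across three groups.
# All data, column names and group labels below are invented for illustration.
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n = 30  # observations per group

# Simulate two correlated measurements for three hypothetical groups,
# with a small group-specific shift in the means.
groups = np.repeat(["A", "B", "C"], n)
shift = np.repeat([0.0, 0.5, 1.0], n)
y1 = shift + rng.normal(size=3 * n)
y2 = 0.5 * shift + rng.normal(size=3 * n)

df = pd.DataFrame({"group": groups, "y1": y1, "y2": y2})

# Test whether the mean vector (y1, y2) differs between the groups.
fit = MANOVA.from_formula("y1 + y2 ~ group", data=df)
print(fit.mv_test())  # reports Wilks' lambda, Pillai's trace, etc.
```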
In 1943, in response to a student’s question, Rao derived a lower bound (now known as the Cramér–Rao inequality) on how precisely an unknown parameter of a statistical population can be estimated, such as the average reduction in people’s blood pressure caused by a new drug. He also devised a procedure, the basis of what is now called the Rao–Blackwell theorem, for improving such estimates. For example, blood-pressure reductions measured in a small random sample can provide a good estimate of those in the whole population. Using differential geometry for the first time in the statistics literature, Rao also defined a metric for the dissimilarity between two probability distributions (the Fisher–Rao distance). His paper describing these results gave rise to the term information geometry (C. R. Rao Bull. Calcutta Math. Soc. 37, 81–91; 1945).
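As a rough mathematical sketch, written in standard modern notation rather than Rao’s original 1945 formulation, the Cramér–Rao inequality bounds the variance of any unbiased estimator by the inverse of the Fisher information, and the same Fisher information, read as a metric, underlies the Fisher–Rao distance.

```latex
% Sketch in standard modern notation (not Rao's original 1945 notation).
% For n independent observations from a density f(x; \theta) and any
% unbiased estimator \hat{\theta} of \theta:
\[
  \operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{n\,I(\theta)},
  \qquad
  I(\theta) \;=\; \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}
      \log f(X;\theta)\right)^{2}\right].
\]
% The same Fisher information, treated as a Riemannian metric on the space
% of distributions, g_{ij}(\theta) = E[\partial_i \log f \, \partial_j \log f],
% induces the Fisher–Rao distance between probability distributions.
```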
In 1946, Rao was invited to work at the Museum of Archaeology and Ethnology of the University of Cambridge, UK. He analysed measurements of the shapes of skulls excavated from ancient graves in Sudan dating back several thousand years, categorizing the remains according to their tribes and ages. At the same time, he began his PhD studies under the founding father of modern statistics, Ronald Fisher, a friend and collaborator of Mahalanobis. Rao’s work included mapping mouse chromosomes for studies of genetic linkage. His PhD thesis and subsequent papers led to statistical methodologies that provided the foundations of modern data analysis, such as discrimination and classification theories, MANOVA and more.