Christopher D. Manning

Manning in 2025
Born: September 18, 1965
Academic background
Alma mater: Australian National University (BA (Hons))
Stanford University (PhD)
Thesis: Ergativity: Argument Structure and Grammatical Relations (1994)
Academic work
Institutions: Carnegie Mellon University
University of Sydney
Stanford University
Doctoral students: Dan Klein
Sepandar Kamvar
Danqi Chen

Christopher David Manning (born September 18, 1965) is an Australian-American computer scientist and applied linguist specializing in the areas of natural language processing, artificial intelligence and machine learning. Manning is the Thomas M. Siebel Professor in Machine Learning and a professor of Linguistics and Computer Science at Stanford University, and was the director of the Stanford Artificial Intelligence Laboratory (SAIL) from 2018 to 2025.[1]

Manning has been described as “the leading researcher in natural language processing”,[2] well known for co-developing GloVe word vectors; the bilinear or multiplicative form of attention, now widely used in artificial neural networks including the transformer; tree-structured recursive neural networks; and approaches to and systems for textual entailment. His main educational contributions are his textbooks Foundations of Statistical Natural Language Processing (1999) and Introduction to Information Retrieval (2008), and his course CS224N Natural Language Processing with Deep Learning, which is available online. From 2002, Manning also pioneered the development of well-maintained open source computational linguistics software packages, including CoreNLP, Stanza, and GloVe.[3][4][5][6]

Manning received a BA (Hons) degree majoring in mathematics, computer science, and linguistics from the Australian National University (1989) and a PhD in linguistics from Stanford (1994), under the guidance of Joan Bresnan.[7][8] He was an assistant professor at Carnegie Mellon University (1994–96) and a lecturer at the University of Sydney (1996–99) before returning to Stanford as an assistant professor. At Stanford, he was promoted to associate professor in 2006 and to full professor in 2012. He was elected an AAAI Fellow in 2010, an inaugural ACL Fellow in 2011, and an ACM Fellow in 2013.[9][10][11] He served as President of the Association for Computational Linguistics in 2015 and received an honorary doctorate from the University of Amsterdam in 2023. Manning was awarded the IEEE John von Neumann Medal “for advances in computational representation and analysis of natural language” in 2024 and was elected a Fellow of the National Academy of Engineering and the American Academy of Arts and Sciences in 2025.[12][2][13][14]

Manning's linguistic work includes his dissertation Ergativity: Argument Structure and Grammatical Relations (1994), a monograph Complex Predicates and Information Spreading in LFG (1999),[15] and his work developing Universal Dependencies,[16] for which he is the namesake of Manning's Law.

Manning's PhD students include Dan Klein, Sepandar Kamvar, Richard Socher, and Danqi Chen.[8] In 2021, he joined AIX Ventures, a venture capital fund that invests in artificial intelligence startups, as an investing partner.[17]

Bibliography

  • Christopher D. Manning; Hinrich Schütze (1999). Foundations of Statistical Natural Language Processing. Cambridge: Massachusetts Institute of Technology. ISBN 0-262-13360-1. OL 35843M. Wikidata Q115664565.
  • Christopher D. Manning; Prabhakar Raghavan; Hinrich Schütze (2008). Introduction to Information Retrieval. doi:10.1017/CBO9780511809071. ISBN 978-0-511-80907-1. OL 34476084M. Zbl 1160.68008. Wikidata Q60673995.

References

  1. ^ "Christopher Manning's Profile". Stanford Profiles. Stanford University. Retrieved 29 December 2025.
  2. ^ a b "Christopher D. Manning". IEEE. Retrieved 27 October 2024.
  3. ^ "Christopher D Manning - AD Scientific Index 2022". www.adscientificindex.com. Retrieved 22 February 2022.
  4. ^ "Christopher Manning". CIFAR. Retrieved 22 February 2022.
  5. ^ "Laying the foundation for today's generative AI". Stanford. 18 April 2024. Retrieved 27 October 2024.
  6. ^ "Stanford NLP Group". Retrieved 23 April 2023.
  7. ^ Manning, Christopher. "Christopher Manning". The Stanford Natural Language Processing Group. Retrieved 24 May 2022.
  8. ^ a b Manning, Christopher. "Christopher Manning and Ph.D. Students' Dissertations". The Stanford Natural Language Processing Group. Retrieved 24 May 2022.
  9. ^ "Elected AAAI Fellows". AAAI. Retrieved 6 January 2024.
  10. ^ "ACL Fellows". ACL. Retrieved 22 February 2026.
  11. ^ "ACM Fellows". ACM. Retrieved 22 February 2026.
  12. ^ "UvA honorary doctorates for psychiatrist Vikram Patel and computer scientist Christopher Manning". December 2022. Retrieved 23 April 2023.
  13. ^ "Stanford faculty elected to the National Academy of Engineering". 18 February 2025. Retrieved 22 February 2026.
  14. ^ "Seven Stanford faculty elected to the American Academy of Arts and Sciences". 23 April 2025. Retrieved 22 February 2026.
  15. ^ "Complex Predicates and Information Spreading in LFG". Retrieved 23 April 2023.
  16. ^ de Marneffe, Marie-Catherine; Manning, Christopher D.; Nivre, Joakim; Zeman, Daniel (13 July 2021). "Universal Dependencies". Computational Linguistics. 47 (2): 255–308. doi:10.1162/coli_a_00402. hdl:2078.1/278798. ISSN 0891-2017. S2CID 219304854.
  17. ^ "AIX Ventures - An AI Fund". AIX Ventures. Retrieved 13 January 2023.