Haoran Zhao
I am a master's student in Computational Linguistics at the University of Washington.
Previously, I was an undergraduate student at Drexel University, where I was fortunate to be advised by Jake Williams. Before Drexel, I spent the first two amazing years of my college life at Lanzhou University in China.
I also spent some amazing time at the Computation & Cognition Lab and the Causality in Cognition Lab at Stanford, where I worked with Noah Goodman and Tobias Gerstenberg and got to know many wonderful people who shaped my current research focus.
I am interested in social cognition, language, and AI. At the moment, I am specifically thinking about Theory of Mind and pragmatics: studying our ability to infer others' mental states and predict their actions in the context of language and communication, and how well LLMs can do this, if at all. I am currently working with Max Kleiman-Weiner and Robert Hawkins to explore these topics.
Email /
CV /
Google Scholar /
Twitter (X) /
Github
News
- April 2025: Two papers got accepted to CogSci 2025
- April 2025: Got into the CBMM summer school at MIT
Research
Polite Speech Generation in Humans and Language Models
Haoran Zhao, Robert Hawkins
CogSci, 2025
Non-literal Understanding of Number Words by Language Models
Polina Tsvilodub*, Kanishk Gandhi*, Haoran Zhao*, Jan-Philipp Fränken, Michael Franke, Noah Goodman
CogSci, 2025
arXiv
Large Language Models are Not Inverse Thinkers Quite yet
Haoran Zhao
ICML Workshop on LLMs and Cognition, 2024
Paper Link
Bit Cipher — A Simple yet Powerful Word Representation System
Haoran Zhao, Jake Ryland Williams
arXiv, 2023
arXiv
Explicit Foundation Model Optimization with Self-Attentive Feed-Forward Neural Units
Jake Ryland Williams, Haoran Zhao
arXiv, 2023
arXiv
Reducing the Need for Backpropagation and Discovering Better Optima With Explicit Optimizations of Neural Networks
Jake Ryland Williams, Haoran Zhao
arXiv, 2023
arXiv
Optimizing Named Entity Recognition for Improving Logical Formulae Abstraction from Technical Requirements Documents
Alexander Perko, Haoran Zhao, Franz Wotawa
The 10th International Conference on Dependable Systems and Their Applications (DSA 2023), 2023
Prompt Design and Answer Processing for Knowledge Base Construction from Pre-trained Language Models
Xiao Fang, Alex Kalinowski, Haoran Zhao, Ziao You, Yuhao Zhang, Yuan An
KBC-LM Challenge @ 21st International Semantic Web Conference (ISWC 2022), CEUR Workshop Proceedings, 2022
Last Updated: April 28, 2025
Adapted from: GitHub