kzhao.hf [at] gmail.com
Research Scientist
@Google Inc.
I currently work at Google as a research scientist.
I received my Ph.D. from the School of Electrical Engineering and Computer Science at Oregon State University.
My research focuses on algorithms and theory for Natural Language Processing (NLP), especially structured prediction problems such as syntactic/semantic parsing, machine translation, and textual entailment. My Ph.D. advisor was Prof. Liang Huang.
Before Oregon State University, I began my Ph.D. studies in the Computer Science Department at the Graduate Center, City University of New York (CUNY). I received my Bachelor of Engineering (B.Eng.) degree from the Computer Science Department at the University of Science and Technology of China (USTC) in 2010.
My publication list is also available on Google Scholar.
Liang Huang, He Zhang, Dezhong Deng, Kai Zhao, Kaibo Liu, David Hendrix, and David Mathews. LinearFold: linear-time approximate RNA folding by 5’-to-3’ dynamic programming and beam search. In Proceedings of ISMB 2019.
Kai Zhao and Liang Huang. Joint Syntacto-Discourse Parsing and the Syntacto-Discourse Treebank. In Proceedings of EMNLP 2017 (arXiv, code).
Liang Huang, Kai Zhao, and Mingbo Ma. When to Finish? Optimal Beam Search for Neural Text Generation (modulo beam size). In Proceedings of EMNLP 2017.
Mingbo Ma, Kai Zhao, Liang Huang, Bing Xiang, and Bowen Zhou. Jointly Trained Sequential Labeling and Classification by Sparse Attention Neural Networks. In Proceedings of Interspeech 2017 (arXiv).
Kai Zhao, Liang Huang, and Mingbo Ma. Textual Entailment with Structured Attentions and Composition. In Proceedings of COLING 2016 (arXiv, code, slides).
Feifei Zhai, Liang Huang, and Kai Zhao. Search-aware Tuning for Hierarchical Phrase-based Decoding. In Proceedings of EMNLP 2015.
Kai Zhao, Hany Hassan, and Michael Auli. Learning Translation Models from Monolingual Continuous Representations. In Proceedings of NAACL 2015 (slides).
Kai Zhao and Liang Huang. Type-driven Incremental Semantic Parsing with Polymorphism. In Proceedings of NAACL 2015 (code, slides).
Kai Zhao, Liang Huang, Haitao Mi, and Abe Ittycheriah. Hierarchical MT Training using Max-Violation Perceptron. In Proceedings of ACL 2014 (slides).
Kai Zhao, James Cross, and Liang Huang. Optimal Incremental Parsing via Best-First Dynamic Programming. In Proceedings of EMNLP 2013.
Heng Yu, Liang Huang, Haitao Mi, and Kai Zhao. Max-Violation Perceptron and Forced Decoding for Scalable MT Training. In Proceedings of EMNLP 2013.
Hao Zhang, Liang Huang, Kai Zhao, and Ryan McDonald. Online Learning with Inexact Hypergraph Search. In Proceedings of EMNLP 2013.
Yoav Goldberg, Kai Zhao, and Liang Huang. Efficient Implementation for Beam Search Incremental Parsers. In Proceedings of ACL 2013.
Kai Zhao and Liang Huang. Minibatch and Parallelization for Online Large Margin Structured Learning. In Proceedings of NAACL 2013.
We presented a tutorial on Structured Prediction with Perceptron at ACL 2014 (slides) and at MASC-SLL 2015 (revised slides).
I am cleaning up and releasing my old code for some of the papers listed above, in the hope that it will help people doing related research. I usually write in Python, and sometimes in C/C++ for efficiency. My experience with C# at Microsoft was fascinating (LINQ and parallelization are my favorites), and I look forward to seeing the language become more popular. For deep learning I mostly write in Lua with Torch.
Textual Entailment with Structured Attentions and Composition
Optimal Incremental Dependency Parsing with Best-First Dynamic Programming: to be released soon.